WorldWideScience

Sample records for release consistent distributed

  1. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
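    The first style the abstract mentions, mutual exclusion within a single program, can be sketched with a monitor-style shared counter. This is an illustrative example only (the abstract names no code); the lock guarantees that each increment is applied atomically:

```python
import threading

class Monitor:
    """Monitor-style shared counter: mutual exclusion via a lock,
    the intra-program style of concurrency control named above."""
    def __init__(self):
        self._lock = threading.Lock()
        self._value = 0

    def increment(self):
        with self._lock:  # only one thread mutates the state at a time
            self._value += 1

    @property
    def value(self):
        with self._lock:
            return self._value

counter = Monitor()
threads = [threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 4000: no increments are lost under contention
```

    Group-oriented consistency (virtual synchrony) and linearizability extend this guarantee across processes rather than threads, which is precisely where the integration proposed above becomes necessary.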

  2. Techniques for Reducing Consistency-Related Communication in Distributed Shared Memory System

    OpenAIRE

    Zwaenepoel, W; Bennett, J.K.; Carter, J.B.

    1995-01-01

    Distributed shared memory (DSM) is an abstraction of shared memory on a distributed memory machine. Hardware DSM systems support this abstraction at the architecture level; software DSM systems support the abstraction within the runtime system. One of the key problems in building an efficient software DSM system is to reduce the amount of communication needed to keep the distributed memories consistent. In this paper we present four techniques for doing so: 1) software release consistency; 2)...
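    The core idea of the first technique, release consistency, can be shown with a toy simulation: writes are buffered locally and propagated only at synchronization points, so consistency traffic is paid once per critical section rather than once per write. The class and its bookkeeping are illustrative, not the paper's implementation:

```python
class ReleaseConsistentNode:
    """Toy model of eager release consistency: writes are buffered
    locally and flushed to the shared store only on release()."""
    def __init__(self, store):
        self.store = store        # the "home" copy of shared memory
        self.cache = {}           # local (possibly stale) view
        self.write_buffer = {}    # writes pending until the next release
        self.messages = 0         # count of consistency messages sent

    def acquire(self):
        self.cache = dict(self.store)  # refresh the local view
        self.messages += 1

    def write(self, addr, value):
        self.cache[addr] = value
        self.write_buffer[addr] = value  # no message sent here

    def release(self):
        self.store.update(self.write_buffer)  # one batched update
        self.write_buffer.clear()
        self.messages += 1

store = {}
node = ReleaseConsistentNode(store)
node.acquire()
for i in range(100):
    node.write(i, i * i)  # 100 writes inside one critical section...
node.release()
print(node.messages)  # 2: one acquire plus one batched release, not 100
```

    Under sequential consistency every write would be a message; batching at the release is exactly the communication reduction the paper quantifies.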

  3. A SIMPLE AND STRONGLY CONSISTENT ESTIMATOR OF STABLE DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Cira E. Guevara Otiniano

    2016-06-01

    Full Text Available Stable distributions are extensively used to analyze earnings of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational time is minimal, so it can be used to initialize intensive computational procedures such as maximum likelihood. With random samples of size n we tested the efficacy of these estimators by the Monte Carlo method. We also include applications to three data sets.
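    The paper's estimator is not reproduced in the abstract, so as a stand-in the Monte Carlo test it describes can be illustrated with a classic strongly consistent scale estimator for the symmetric stable case alpha = 1 (the Cauchy law): half the interquartile range, since the quartiles of Cauchy(0, gamma) sit at plus/minus gamma:

```python
import numpy as np

# Monte Carlo check of a strongly consistent scale estimator for the
# alpha = 1 symmetric stable (Cauchy) case; an illustrative stand-in,
# not the estimator proposed in the paper.
rng = np.random.default_rng(0)
gamma_true = 2.0
n = 200_000
sample = gamma_true * rng.standard_cauchy(n)

q25, q75 = np.percentile(sample, [25, 75])
gamma_hat = (q75 - q25) / 2.0  # quartiles of Cauchy(0, g) are at +/- g
print(gamma_hat)  # close to 2.0 for large n
```

    Quantile-based estimators like this one are cheap (a single sort), which is the property the abstract highlights for initializing maximum likelihood.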

  4. The consistency service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2011-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
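    The classification step such a service must perform can be sketched as a checksum comparison between the catalogue's record and what a site actually holds. All names here (`catalogue`, `site_files`, `classify`) are hypothetical illustrations, not the DQ2 API:

```python
import hashlib

# Hypothetical sketch: compare catalogued checksums against a site's
# actual holdings, then classify each file for repair or re-replication.
catalogue = {
    "data/file_a": hashlib.md5(b"payload-a").hexdigest(),
    "data/file_b": hashlib.md5(b"payload-b").hexdigest(),
    "data/file_c": hashlib.md5(b"payload-c").hexdigest(),
}
site_files = {
    "data/file_a": b"payload-a",        # intact replica
    "data/file_b": b"corrupted-bytes",  # checksum will mismatch
    # data/file_c absent entirely -> lost
}

def classify(catalogue, site_files):
    corrupted, lost = [], []
    for path, expected_md5 in catalogue.items():
        if path not in site_files:
            lost.append(path)
        elif hashlib.md5(site_files[path]).hexdigest() != expected_md5:
            corrupted.append(path)
    return corrupted, lost

corrupted, lost = classify(catalogue, site_files)
print(corrupted, lost)  # ['data/file_b'] ['data/file_c']
```

    Corrupted files can then be re-replicated from a good copy, while irrecoverable losses trigger the user notification described above.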

  5. The Consistency Service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2010-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  6. Parton Distributions based on a Maximally Consistent Dataset

    Science.gov (United States)

    Rojo, Juan

    2016-04-01

    The choice of data that enters a global QCD analysis can have a substantial impact on the resulting parton distributions and their predictions for collider observables. One of the main reasons for this has to do with the possible presence of inconsistencies, either internal within an experiment or external between different experiments. In order to assess the robustness of the global fit, different definitions of a conservative PDF set, that is, a PDF set based on a maximally consistent dataset, have been introduced. However, these approaches are typically affected by theory biases in the selection of the dataset. In this contribution, after a brief overview of recent NNPDF developments, we propose a new, fully objective definition of a conservative PDF set, based on the Bayesian reweighting approach. Using the new NNPDF3.0 framework, we produce various conservative sets, which turn out to be in mutual agreement within the respective PDF uncertainties, as well as with the global fit. We explore some of their implications for LHC phenomenology, also finding good consistency with the global fit result. These results provide a non-trivial validation test of the new NNPDF3.0 fitting methodology, and indicate that possible inconsistencies in the fitted dataset do not substantially affect the global fit PDFs.
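    The mechanics of Bayesian reweighting can be sketched numerically. To the best of my reading of the NNPDF reweighting literature, each Monte Carlo replica k receives a weight proportional to chi2^((n-1)/2) * exp(-chi2/2), with n the number of new data points, and the information retained is summarized by the effective number of replicas; the chi-squared values below are fabricated purely to illustrate the arithmetic:

```python
import numpy as np

# Illustrative reweighting mechanics on fabricated chi2 values (one per
# replica); the weight formula follows the Giele-Keller form used in
# NNPDF reweighting, stated here as an assumption.
rng = np.random.default_rng(1)
n_data = 20                                 # size of the "new" dataset
chi2 = rng.chisquare(n_data, size=1000)     # one chi2 per PDF replica

log_w = 0.5 * (n_data - 1) * np.log(chi2) - 0.5 * chi2
log_w -= log_w.max()            # stabilise before exponentiating
w = np.exp(log_w)
w *= len(w) / w.sum()           # normalise so sum(w) = N_rep

# Effective number of replicas: exp of the Shannon entropy of w/N.
n_eff = np.exp(np.mean(w * np.log(len(w) / np.maximum(w, 1e-300))))
print(n_eff)  # at most N_rep; lower means the new data is more selective
```

    A conservative set built this way down-weights replicas that fit an inconsistent subset poorly, with no hand-picked exclusion of experiments, which is the "fully objective" selection claimed above.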

  7. Understanding and Improving the Performance Consistency of Distributed Computing Systems

    NARCIS (Netherlands)

    Yigitbasi, M.N.

    2012-01-01

    With the increasing adoption of distributed systems in both academia and industry, and with the increasing computational and storage requirements of distributed applications, users inevitably demand more from these systems. Moreover, users also depend on these systems for latency and throughput

  8. Norepinephrine storage, distribution, and release in diabetic cardiomyopathy

    International Nuclear Information System (INIS)

    Ganguly, P.K.; Beamish, R.E.; Dhalla, K.S.; Innes, J.R.; Dhalla, N.S.

    1987-01-01

    The ability of hearts to store, distribute, and release norepinephrine (NE) was investigated in rats 8 wk after the induction of diabetes by an injection of streptozotocin. Chronic diabetes was associated with increased content and concentration of NE in heart and in other tissues such as kidney, brain, and spleen. Reserpine or tyramine treatment resulted in depletion of endogenous cardiac NE in control and diabetic rats. The depletion of NE stores at different times after a dose of reserpine was greater in diabetic hearts. On the other hand, NE stores in diabetic hearts were less sensitive than control hearts to low doses of tyramine but were more sensitive to high doses. The uptake of [3H]NE was greater in diabetic hearts in isolated perfused preparations. In comparison with the control values, diabetic hearts showed a decrease in [3H]NE in the granular fraction and an increase in the supernatant fraction. Diabetic hearts also showed an accelerated spontaneous release of [3H]NE. The increased cardiac NE and the uptake and release of NE in diabetic animals were reversible upon treatment with insulin. These results are consistent with the view that sympathetic activity is increased in diabetic cardiomyopathy and indicate that cardiac NE in diabetic rats is maintained at a higher level partly due to an increased uptake of released NE by adrenergic nerve terminals.

  9. 21 CFR 211.165 - Testing and release for distribution.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Laboratory Controls § 211.165 Testing and release for distribution. (a) For each batch of drug product, there shall be...

  10. Consistent energy barrier distributions in magnetic particle chains

    International Nuclear Information System (INIS)

    Laslett, O.; Ruta, S.; Chantrell, R.W.; Barker, J.; Friedman, G.; Hovorka, O.

    2016-01-01

    We investigate long-time thermal activation behaviour in magnetic particle chains of variable length. Chains are modelled as Stoner–Wohlfarth particles coupled by dipolar interactions. Thermal activation is described as a hopping process over a multidimensional energy landscape using the discrete orientation model limit of the Landau–Lifshitz–Gilbert dynamics. The underlying master equation is solved by diagonalising the associated transition matrix, which allows the evaluation of distributions of time scales of intrinsic thermal activation modes and their energy representation. It is shown that as a result of the interaction dependence of these distributions, increasing the particle chain length can lead to acceleration or deceleration of the overall relaxation process depending on the initialisation procedure.
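    The solution strategy described above, diagonalizing the master-equation transition matrix to read off relaxation time scales, can be shown on a minimal three-state hopping system. The rates are illustrative numbers, not the dipolar-chain values from the paper:

```python
import numpy as np

# Master equation dp/dt = W p for a 3-state hopping system; the nonzero
# eigenvalues of W give the intrinsic relaxation time scales tau = -1/lambda.
rates = {(0, 1): 1.0, (1, 0): 0.5, (1, 2): 0.8, (2, 1): 0.2}

W = np.zeros((3, 3))
for (i, j), k in rates.items():
    W[j, i] += k   # gain of state j from state i
    W[i, i] -= k   # matching loss from state i (columns sum to zero)

eigvals = np.sort(np.linalg.eigvals(W).real)
# One zero eigenvalue (the stationary distribution); the rest are negative.
taus = -1.0 / eigvals[eigvals < -1e-12]
print(np.isclose(eigvals.max(), 0.0))  # True: a stationary mode exists
```

    For a chain of N coupled particles the matrix is larger and the rates depend on the dipolar fields, but the distribution of the resulting time scales is exactly what the paper analyses as a function of chain length.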

  11. Assessment of particle size distribution in CO2 accidental releases

    NARCIS (Netherlands)

    Hulsbosch-Dam, C.E.C.; Spruijt, M.P.N.; Necci, A.; Cozzani, V.

    2012-01-01

    A model was developed to calculate the particle size distribution following the release of pressurised supercritical CO2. The model combines several sub-models for the different stages of jet break-up and specifically addresses the possible formation of solid particles, which is important for CO2

  12. Predicting the distribution of contamination from a chlorinated hydrocarbon release

    Energy Technology Data Exchange (ETDEWEB)

    Lupo, M.J. [K.W. Brown Environmental Services, College Station, TX (United States); Moridis, G.J. [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    1995-03-01

    The T2VOC model with the T2CG1 conjugate gradient package was used to simulate the motion of a dense chlorinated hydrocarbon plume released from an industrial plant. The release involved thousands of kilograms of trichloroethylene (TCE) and other chemicals that were disposed of onsite over a period of nearly twenty years. After the disposal practice ceased, an elongated plume was discovered. Because much of the plume underlies a developed area, it was of interest to study the migration history of the plume to determine the distribution of the contamination.

  13. Global and local consistencies in distributed fault diagnosis for discrete-event systems

    NARCIS (Netherlands)

    Su, R.; Wonham, W.M.

    2005-01-01

    In this paper, we present a unified framework for distributed diagnosis. We first introduce the concepts of global and local consistency in terms of supremal global and local supports, then present two distributed diagnosis problems based on them. After that, we provide algorithms to achieve

  14. Self-consistent particle distribution of a bunched beam in RF field

    CERN Document Server

    Batygin, Y K

    2002-01-01

    An analytical solution for the self-consistent particle equilibrium distribution in an RF field with transverse focusing is found. The solution is attained in the approximation of a high brightness beam. The distribution function in phase space is determined as a stationary function of the energy integral. Equipartitioning of the beam distribution between degrees of freedom follows directly from the choice of the stationary distribution function. Analytical expressions for r-z equilibrium beam profile and maximum beam current in RF field are obtained.

  15. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also… has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily… is inconsistent. It is also important that disconnected locations can operate in a meaningful way in so-called disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has…

  16. Tunneling and reflection in unimolecular reaction kinetic energy release distributions

    Science.gov (United States)

    Hansen, K.

    2018-02-01

    The kinetic energy release distributions in unimolecular reactions are calculated with detailed balance theory, taking into account the tunneling and reflection coefficients in three different types of transition states: (i) a saddle point corresponding to a standard RRKM-type theory, (ii) an attachment Langevin cross section, and (iii) an absorbing sphere potential at short range, without long-range interactions. Corrections are significant in the one-dimensional saddle-point states. Very light and lightly bound absorbing systems will show measurable effects in decays from the absorbing sphere, whereas the Langevin cross section is essentially unchanged.

  17. Nicotine content of electronic cigarettes, its release in vapour and its consistency across batches: regulatory implications.

    Science.gov (United States)

    Goniewicz, Maciej L; Hajek, Peter; McRobbie, Hayden

    2014-03-01

    Electronic cigarettes (EC) may have a potential for public health benefit as a safer alternative to smoking, but questions have been raised about whether EC should be licensed as a medicine, with accurate labelling of nicotine content. This study determined the nicotine content of the cartridges of the most popular EC brands in the United Kingdom and the nicotine levels they deliver in the vapour, and estimated the safety and consistency of nicotine delivery across batches of the same product as a proxy for quality control for individual brands and within the industry. We studied five UK brands (six products) with high internet popularity. Two samples of each brand were purchased 4 weeks apart, and analysed for nicotine content in the cartridges and nicotine delivery in vapour. The nicotine content of cartridges within the same batch varied by up to 12% relative standard deviation (RSD) and the mean difference between different batches of the same brand ranged from 1% [95% confidence interval (CI) = -5 to 7%] to 20% (95% CI=14-25%) for five brands and 31% (95% CI=21-39%) for the sixth. The puffing schedule used in this study vaporized 10-81% of the nicotine present in the cartridges. The nicotine delivery from 300 puffs ranged from ∼2 mg to ∼15 mg and was not related significantly to the variation of nicotine content in e-liquid (r=0.06, P=0.92). None of the tested products allowed access to e-liquid or produced vapour nicotine concentrations as high as conventional cigarettes. There is very little risk of nicotine toxicity from major electronic cigarette (EC) brands in the United Kingdom. Variation in nicotine concentration in the vapour from a given brand is low. Nicotine concentration in e-liquid is not well related to nicotine in vapour. Other EC brands may be of lower quality and consumer protection regulation needs to be implemented, but in terms of accuracy of labelling of nicotine content and risks of nicotine overdose, regulation over and above

  18. Towards an Information Model of Consistency Maintenance in Distributed Interactive Applications

    Directory of Open Access Journals (Sweden)

    Xin Zhang

    2008-01-01

    Full Text Available A novel framework to model and explore predictive contract mechanisms in distributed interactive applications (DIAs using information theory is proposed. In our model, the entity state update scheme is modelled as an information generation, encoding, and reconstruction process. Such a perspective facilitates a quantitative measurement of state fidelity loss as a result of the distribution protocol. Results from an experimental study on a first-person shooter game are used to illustrate the utility of this measurement process. We contend that our proposed model is a starting point to reframe and analyse consistency maintenance in DIAs as a problem in distributed interactive media compression.
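    A common predictive contract mechanism in DIAs is dead reckoning: the sender transmits a state update only when the receiver's extrapolation drifts past an error threshold, trading bandwidth against exactly the state-fidelity loss the model above quantifies. The numbers in this toy loop are illustrative, not from the paper's experiments:

```python
# Toy dead-reckoning loop: a receiver extrapolates position linearly
# from the last transmitted state; the sender issues a real update only
# when the extrapolation error exceeds a threshold.
threshold = 0.5
true_pos, true_vel = 0.0, 1.0
known_pos, known_vel = 0.0, 1.0   # last state the receiver was sent
last_t = 0
updates, total_error = 0, 0.0

for t in range(1, 101):
    true_vel += 0.02              # the entity slowly accelerates
    true_pos += true_vel
    predicted = known_pos + known_vel * (t - last_t)
    err = abs(true_pos - predicted)
    total_error += err            # accumulated fidelity loss
    if err > threshold:           # contract broken: send a real update
        known_pos, known_vel, last_t = true_pos, true_vel, t
        updates += 1

print(updates)  # far fewer than the 100 simulated ticks
```

    Viewing the transmitted updates as an encoding of the entity's state stream, as the paper does, makes the bandwidth/fidelity trade-off directly measurable in information-theoretic terms.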

  19. Calculation of the self-consistent current distribution and coupling of an RF antenna array

    International Nuclear Information System (INIS)

    Ballico, M.; Puri, S.

    1993-10-01

    A self-consistent calculation of the antenna current distribution and fields in an axisymmetric cylindrical geometry for the ICRH antenna-plasma coupling problem is presented. Several features distinguish this calculation from other codes presently available: (1) Variational form: the formulation of the self-consistent antenna current problem in a variational form allows good convergence and stability of the algorithm. (2) Multiple straps: allows modelling of (a) the current distribution across the width of the strap (by dividing it up into sub-straps), (b) side limiters and septum, and (c) antenna cross-coupling. (3) Analytic calculation of the antenna field and of the antenna self-consistent current distribution (given the surface impedance matrix) gives rapid computation. (4) Framing for parallel computation on several different parallel architectures (as well as serial) gives a large speed improvement to the user. Results are presented for both Alfven wave heating and current drive antenna arrays, showing the optimal coupling to be achieved for toroidal mode numbers 8 < n < 10 for typical ASDEX Upgrade plasmas. Simulations of the ASDEX Upgrade antenna show the importance of the current distribution across the antenna and of image currents flowing in the side limiters, and an analysis of a proposed asymmetric ITER antenna is presented. (orig.)

  20. Toward a consistent model for strain accrual and release for the New Madrid Seismic Zone, central United States

    Science.gov (United States)

    Hough, S.E.; Page, M.

    2011-01-01

    At the heart of the conundrum of seismogenesis in the New Madrid Seismic Zone is the apparently substantial discrepancy between low strain rate and high recent seismic moment release. In this study we revisit the magnitudes of the four principal 1811–1812 earthquakes using intensity values determined from individual assessments from four experts. Using these values and the grid search method of Bakun and Wentworth (1997), we estimate magnitudes around 7.0 for all four events, values that are significantly lower than previously published magnitude estimates based on macroseismic intensities. We further show that the strain rate predicted from postglacial rebound is sufficient to produce a sequence with the moment release of one Mmax 6.8 event every 500 years, a rate that is much lower than previous estimates of late Holocene moment release. However, Mw 6.8 is at the low end of the uncertainty range inferred from analysis of intensities for the largest 1811–1812 event. We show that Mw 6.8 is also a reasonable value for the largest main shock given a plausible rupture scenario. One can also construct a range of consistent models that permit a somewhat higher Mmax, with a longer average recurrence rate. It is thus possible to reconcile predicted strain and seismic moment release rates with alternative models: one in which 1811–1812 sequences occur every 500 years, with the largest events being Mmax ∼6.8, or one in which sequences occur, on average, less frequently, with Mmax ∼7.0. Both models predict that the late Holocene rate of activity will continue for the next few to ten thousand years.
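    The trade-off between the two models can be checked with back-of-envelope arithmetic: strain accrual fixes the moment *rate*, so a larger Mmax must recur less often. Using the standard moment-magnitude relation M0 = 10^(1.5 Mw + 9.1) N·m (the 9.1 constant is the conventional choice; the ratio below does not depend on it):

```python
# Moment-rate bookkeeping behind the two alternative models above.
def moment(mw):
    # Standard moment-magnitude relation, seismic moment in N*m.
    return 10 ** (1.5 * mw + 9.1)

ratio = moment(7.0) / moment(6.8)
print(ratio)  # ~2: an Mw 7.0 releases about twice the moment of an Mw 6.8

# If strain supports one Mw 6.8 per 500 years, the same moment rate
# supports one Mw 7.0 roughly every 1000 years.
recurrence_68 = 500.0
recurrence_70 = recurrence_68 * ratio
```

    This 10^0.3 ≈ 2 factor per 0.2 magnitude units is why the paper can reconcile the same strain budget with either Mmax ∼6.8 every 500 years or Mmax ∼7.0 at a longer recurrence.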

  1. Distribution of corticotropin-releasing factor receptors in primate brain

    International Nuclear Information System (INIS)

    Millan, M.A.; Jacobowitz, D.M.; Hauger, R.L.; Catt, K.J.; Aguilera, G.

    1986-01-01

    The distribution and properties of receptors for corticotropin-releasing factor (CRF) were analyzed in the brain of cynomolgus monkeys. Binding of [125I]tyrosine-labeled ovine CRF to frontal cortex and amygdala membrane-rich fractions was saturable, specific, and time- and temperature-dependent, reaching equilibrium in 30 min at 23 °C. Scatchard analysis of the binding data indicated one class of high-affinity sites with a Kd of 1 nM and a concentration of 125 fmol/mg. As in the rat pituitary and brain, CRF receptors in monkey cerebral cortex and amygdala were coupled to adenylate cyclase. Autoradiographic analysis of specific CRF binding in brain sections revealed that the receptors were widely distributed in the cerebral cortex and limbic system. Receptor density was highest in the pars tuberalis of the pituitary and throughout the cerebral cortex, specifically in the prefrontal, frontal, orbital, cingulate, insular, and temporal areas, and in the cerebellar cortex. A low binding density was present in the superior colliculus, locus coeruleus, substantia gelatinosa, preoptic area, septal area, and bed nucleus of the stria terminalis. These data demonstrate that receptors for CRF are present within the primate brain at areas related to the central control of visceral function and behavior, suggesting that brain CRF may serve as a neurotransmitter in the coordination of endocrine and neural mechanisms involved in the response to stress.

  2. Liver cancer cells: targeting and prolonged-release drug carriers consisting of mesoporous silica nanoparticles and alginate microspheres.

    Science.gov (United States)

    Liao, Yu-Te; Liu, Chia-Hung; Yu, Jiashing; Wu, Kevin C-W

    2014-01-01

    A new microsphere consisting of inorganic mesoporous silica nanoparticles (MSNs) and organic alginate (denoted as MSN@Alg) was successfully synthesized by air-dynamic atomization and applied to the intracellular drug delivery systems (DDS) of liver cancer cells with sustained release and specific targeting properties. MSN@Alg microspheres have the advantages of MSN and alginate, where MSN provides a large surface area for high drug loading and alginate provides excellent biocompatibility and COOH functionality for specific targeting. Rhodamine 6G was used as a model drug, and the sustained release behavior of the rhodamine 6G-loaded MSN@Alg microspheres can be prolonged up to 20 days. For targeting therapy, the anticancer drug doxorubicin was loaded into MSN@Alg microspheres, and the (lysine)4-tyrosine-arginine-glycine-aspartic acid (K4YRGD) peptide was functionalized onto the surface of MSN@Alg for targeting liver cancer cells, hepatocellular carcinoma (HepG2). The results of the 3-[4,5-dimethylthiazol-2-yl]-2,5 diphenyl tetrazolium bromide (MTT) assay and confocal laser scanning microscopy indicate that the MSN@Alg microspheres were successfully taken up by HepG2 without apparent cytotoxicity. In addition, the intracellular drug delivery efficiency was greatly enhanced (ie, 3.5-fold) for the arginine-glycine-aspartic acid (RGD)-labeled, doxorubicin-loaded MSN@Alg drug delivery system compared with the non-RGD case. The synthesized MSN@Alg microspheres show great potential as drug vehicles with high biocompatibility, sustained release, and targeting features for future intracellular DDS.

  3. Liver cancer cells: targeting and prolonged-release drug carriers consisting of mesoporous silica nanoparticles and alginate microspheres

    Directory of Open Access Journals (Sweden)

    Liao YT

    2014-06-01

    Full Text Available Yu-Te Liao,1 Chia-Hung Liu,2 Jiashing Yu,1 Kevin C-W Wu1,3 1Department of Chemical Engineering, National Taiwan University, Taipei, Taiwan; 2Department of Urology, Taipei Medical University-Shuang Ho Hospital, New Taipei City, Taiwan; 3Division of Medical Engineering Research, National Health Research Institutes, Zhunan Township, Miaoli County, Taiwan Abstract: A new microsphere consisting of inorganic mesoporous silica nanoparticles (MSNs) and organic alginate (denoted as MSN@Alg) was successfully synthesized by air-dynamic atomization and applied to the intracellular drug delivery systems (DDS) of liver cancer cells with sustained release and specific targeting properties. MSN@Alg microspheres have the advantages of MSN and alginate, where MSN provides a large surface area for high drug loading and alginate provides excellent biocompatibility and COOH functionality for specific targeting. Rhodamine 6G was used as a model drug, and the sustained release behavior of the rhodamine 6G-loaded MSN@Alg microspheres can be prolonged up to 20 days. For targeting therapy, the anticancer drug doxorubicin was loaded into MSN@Alg microspheres, and the (lysine)4-tyrosine-arginine-glycine-aspartic acid (K4YRGD) peptide was functionalized onto the surface of MSN@Alg for targeting liver cancer cells, hepatocellular carcinoma (HepG2). The results of the 3-[4,5-dimethylthiazol-2-yl]-2,5 diphenyl tetrazolium bromide (MTT) assay and confocal laser scanning microscopy indicate that the MSN@Alg microspheres were successfully taken up by HepG2 without apparent cytotoxicity. In addition, the intracellular drug delivery efficiency was greatly enhanced (ie, 3.5-fold) for the arginine-glycine-aspartic acid (RGD)-labeled, doxorubicin-loaded MSN@Alg drug delivery system compared with the non-RGD case. The synthesized MSN@Alg microspheres show great potential as drug vehicles with high biocompatibility, sustained release, and targeting features for future intracellular DDS.

  4. Longitudinal halo in beam bunches with self-consistent 6-D distributions

    International Nuclear Information System (INIS)

    Gluckstern, R. L.; Fedotov, A. V.; Kurennoy, S. S.; Ryne, R. D.

    1998-01-01

    We have explored the formation of longitudinal and transverse halos in 3-D axisymmetric beam bunches by starting with a self-consistent 6-D phase space distribution. Stationary distributions allow us to study the halo development mechanism without being obscured by beam redistribution and its effect on halo formation. The beam is then mismatched longitudinally and/or transversely, and we explore the rate, intensity and spatial extent of the halos which form, as a function of the beam charge and the mismatches. We find that the longitudinal halo forms first because the longitudinal tune depression is more severe than the transverse one for elongated bunches and conclude that it plays a major role in halo formation

  5. Nonlinear Ion-Acoustic Waves in a Plasma Consisting of Warm Ions and Isothermal Distributed Electrons

    International Nuclear Information System (INIS)

    Abourabia, A.M.; Hassan, K.M.; Shahein, R.A.

    2008-01-01

    The formation of (1+1)-dimensional ion-acoustic waves (IAWs) in an unmagnetized collisionless plasma consisting of warm ions and isothermally distributed electrons is investigated. The electrodynamic system of equations is solved analytically in terms of a new travelling-wave variable ξ = kχ − φτ, where k = k(ω) is a complex function, at a fixed position. The analytical calculations show that the critical value σ ≈ 0.25 distinguishes between the linear and nonlinear characters of IAWs within the nanosecond time scale. The flow velocity, pressure, number density, electric potential, electric field, mobility and the total energy in the system are estimated and illustrated.

  6. Integrated Scheduling of Production and Distribution with Release Dates and Capacitated Deliveries

    Directory of Open Access Journals (Sweden)

    Xueling Zhong

    2016-01-01

    Full Text Available This paper investigates an integrated production and distribution scheduling model in a supply chain consisting of a single machine, a customer, and a sufficient number of homogeneous capacitated vehicles. In this model, the customer places a set of orders, each with a given release date. All orders are first processed nonpreemptively on the machine and then batch delivered to the customer. Two variations of the model with different objective functions are studied: one minimizes the arrival time of the last order plus total distribution cost, and the other minimizes the total arrival time of the orders plus total distribution cost. For the former, we provide a polynomial-time exact algorithm. For the latter, which is NP-hard, we provide a heuristic with a worst-case ratio bound of 2.
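    The model's cost accounting (not the paper's algorithms) can be sketched for the first objective: process orders nonpreemptively respecting release dates, ship full batches of at most `capacity` orders, and charge a fixed cost per trip. The policy and all numbers below are illustrative:

```python
# Illustrative cost accounting for the integrated model: single machine,
# release dates, batch deliveries of limited capacity, fixed trip cost.
def schedule_cost(orders, capacity, trip_cost, travel_time):
    """orders: list of (release_date, processing_time), processed in
    release-date order. Returns arrival time of the last order plus
    total distribution cost for the naive 'ship when a batch is full'
    policy (a baseline, not the paper's exact algorithm)."""
    t, finished = 0.0, []
    for release, proc in sorted(orders):
        t = max(t, release) + proc   # machine is single and nonpreemptive
        finished.append(t)
    batches = [finished[i:i + capacity]
               for i in range(0, len(finished), capacity)]
    last_arrival = 0.0
    for batch in batches:
        depart = max(batch)          # a batch leaves when its last order is done
        last_arrival = max(last_arrival, depart + travel_time)
    return last_arrival + trip_cost * len(batches)

cost = schedule_cost([(0, 2), (1, 3), (4, 1), (9, 2)],
                     capacity=2, trip_cost=5.0, travel_time=1.0)
print(cost)  # 22.0: last arrival at 12.0 plus two trips at 5.0 each
```

    The interesting algorithmic question, which the paper answers, is how to choose batch boundaries and departure times optimally rather than greedily.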

  7. Prediction of HAMR Debris Population Distribution Released from GEO Space

    Science.gov (United States)

    Rosengren, A.; Scheeres, D.

    2012-09-01

    in inclination. When the nodal rate of the system is commensurate with the nodal rate of the Moon, the perturbations build up more effectively over long periods to produce significant effects on the orbit. Such resonances, which occur for a class of HAMR objects that are not cleared out of orbit, give rise to strongly changing dynamics over longer time periods. In this paper, we present the averaged model, and discuss its fundamental predictions and comparisons with explicit long-term numerical integrations of HAMR objects in GEO space. Using this tool, we study a range of HAMR objects, released in geostationary orbit, with various area-to-mass ratios, and predict the spatiotemporal distribution of the population. We identified a unique systematic structure associated with their distribution in inclination and ascending-node phase space. Given that HAMR objects are the most difficult to target from an observational point of view, this work will have many implications for the space surveillance community, and will allow observers to implement better search strategies for this class of debris.

  8. Design of micro distribution systems consisting of long channels with arbitrary cross sections

    International Nuclear Information System (INIS)

    Misdanitis, S; Valougeorgis, D

    2012-01-01

    Gas flows through long micro-channels of various cross sections have been extensively investigated over the years both numerically and experimentally. In various technological applications including microfluidics, these micro-channels are combined together in order to form a micro-channel network. Computational algorithms for solving gas pipe networks in the hydrodynamic regime are well developed. However, corresponding tools for solving networks consisting of micro-channels under any degree of gas rarefaction is very limited. Recently a kinetic algorithm has been developed to simulate gas distribution systems consisting of long circular channels under any vacuum conditions. In the present work this algorithm is generalized and extended into micro-channels of arbitrary cross-section etched by KOH in silicon (triangular and trapezoidal channels with acute angle of 54.74°). Since a kinetic approach is implemented, the analysis is valid and the results are accurate in the whole range of the Knudsen number, while the involved computational effort is very small. This is achieved by successfully integrating the kinetic results for the corresponding single channels into the general solver for designing the gas pipe network. To demonstrate the feasibility of the approach two typical systems consisting of long rectangular and trapezoidal micro-channels are solved.
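    The network-solver step described above can be sketched once each long channel's flow is reduced to Q = C (P_in − P_out) with a conductance C: the network then becomes a linear system on the node pressures, exactly like a resistor network. The conductances below are made-up numbers standing in for the kinetic single-channel results:

```python
import numpy as np

# Toy channel network: assemble the conductance-weighted Laplacian and
# solve for node pressures with fixed inlet/outlet (Dirichlet) values.
channels = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 1.5), (2, 3, 0.5)]  # (i, j, C)
n_nodes = 4
fixed = {0: 100.0, 3: 0.0}   # boundary pressures: inlet node 0, outlet node 3

A = np.zeros((n_nodes, n_nodes))
b = np.zeros(n_nodes)
for i, j, c in channels:       # weighted graph Laplacian
    A[i, i] += c; A[j, j] += c
    A[i, j] -= c; A[j, i] -= c
for node, p in fixed.items():  # impose the boundary pressures
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = p

pressures = np.linalg.solve(A, b)
# Interior pressures fall monotonically from inlet to outlet here, and
# mass is conserved at every interior node.
```

    The rarefaction dependence enters only through the single-channel conductances, which is why integrating the kinetic results into a standard network solver keeps the overall computational effort small.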

  9. Topologically Consistent Models for Efficient Big Geo-Spatio Data Distribution

    Science.gov (United States)

    Jahn, M. W.; Bradley, P. E.; Doori, M. Al; Breunig, M.

    2017-10-01

    Geo-spatio-temporal topology models are likely to become a key concept for checking the consistency of 3D (spatial) and 4D (spatial + temporal) models in emerging GIS applications such as subsurface reservoir modelling or the simulation of energy and water supply of mega or smart cities. Furthermore, data management for complex models consisting of big geo-spatial data is a challenge for GIS and geo-database research. General challenges, concepts, and techniques of big geo-spatial data management are presented. In this paper we introduce a sound mathematical approach to a topologically consistent geo-spatio-temporal model based on the concept of the incidence graph. We redesign DB4GeO, our service-based geo-spatio-temporal database architecture, on the way to the parallel management of massive geo-spatial data. Approaches for a new geo-spatio-temporal and object model of DB4GeO meeting the requirements of big geo-spatial data are discussed in detail. Finally, a conclusion and an outlook on our future research are given, with the aim of supporting geo-analytics and simulations in a parallel and distributed system environment.
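    The incidence-graph idea mentioned above can be made concrete with a small consistency check: store which vertices each edge is incident to and which edges bound each face, then verify structural invariants such as closed face boundaries. The sketch below is an illustrative simplification for 2D complexes, not the DB4GeO data model; all names are hypothetical.

```python
def check_incidence(faces, edges):
    """Check topological consistency of a simple 2D cell complex.

    edges: {edge_id: (vertex_a, vertex_b)}
    faces: {face_id: [edge_id, ...]}  (the face's boundary edges)
    Returns a list of human-readable consistency violations.
    """
    errors = []
    for e, (a, b) in edges.items():
        if a == b:
            errors.append(f"edge {e} is degenerate (both ends at vertex {a})")
    for f, boundary in faces.items():
        # In a closed boundary loop, every vertex is met by an even number
        # of the face's edges (normally exactly two).
        degree = {}
        for e in boundary:
            for v in edges[e]:
                degree[v] = degree.get(v, 0) + 1
        if any(d % 2 for d in degree.values()):
            errors.append(f"face {f} has an open (non-closed) boundary")
    return errors

# A square face is consistent; dropping one boundary edge breaks closure.
square_edges = {"e1": (1, 2), "e2": (2, 3), "e3": (3, 4), "e4": (4, 1)}
ok = check_incidence({"F": ["e1", "e2", "e3", "e4"]}, square_edges)
bad = check_incidence({"F": ["e1", "e2", "e3"]}, square_edges)
```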

  10. Self-consistent relativistic Boltzmann-Uehling-Uhlenbeck equation for the Δ distribution function

    International Nuclear Information System (INIS)

    Mao, G.; Li, Z.; Zhuo, Y.

    1996-01-01

    We derive the self-consistent relativistic Boltzmann-Uehling-Uhlenbeck (RBUU) equation for the delta distribution function within the framework we have used for nucleons. In our approach, the Δ isobars are treated in essentially the same way as nucleons. Both the mean-field and collision terms of the Δ's RBUU equation are derived from the same effective Lagrangian and presented analytically. We calculate the in-medium NΔ elastic and inelastic scattering cross sections up to twice nuclear matter density, and the results show that the in-medium cross sections deviate substantially from Cugnon's parametrization that is commonly used in transport models. © 1996 The American Physical Society

  11. Reconfigurable magnonic crystal consisting of periodically distributed domain walls in a nanostrip

    International Nuclear Information System (INIS)

    Li, Zhi-xiong; Wang, Xi-guang; Wang, Dao-wei; Nie, Yao-zhuang; Tang, Wei; Guo, Guang-hua

    2015-01-01

    We study spin wave propagation in a new type of magnonic crystal consisting of a series of periodically distributed magnetic domain walls in a nanostrip by micromagnetic simulation. Spin wave bands and bandgaps are observed in frequency spectra and dispersion curves. Some bandgaps are caused by the Bragg reflection of the spin wave modes at the Brillouin zone boundaries, while others originate from the coupling between different incident and reflected spin wave modes. The control of the spin wave band structure by changing the magnetocrystalline anisotropy or applying an external magnetic field is studied. Increasing the magnetocrystalline anisotropy leads to an increase of the bandgaps. The external field applied perpendicular to the nanostrip gives rise to a doubling of the domain-wall magnonic crystal period. As a result, more bandgaps appear on the frequency spectra of propagating spin waves. The results presented here may find their use in the design of reconfigurable magnonic devices. - Highlights: • A reconfigurable magnonic crystal consisting of domain walls in a uniform nanostrip is proposed. • Propagating characteristics of spin waves in such magnonic crystal are studied. • Spin-wave band structures can be effectively manipulated by magnetic anisotropy or magnetic field

  12. The influence of spray-drying parameters on phase behavior, drug distribution, and in vitro release of injectable microspheres for sustained release.

    Science.gov (United States)

    Meeus, Joke; Lenaerts, Maité; Scurr, David J; Amssoms, Katie; Davies, Martyn C; Roberts, Clive J; Van Den Mooter, Guy

    2015-04-01

    For ternary solid dispersions, it is indispensable to characterize their structure, phase behavior, and the spatial distribution of the dispersed drug, as these might influence the release profile and/or stability of the formulations. This study shows how formulation (feed concentration) and process (feed rate, inlet air temperature, and atomizing air pressure) parameters can influence the characteristics of ternary spray-dried solid dispersions. The microspheres considered here consist of a poly(lactic-co-glycolic acid) (PLGA) surface layer and an underlying polyvinylpyrrolidone (PVP) phase. A poorly soluble active pharmaceutical ingredient (API) was molecularly dispersed in this matrix. Differences were observed in component miscibility, phase heterogeneity, particle size, morphology, and API surface coverage for the selected spray-drying parameters. The observed differences likely arise from changes in droplet generation, evaporation, and thus the particle formation process. However, the varying particle characteristics did not influence drug release from the formulations studied, indicating the robustness of this approach for producing particles with consistent drug release characteristics. This is likely because release is dominated by diffusion from the PVP layer through pores in the PLGA surface layer, so the observed differences in the latter have no influence on the release. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  13. Southern San Andreas Fault seismicity is consistent with the Gutenberg-Richter magnitude-frequency distribution

    Science.gov (United States)

    Page, Morgan T.; Felzer, Karen

    2015-01-01

    The magnitudes of any collection of earthquakes nucleating in a region are generally observed to follow the Gutenberg-Richter (G-R) distribution. On some major faults, however, paleoseismic rates are higher than a G-R extrapolation from the modern rate of small earthquakes would predict. This, along with other observations, led to formulation of the characteristic earthquake hypothesis, which holds that the rate of small to moderate earthquakes is permanently low on large faults relative to the large-earthquake rate (Wesnousky et al., 1983; Schwartz and Coppersmith, 1984). We examine the rate difference between recent small to moderate earthquakes on the southern San Andreas fault (SSAF) and the paleoseismic record, hypothesizing that the discrepancy can be explained as a rate change in time rather than a deviation from G-R statistics. We find that with reasonable assumptions, the rate changes necessary to bring the small and large earthquake rates into alignment agree with the size of rate changes seen in epidemic-type aftershock sequence (ETAS) modeling, where aftershock triggering of large earthquakes drives strong fluctuations in the seismicity rates for earthquakes of all magnitudes. The necessary rate changes are also comparable to rate changes observed for other faults worldwide. These results are consistent with paleoseismic observations of temporally clustered bursts of large earthquakes on the SSAF and the absence of M ≥ 7 earthquakes on the SSAF since 1857.
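    The G-R extrapolation at the heart of this comparison is the magnitude-frequency relation log10 N(≥M) = a - bM. The sketch below shows how a modern small-earthquake rate extrapolates to a large-earthquake rate; the a and b values are hypothetical round numbers for illustration, not fitted SSAF parameters.

```python
def gr_rate(a, b, magnitude):
    """Annual rate of earthquakes with M >= magnitude: log10 N = a - b*M."""
    return 10 ** (a - b * magnitude)

# Hypothetical values for illustration only (b ~ 1 is typical of most regions).
a, b = 4.0, 1.0
rate_m3 = gr_rate(a, b, 3.0)  # modern rate of small events: 10 per year
rate_m7 = gr_rate(a, b, 7.0)  # G-R extrapolation: one M>=7 event per 1000 years
```

    A paleoseismic large-earthquake rate above this extrapolation is what motivates either the characteristic earthquake hypothesis or, as the paper argues, a temporal rate change.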

  14. Training Classifiers under Covariate Shift by Constructing the Maximum Consistent Distribution Subset

    OpenAIRE

    Yu, Xu; Yu, Miao; Xu, Li-xun; Yang, Jing; Xie, Zhi-qiang

    2015-01-01

    The assumption that training and testing samples are drawn from the same distribution is violated in the covariate shift setting, and most algorithms for covariate shift first estimate the distributions and then reweight samples based on the estimated distributions. Because estimating a correct distribution is difficult, previous methods cannot achieve good classification performance. In this paper, we first present two types of covariate shift problems. Rather than estim...
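    The reweighting approach the paper argues against can be summarized in a few lines: weight each training sample by the density ratio p_test(x)/p_train(x) and train on the weighted sample. The sketch below uses hypothetical one-dimensional Gaussians for the two distributions; in practice the ratio must be estimated from data, which is exactly the difficulty motivating the paper's subset-construction alternative.

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_weights(xs, train=(0.0, 1.0), test=(1.0, 1.0)):
    """Weight each training point by the density ratio p_test(x) / p_train(x).

    The (mean, sigma) pairs for the two Gaussians are hypothetical stand-ins;
    real covariate shift methods must estimate these densities (or their
    ratio), which is the hard step.
    """
    return [gaussian_pdf(x, *test) / gaussian_pdf(x, *train) for x in xs]

# Training points closer to the test distribution's mean receive larger weights.
ws = importance_weights([0.0, 0.5, 1.0])
```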

  15. An Analysis of Weakly Consistent Replication Systems in an Active Distributed Network

    OpenAIRE

    Amit Chougule; Pravin Ghewari

    2011-01-01

    With the sudden increase in heterogeneity and distribution of data in wide-area networks, more flexible, efficient and autonomous approaches for management and data distribution are needed. In recent years, the proliferation of inter-networks and distributed applications has increased the demand for geographically-distributed replicated databases. The architecture of Bayou provides features that address the needs of database storage of world-wide applications. Key is the use of weak consisten...

  16. Exploring Environmental Inequity in South Korea: An Analysis of the Distribution of Toxic Release Inventory (TRI) Facilities and Toxic Releases

    Directory of Open Access Journals (Sweden)

    D. K. Yoon

    2017-10-01

    Recently, location data for the Toxic Release Inventory (TRI) in South Korea were released to the public. This study investigated the spatial patterns of TRI facilities and releases of toxic substances in all 230 local governments in South Korea to determine whether spatial clusters relevant to the siting of noxious facilities occur. In addition, we employed spatial regression modeling to determine whether the number of TRI facilities and the volume of toxic releases in a given community were correlated with the community's socioeconomic, racial, political, and land use characteristics. We found that TRI facilities and their toxic releases were disproportionately distributed, with clustered spatial patterning. Spatial regression modeling indicated that jurisdictions with smaller percentages of minorities, stronger political activity, less industrial land use, and more commercial land use had smaller numbers of toxic releases, as well as smaller numbers of TRI facilities. However, the economic status of a community did not affect the siting of hazardous facilities. These results indicate that the siting of TRI facilities in Korea is more affected by sociopolitical factors than by economic status. Racial issues are thus crucial considerations for environmental justice as the population of Korea becomes more racially and ethnically diverse.

  17. Longitudinal motion in high current ion beams: a self-consistent phase space distribution with an envelope equation

    International Nuclear Information System (INIS)

    Neuffer, D.

    1979-03-01

    Many applications of particle acceleration, such as heavy ion fusion, require longitudinal bunching of a high-intensity particle beam to extremely high particle currents with correspondingly high space-charge forces. This requires a precise analysis of longitudinal motion, including stability analysis. Previous papers have treated the longitudinal space-charge force as strictly linear and have not been self-consistent; that is, they have not displayed a phase-space distribution consistent with this linear force, so that the transport of the phase-space distribution could be followed and departures from linearity analyzed. This is unlike the situation for transverse phase space, where the Kapchinskij-Vladimirskij (K-V) distribution can be used as the basis of an analysis of transverse motion. In this paper a self-consistent particle distribution in longitudinal phase space is derived which is a solution of the Vlasov equation, and an envelope equation for this solution is derived.

  18. Simulation of distributed parameter system consisting of charged and neutral particles

    International Nuclear Information System (INIS)

    Grover, P.S.; Sinha, K.V.

    1986-01-01

    The time-dependent behavior of positively charged light particles has been simulated in an assembly of heavy gas atoms. The system is formulated in terms of a partial differential equation. The stability and convergence of the numerical algorithm have been examined. Using this formulation, the effects of an external electric field and of temperature on the lifetime and distribution-function characteristics of the charged particles have been investigated.

  19. Portable memory consistency for software managed distributed memory in many-core SoC

    NARCIS (Netherlands)

    Rutgers, J.H.; Bekooij, Marco Jan Gerrit; Smit, Gerardus Johannes Maria

    2013-01-01

    Porting software to different platforms can require modifications of the application. One of the issues is that the targeted hardware supports another memory consistency model. As a consequence, the completion order of reads and writes in a multi-threaded application can change, which may result in

  20. Investigation on Oracle GoldenGate Veridata for Data Consistency in WLCG Distributed Database Environment

    OpenAIRE

    Asko, Anti; Lobato Pardavila, Lorena

    2014-01-01

    In a distributed database environment, data divergence can be an important problem: if it is not discovered and correctly identified, incorrect data can lead to poor decision making, service errors, and operational errors. Oracle GoldenGate Veridata is a product for comparing two sets of data and identifying and reporting on data that is out of synchronization. IT DB is providing a replication service between databases at CERN and other computer centers worldwide as a par...
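    The core task described above, comparing two keyed datasets and reporting rows that are out of synchronization, can be illustrated generically. The sketch below diffs two small in-memory tables by primary key and row digest; it is a conceptual illustration, not Veridata's actual comparison algorithm.

```python
import hashlib

def row_digest(row):
    """Stable digest of a row's column values."""
    return hashlib.sha256("|".join(map(repr, row)).encode()).hexdigest()

def compare_tables(source, target):
    """Report out-of-sync rows between two tables keyed by primary key.

    source/target: {primary_key: row_tuple}. Returns (keys missing from the
    target, keys missing from the source, keys whose rows diverge). A real
    tool would compare digests of row batches first to avoid moving data.
    """
    only_in_source = sorted(set(source) - set(target))
    only_in_target = sorted(set(target) - set(source))
    mismatched = sorted(k for k in set(source) & set(target)
                        if row_digest(source[k]) != row_digest(target[k]))
    return only_in_source, only_in_target, mismatched

src = {1: ("alice", 10), 2: ("bob", 20), 3: ("carol", 30)}
tgt = {2: ("bob", 20), 3: ("carol", 99), 4: ("dave", 40)}
report = compare_tables(src, tgt)  # row 1 missing, row 4 extra, row 3 diverged
```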

  1. Are range-size distributions consistent with species-level heritability?

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Gotelli, Nicholas; Rahbek, Carsten

    2012-01-01

    The concept of species-level heritability is widely contested. Because it is most likely to apply to emergent, species-level traits, one of the central discussions has focused on the potential heritability of geographic range size. However, a central argument against range-size heritability has been that it is not compatible with the observed shape of present-day species range-size distributions (SRDs), a claim that has never been tested. To assess this claim, we used forward simulation of range-size evolution in clades with varying degrees of range-size heritability, and compared the output...

  2. Phosphate absorption and distribution in flue-cured tobacco under different ozone concentrations by using 32P

    International Nuclear Information System (INIS)

    Qiang Jiye

    2004-01-01

    The absorption and distribution of phosphate in flue-cured tobacco under different ozone concentrations was studied using 32P. The results showed that in sand culture the percentage of 32P assimilated by the root of the whole tobacco plant decreased during growth, while that in the stem increased. In solution culture, 32P assimilation by the root and stem of flue-cured tobacco varied little over the whole growing period. The distribution in the leaves under the two concentrations followed the order lower leaf > cutter leaf > upper leaf, and the ratio of radioactivity followed root > stem > lower leaf > cutter leaf > upper leaf. Phosphate assimilation under the two concentrations showed a significantly positive correlation with the length of the growth period. Phosphate assimilation in solution culture was greater and faster than in the low-ozone-concentration culture.

  3. Predicting intragranular misorientation distributions in polycrystalline metals using the viscoplastic self-consistent formulation

    DEFF Research Database (Denmark)

    Zecevic, Miroslav; Pantleon, Wolfgang; Lebensohn, Ricardo A.

    2017-01-01

    In a recent paper, we reported the methodology to calculate intragranular fluctuations in the instantaneous lattice rotation rates in polycrystalline materials within the mean-field viscoplastic self-consistent (VPSC) model. This paper is concerned with the time integration and subsequent use…; we calculate intragranular misorientations in face-centered cubic polycrystals deformed in tension and plane-strain compression. These predictions are tested by comparison with corresponding experiments for polycrystalline copper and aluminum, respectively, and with full-field calculations. It is observed that at sufficiently high strains some grains develop large misorientations that may lead to grain fragmentation and/or act as driving forces for recrystallization. The proposed VPSC-based prediction of intragranular misorientations enables modeling of grain fragmentation, as well as a more...

  4. Defined drug release from 3D-printed composite tablets consisting of drug-loaded polyvinylalcohol and a water-soluble or water-insoluble polymer filler.

    Science.gov (United States)

    Tagami, Tatsuaki; Nagata, Noriko; Hayashi, Naomi; Ogawa, Emi; Fukushige, Kaori; Sakai, Norihito; Ozeki, Tetsuya

    2018-05-30

    3D-printed tablets are a promising new approach for personalized medicine. In this study, we fabricated composite tablets consisting of two components, a drug and a filler, using a fused deposition modeling-type 3D printer. Polyvinylalcohol (PVA) polymer containing calcein (a model drug) was used as the drug component, and PVA or polylactic acid (PLA) polymer without drug was used as the water-soluble or water-insoluble filler, respectively. Various kinds of drug-PVA/PVA and drug-PVA/PLA composite tablets were designed, and the 3D-printed tablets exhibited good formability. The surface area of the exposed drug component is highly correlated with the initial drug release rate. Composite tablets with an exposed top and a bottom covered with a PLA layer were fabricated. These tablets showed zero-order drug release because the surface area of the exposed drug component was maintained during drug dissolution. In contrast, the drug release profile varied for tablets whose exposed surface area changed. Composite tablets with different drug release lag times were prepared by changing the thickness of the PVA filler coating the drug component. These results, obtained with PVA and PLA fillers, will provide useful information for preparing multi-component and tailor-made tablets with defined drug release profiles using 3D printers. Copyright © 2018 Elsevier B.V. All rights reserved.
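    The geometric argument above (release rate proportional to the exposed surface area, hence zero-order release when that area is held constant) can be illustrated numerically. The rate constant and area values below are hypothetical, chosen only to show the shape of the two profiles.

```python
def cumulative_release(area_profile, k=0.01, dt=1.0):
    """Integrate dQ/dt = k * A(t): release rate proportional to exposed area.

    k (release constant) and the area values passed in are hypothetical
    numbers for illustration, not measured tablet parameters.
    """
    released, q = [], 0.0
    for area in area_profile:
        q += k * area * dt
        released.append(q)
    return released

# Constant exposed area (top exposed, bottom sealed by PLA): zero-order release,
# i.e. equal increments per time step.
constant_area = cumulative_release([100.0] * 5)
# Shrinking exposed area: the release rate tails off over time.
shrinking_area = cumulative_release([100.0, 80.0, 60.0, 40.0, 20.0])
```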

  5. Interesting association rule mining with consistent and inconsistent rule detection from big sales data in distributed environment

    Directory of Open Access Journals (Sweden)

    Dinesh J. Prajapati

    2017-06-01

    Nowadays, there is an increasing demand for mining interesting patterns from big data. Analyzing such a huge amount of data is a computationally complex task with traditional methods. The purpose of this paper is twofold. First, it presents a novel approach to identify consistent and inconsistent association rules from sales data located in a distributed environment. Second, it overcomes the main-memory bottleneck and computing-time overhead of a single computing system by moving the computation to a multi-node cluster. The proposed method initially extracts frequent itemsets for each zone using existing distributed frequent pattern mining algorithms. The paper also compares the time efficiency of the MapReduce-based frequent pattern mining algorithm with the Count Distribution Algorithm (CDA) and Fast Distributed Mining (FDM) algorithms. The association rules generated from the frequent itemsets are too numerous to analyze manually. Thus, a MapReduce-based consistent and inconsistent rule detection (MR-CIRD) algorithm is proposed to detect the consistent and inconsistent rules from big data and provide useful and actionable knowledge to domain experts. These pruned interesting rules also give useful knowledge for better marketing strategy. The extracted consistent and inconsistent rules are evaluated and compared based on different interestingness measures, presented together with experimental results that lead to the final conclusions.
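    The consistent/inconsistent split can be illustrated with a toy criterion: a rule mined in every zone with nearly the same confidence is consistent, anything else is inconsistent. The sketch below applies such a criterion on a single machine with hypothetical rules and thresholds; the actual MR-CIRD algorithm runs over MapReduce and uses its own interestingness measures.

```python
def classify_rules(zone_rules, max_spread=0.1):
    """Split association rules into consistent and inconsistent sets.

    zone_rules: one {rule: confidence} dict per zone. A rule counts as
    consistent here if it was mined in every zone and its confidence varies
    by at most max_spread across zones; everything else is inconsistent.
    This is an illustrative single-machine criterion only.
    """
    consistent, inconsistent = [], []
    for rule in sorted(set().union(*zone_rules)):
        confs = [zone[rule] for zone in zone_rules if rule in zone]
        if len(confs) == len(zone_rules) and max(confs) - min(confs) <= max_spread:
            consistent.append(rule)
        else:
            inconsistent.append(rule)
    return consistent, inconsistent

# Hypothetical per-zone mining output: rule -> confidence.
zones = [{"bread->butter": 0.90, "tv->stand": 0.40},
         {"bread->butter": 0.85, "tv->stand": 0.80},
         {"bread->butter": 0.88}]
cons, incons = classify_rules(zones)  # bread->butter is consistent across zones
```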

  6. Spatial consistency of chinook salmon redd distribution within and among years in the Cowlitz River, Washington

    Science.gov (United States)

    Klett, Katherine J.C.; Torgersen, Christian E.; Henning, Julie A.; Murray, Christopher J.

    2013-01-01

    We investigated the spawning patterns of Chinook Salmon Oncorhynchus tshawytscha on the lower Cowlitz River, Washington, using a unique set of fine- and coarse-scale temporal and spatial data collected during biweekly aerial surveys conducted in 1991–2009 (500 m to 28 km resolution) and 2008–2009 (100–500 m resolution). Redd locations were mapped from a helicopter during 2008 and 2009 with a hand-held GPS synchronized with in-flight audio recordings. We examined spatial patterns of Chinook Salmon redd reoccupation among and within years in relation to segment-scale geomorphic features. Chinook Salmon spawned in the same sections each year with little variation among years. On a coarse scale, 5 years (1993, 1998, 2000, 2002, and 2009) were compared for reoccupation. Redd locations were highly correlated among years. Comparisons on a fine scale (500 m) between 2008 and 2009 also revealed a high degree of consistency among redd locations. On a finer temporal scale, we observed that Chinook Salmon spawned in the same sections during the first and last week. Redds were clustered in both 2008 and 2009. Regression analysis with a generalized linear model at the 500-m scale indicated that river kilometer and channel bifurcation were positively associated with redd density, whereas sinuosity was negatively associated with redd density. Collecting data on specific redd locations with a GPS during aerial surveys was logistically feasible and cost effective and greatly enhanced the spatial precision of Chinook Salmon spawning surveys.

  7. Spatial consistency of Chinook salmon redd distribution within and among years in the Cowlitz River, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Klett, Katherine J.; Torgersen, Christian; Henning, Julie; Murray, Christopher J.

    2013-04-28

    We investigated the spawning patterns of Chinook salmon Oncorhynchus tshawytscha on the lower Cowlitz River, Washington (USA) using a unique set of fine- and coarse-scale temporal and spatial data collected during bi-weekly aerial surveys conducted in 1991-2009 (500 m to 28 km resolution) and 2008-2009 (100-500 m resolution). Redd locations were mapped from a helicopter during 2008 and 2009 with a hand-held global positioning system (GPS) synchronized with in-flight audio recordings. We examined spatial patterns of Chinook salmon redd reoccupation among and within years in relation to segment-scale geomorphic features. Chinook salmon spawned in the same sections each year with little variation among years. On a coarse scale, five years (1993, 1998, 2000, 2002, and 2009) were compared for reoccupation. Redd locations were highly correlated among years resulting in a minimum correlation coefficient of 0.90 (adjusted P = 0.002). Comparisons on a fine scale (500 m) between 2008 and 2009 also revealed a high degree of consistency among redd locations (P < 0.001). On a finer temporal scale, we observed that salmon spawned in the same sections during the first and last week (2008: P < 0.02; and 2009: P < 0.001). Redds were clustered in both 2008 and 2009 (P < 0.001). Regression analysis with a generalized linear model at the 500-m scale indicated that river kilometer and channel bifurcation were positively associated with redd density, whereas sinuosity was negatively associated with redd density. Collecting data on specific redd locations with a GPS during aerial surveys was logistically feasible and cost effective and greatly enhanced the spatial precision of Chinook salmon spawning surveys.

  8. Pulling it all together: the self-consistent distribution of neutral tori in Saturn's Magnetosphere based on all Cassini observations

    Science.gov (United States)

    Smith, H. T.; Richardson, J. D.

    2017-12-01

    Saturn's magnetosphere is unique in that the plumes from the small icy moon Enceladus serve as the primary source for heavy particles in Saturn's magnetosphere. The resulting co-orbiting neutral particles interact with ions, electrons, photons and other neutral particles to generate separate H2O, OH and O tori. Characterization of these toroidal distributions is essential for understanding Saturn magnetospheric sources, composition and dynamics. Unfortunately, limited direct observations of these features are available, so modeling is required. A significant modeling challenge is ensuring that the plasma and neutral particle populations are not simply input conditions but can provide feedback to each other (i.e., are self-consistent). Jurac and Richardson (2005) executed such a self-consistent model; however, that research was performed prior to the return of Cassini data. In a similar fashion, we have coupled a 3-D neutral particle model (Smith et al. 2004, 2005, 2006, 2007, 2009, 2010) with a plasma transport model (Richardson 1998; Richardson & Jurac 2004) to develop a self-consistent model which is constrained by all available Cassini observations and current findings on Saturn's magnetosphere and the Enceladus plume source, resulting in much more accurate neutral particle distributions. We present a new self-consistent model of the distribution of the Enceladus-generated neutral tori that is validated against all available observations. We also discuss the implications for source rate and variability.

  9. Effects of sulfate on heavy metal release from iron corrosion scales in drinking water distribution system.

    Science.gov (United States)

    Sun, Huifang; Shi, Baoyou; Yang, Fan; Wang, Dongsheng

    2017-05-01

    Trace heavy metals accumulated in iron corrosion scales within a drinking water distribution system (DWDS) could potentially be released to bulk water and consequently deteriorate the tap water quality. The objective of this study was to identify and evaluate the release of trace heavy metals in DWDS under changing source water conditions. Experimental pipe loops with different iron corrosion scales were set up to simulate the actual DWDS. The effects of sulfate levels on heavy metal release were systemically investigated. Heavy metal releases of Mn, Ni, Cu, Pb, Cr and As could be rapidly triggered by sulfate addition but the releases slowly decreased over time. Heavy metal release was more severe in pipes transporting groundwater (GW) than in pipes transporting surface water (SW). There were strong positive correlations (R² > 0.8) between the releases of Fe and Mn, Fe and Ni, Fe and Cu, and Fe and Pb. When switching to higher sulfate water, iron corrosion scales in all pipe loops tended to be more stable (especially in pipes transporting GW), with a larger proportion of stable constituents (mainly Fe₃O₄) and fewer unstable compounds (β-FeOOH, γ-FeOOH, FeCO₃ and amorphous iron oxides). The main functional iron reducing bacteria (IRB) communities were favorable for the formation of Fe₃O₄. The transformation of corrosion scales and the growth of sulfate reducing bacteria (SRB) accounted for the gradually reduced heavy metal release with time. The higher metal release in pipes transporting GW could be due to increased Fe₆(OH)₁₂CO₃ content under higher sulfate concentrations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. [Research on controlling iron release of desalted water transmitted in existing water distribution system].

    Science.gov (United States)

    Tian, Yi-Mei; Liu, Yang; Zhao, Peng; Shan, Jin-Lin; Yang, Suo-Yin; Liu, Wei

    2012-04-01

    Desalted water, with strong corrosion characteristics, could lead to serious "red water" problems when transmitted and distributed in an existing municipal water distribution network. The main cause of the red water phenomenon is iron release in water pipes. To study methods of controlling iron release in existing drinking water distribution pipes, tubercle analysis was carried out on steel and cast iron pipes that had served the distribution system for 30-40 years; the main constituents were Fe₃O₄ and FeOOH. Immersion experiments were then carried out in the more corrosive pipes. By varying the mixing ratio of tap water and desalted water, pH, alkalinity, chloride, and sulfate, the influence of different water quality indexes on iron release was analyzed. Based on controlling iron content, water quality conditions were established for the safe distribution of desalted water: the volume ratio of tap water to desalted water should be at least 2, pH above 7.6, and alkalinity above 200 mg·L⁻¹.

  11. The distribution characteristics of pollutants released at different cross-sectional positions of a river

    International Nuclear Information System (INIS)

    Huang Heqing; Chen Guang; Zhang Qianfeng

    2010-01-01

    The distribution characteristics of heavier or lighter pollutants released at different cross-sectional positions of a wide river are investigated with a well-tested three-dimensional numerical model of gravity flows based on the Reynolds-averaged Navier-Stokes equations and the k-ε turbulence model. Focusing on the influences of flow and buoyancy on pollutants, it is found that while being carried downstream by the river flow: i) a heavier pollutant released from a cross-sectional side position forms a transverse oscillation between the two banks with decreasing amplitude, i.e., a kind of helical flow pattern along the straight part of the channel bed; ii) a heavier pollutant released from the cross-sectional middle position forms a collapse oscillation in the middle of the straight channel part with reduced amplitude; iii) in the downstream sinuous channel, a heavier pollutant shows higher concentration on the outer side of channel bends; iv) a lighter pollutant released from a cross-sectional side position slips partly to the other side of the river, resulting in higher concentrations on the two sides of the channel top; v) a lighter pollutant released from the cross-sectional middle position splits into two parts symmetrically along the two sides of the channel top; vi) in the downstream sinuous channel, a lighter pollutant shows higher concentration on the inner side of channel bends. These findings may assist in cost-effective scientific countermeasures for accidental or planned pollutant releases into a river.

  12. On the burnout in annular channels at non-uniform heat release distribution in length

    International Nuclear Information System (INIS)

    Ornatskij, A.P.; Chernobaj, V.A.; Vasil'ev, A.F.; Struts, G.V.

    1982-01-01

    The effect of axial heat release non-uniformity on burnout conditions in annular channels is investigated. The investigation was carried out in annular channels with different laws of heat flux density distribution along the channel length. The heat release non-uniformity coefficient was varied from 4.4 to 10, the pressure from 9.8 to 17.6 MPa, and the mass flux from 500 to 1700 kg/(m²·s); the temperature of the liquid (chemically desalted water) at the channel inlet was 30-300 °C. The experiments were performed at a test bench with a closed circulation circuit. The data obtained show that under non-uniform heat release the influence of the main operating parameters on the critical power is of the same character as under uniform heat release. The character of the wall temperature variation along the channel length before burnout is determined by the shape of the heat supply profile. The temperature maximum is observed in the region behind the cross section with maximum heat flux. It is concluded that the position of the cross section in which burnout arises is dominated by the shape of the heat flux density distribution along the length. Independently of this distribution shape, burnout develops when the vapour content near the wall reaches a limiting value.

  13. Essential iOS Build and Release A Comprehensive Guide to Building, Packaging, and Distribution

    CERN Document Server

    Roche, Ron

    2011-01-01

    Frustrated by the requirements for testing and distributing your iOS app? You're not alone. This concise book takes you step by step through the maze of certification and provisioning processes that have to happen before, during, and after development. You'll learn what's required to sign certificates, test your app on iOS devices, and release the finished product to the App Store. Whether you're a developer looking to spend more time coding and less time figuring out how to install your application, or a release engineer responsible for producing reliable builds, this guide will help you su

  14. Angular distribution measurement of fragment ions from a molecule using a new beamline consisting of a Grasshopper monochromator

    Science.gov (United States)

    Saito, Norio; Suzuki, Isao H.; Onuki, Hideo; Nishi, Morotake

    1989-07-01

Optical characteristics of a new beamline consisting of a premirror, a Grasshopper monochromator, and a refocusing mirror have been investigated. The intensity of the monochromatic soft x-ray was estimated to be about 10⁸ photons/(s·100 mA) at 500 eV with a storage-ring electron energy of 600 MeV and the minimum slit width. This slit width provides a resolution of about 500. Angular distributions of fragment ions from an inner-shell excited nitrogen molecule have been measured with a rotatable time-of-flight mass spectrometer by using this beamline.

  15. Angular distribution measurement of fragment ions from a molecule using a new beamline consisting of a Grasshopper monochromator

    International Nuclear Information System (INIS)

    Saito, N.; Suzuki, I.H.; Onuki, H.; Nishi, M.

    1989-01-01

Optical characteristics of a new beamline consisting of a premirror, a Grasshopper monochromator, and a refocusing mirror have been investigated. The intensity of the monochromatic soft x-ray was estimated to be about 10⁸ photons/(s·100 mA) at 500 eV with the storage electron energy of 600 MeV and the minimum slit width. This slit width provides a resolution of about 500. Angular distributions of fragment ions from an inner-shell excited nitrogen molecule have been measured with a rotatable time-of-flight mass spectrometer by using this beamline.

  16. Regional distribution of released earthquake energy in northern Egypt along with Inahass area

    International Nuclear Information System (INIS)

    El-hemamy, S.T.; Adel, A.A. Othman

    1999-01-01

A review of the seismic history of Egypt indicates some areas of high activity concentrated along Oligocene-Miocene faults. These areas support the idea of recent reactivation of the Oligocene-Miocene stress cycle. There are similarities in the spatial distribution of recent and historical epicenters. From the tectonic map of Egypt, the distributions of intensity and magnitude show strong activity along the Nile Delta, due to the presence of thick layers of recent alluvial sediments. The energy released by earthquakes has a direct effect on structures. The present study deals with the computed released energies of the reported earthquakes in Egypt and around the Inshas area, and their effect on the urban and nuclear facilities inside the Inshas site. Special consideration is given to old and new waste repository sites. The analysis of the determined released energies reveals that the Inshas site is affected by seismic activity from five seismo-tectonic source zones, namely the Red Sea, Nile Delta, El-Faiyum, Mediterranean Sea, and Gulf of Aqaba zones. The El-Faiyum seismo-tectonic source zone has the maximum effect on the site, with a released energy reaching 5.4×10²¹ erg.
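The released energies quoted above can be related to earthquake magnitude through the classical Gutenberg-Richter energy-magnitude relation, log₁₀ E = 11.8 + 1.5 Ms (E in erg). The sketch below uses this standard empirical formula for illustration only; the paper's own calibration may differ.

```python
def released_energy_erg(magnitude):
    """Classical Gutenberg-Richter energy-magnitude relation:
    log10(E) = 11.8 + 1.5 * Ms, with E in erg."""
    return 10.0 ** (11.8 + 1.5 * magnitude)

# A surface-wave magnitude ~6.3 event releases on the order of 10^21 erg,
# the same order as the maximum released energy (5.4e21 erg) quoted above
# for the El-Faiyum zone.
energy = released_energy_erg(6.3)
```

Each 1.0 increase in magnitude multiplies the released energy by 10^1.5 ≈ 32, which is why a single dominant source zone can control the site hazard.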

  17. Pharmacokinetics, brain distribution, release and blood-brain barrier transport of Shunaoxin pills.

    Science.gov (United States)

    Wu, Kai; Wang, Zhan-Zhang; Liu, Dan; Qi, Xian-Rong

    2014-02-12

Shunaoxin pills, a traditional Chinese medicine (TCM) product, have been used to treat cerebrovascular diseases in China since 2005. The main active components of Shunaoxin pills are ferulic acid and ligustilide from Chuanxiong (Ligusticum chuanxiong Hort, Umbelliferae) and Danggui (Angelica sinensis radix, Umbelliferae). As Shunaoxin shows excellent activity in the central nervous system (CNS), the extent to which the major constituents of Shunaoxin reach the CNS should be investigated. Moreover, the in vivo-in vitro correlations (IVIVC) of the formulation should be studied to elucidate the mechanisms of action of TCM in the CNS. However, these data have not previously been available. Thus, we intended to investigate the extent to which these constituents of Shunaoxin pills reach the CNS, and to evaluate the IVIVC of release and pharmacokinetics. In this study, we evaluated the release of ferulic acid and ligustilide from Shunaoxin pills, and their transport across an in vitro model of the BBB. We also evaluated their pharmacokinetics and brain distribution in vivo. High-performance liquid chromatography (HPLC) was used to quantify both compounds simultaneously. Based on the release in vitro and the absorption of ferulic acid and ligustilide in vivo, the IVIVC permitted prediction of the pharmacokinetics of these compounds. The release of ferulic acid and ligustilide reached a plateau phase within 1 h. Ferulic acid and ligustilide rapidly crossed the BBB in different patterns; the transport ratio increased over time. After intragastric (i.g.) administration of Shunaoxin pills, ferulic acid and ligustilide were rapidly absorbed and distributed into brain, which may result in a rapid onset of action. Ferulic acid and ligustilide were transported across a model BBB. After i.g. administration of Shunaoxin pills, ferulic acid and ligustilide were rapidly absorbed and distributed in brain; this may lead to rapid pharmacological onset. The IVIVC can be used to predict the in vivo pharmacokinetics of these compounds.
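An IVIVC of the kind described above is commonly assessed as a (near-)linear relationship between the fraction released in vitro and the fraction absorbed in vivo at matched time points (a Level-A correlation). The sketch below illustrates this with purely hypothetical fractions, not data from the study; the fit is plain least squares.

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a*x + b (pure Python)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical fractions (0..1) at matched time points: in vitro release of a
# compound vs. fraction absorbed in vivo (e.g. from deconvolution). A slope
# near 1 and an intercept near 0 indicate a good point-to-point correlation.
released = [0.10, 0.35, 0.60, 0.85, 0.95]
absorbed = [0.08, 0.30, 0.55, 0.80, 0.93]
slope, intercept = linear_fit(released, absorbed)
```

With such a correlation in hand, an in vitro dissolution profile can stand in for an in vivo absorption profile when predicting plasma pharmacokinetics.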

  18. Evidence of arsenic release promoted by disinfection by-products within drinking-water distribution systems.

    Science.gov (United States)

    Andra, Syam S; Makris, Konstantinos C; Botsaris, George; Charisiadis, Pantelis; Kalyvas, Harris; Costa, Costas N

    2014-02-15

Changes in disinfectant type could trigger a cascade of reactions releasing pipe-anchored metals/metalloids into finished water. However, the effect of pre-formed disinfection by-products on the release of sorbed contaminants (arsenic, As, in particular) from drinking-water distribution system pipe scales remains unexplored. A bench-scale study using a factorial experimental design was performed to evaluate the independent and interaction effects of trihalomethanes (TTHM) and haloacetic acids (HAA) on As release from either scales-only or scale-biofilm conglomerates (SBC), both anchored on asbestos/cement pipe coupons. A model biofilm (Pseudomonas aeruginosa) was allowed to grow on select pipe coupons prior to experimentation. Neither TTHM nor HAA dosed individually promoted As release into water from either the scales-only or the SBC coupons. In the case of the scales-only coupons, the combination of the highest spike levels of TTHM and HAA significantly (p < 0.05) promoted As release into water; whether this occurs in full-scale pipe networks remains to be investigated in the field. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. A statistical model for deriving probability distributions of contamination for accidental releases

    International Nuclear Information System (INIS)

    ApSimon, H.M.; Davison, A.C.

    1986-01-01

    Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)
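The statistical approach described, replacing scenario-by-scenario dispersal runs with probability distributions driven by windrose, decay and deposition parameters, can be caricatured by a small Monte Carlo sketch. All parameter values below (sector probability, transport speeds, decay and depletion constants) are hypothetical placeholders, not the MESOSTAT equations.

```python
import math
import random

random.seed(1)

# Hypothetical inputs: probability that the geostrophic wind carries the
# plume toward the receptor's sector, and representative transport speeds.
SECTOR_PROB = 0.18
SPEEDS = [4.0, 7.0, 10.0]          # m/s

def relative_concentration(distance_m, speed,
                           decay_const=1e-5, depletion_per_m=2e-6):
    """Toy depletion: radioactive decay during travel plus a crude
    exponential loss standing in for deposition (NOT the MESOSTAT model)."""
    travel_time = distance_m / speed
    return math.exp(-decay_const * travel_time) * math.exp(-depletion_per_m * distance_m)

def exceedance_probability(level, distance_m, n=20000):
    """Monte Carlo estimate of P(relative contamination > level) at a receptor."""
    hits = 0
    for _ in range(n):
        if random.random() > SECTOR_PROB:
            continue                        # wind blows elsewhere; no impact
        speed = random.choice(SPEEDS)
        if relative_concentration(distance_m, speed) > level:
            hits += 1
    return hits / n

p_exceed = exceedance_probability(0.05, 500_000)   # receptor ~500 km away
```

Tabulating `p_exceed` over many contamination levels yields exactly the kind of exceedance-probability curve the abstract describes, without enumerating individual release-and-dispersal scenarios.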

  20. Active and reactive power sharing and frequency restoration in a distributed power system consisting of two UPS units

    Energy Technology Data Exchange (ETDEWEB)

    Parlak, Koray Sener; Oezdemir, Mehmet [Dept. of Electrical and Electronic Engineering, Firat University, Elazig, 23119 (Turkey); Aydemir, M. Timur [Dept. of Electrical and Electronic Engineering, Gazi University, Maltepe-Ankara 06570 (Turkey)

    2009-06-15

A distributed power system consisting of two uninterruptible power supplies (UPS) is investigated in this paper. Parallel operation of the two sources increases the installed power rating of the system. One of the sources can supply the system even when the other is disconnected due to a fault, which is an important feature. The control algorithm ensures that the total load is shared between the supplies in accordance with their rated power levels, and that the frequency of the supplies is restored to the rated value after transitions. As the UPSs operate at an optimum power level, losses and faults due to overloading are prevented. The units operate safely without any means of communication between each other. The focus of the work is on the inverter stages of the UPSs. Simulations performed in the Matlab Simulink environment have been verified with experimental work via a DS1103 controller card. (author)
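Communication-free load sharing between parallel inverters of the kind described is conventionally achieved with P-f / Q-V droop characteristics: droop gains chosen inversely proportional to the power ratings yield rating-proportional sharing, and a secondary action restores the frequency. The sketch below shows the active-power side with hypothetical numbers; it is a generic droop illustration, not the controller of the cited paper.

```python
F_NOMINAL = 50.0   # Hz

def droop_gains(ratings, delta_f=0.5):
    """Droop slope m_i (Hz/W) spanning delta_f Hz over each unit's rating;
    gains inversely proportional to rating give rating-proportional sharing."""
    return [delta_f / s for s in ratings]

def shared_powers(total_load, ratings):
    """Steady state of f = F_NOMINAL - m_i * P_i at a common frequency:
    m_0 * P_0 = m_1 * P_1 together with P_0 + P_1 = total_load."""
    m = droop_gains(ratings)
    p0 = total_load * m[1] / (m[0] + m[1])
    return [p0, total_load - p0]

ratings = [10_000.0, 5_000.0]          # W: a 10 kVA unit and a 5 kVA unit
load = 9_000.0
p = shared_powers(load, ratings)       # shared 2:1, matching the ratings
f_steady = F_NOMINAL - droop_gains(ratings)[0] * p[0]
# Secondary restoration: both units shift their no-load frequency set point
# up by the steady-state error, returning to 50 Hz with the same power split.
f_restored = f_steady + (F_NOMINAL - f_steady)
```

Because each unit measures only its own output power and frequency, no communication link is needed, which is the fault-tolerance property the abstract emphasizes.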

  1. The application of thermoluminescence dosimeters to studies of released activity distributions

    International Nuclear Information System (INIS)

    Ruden, B.I.

    1969-06-01

In this report the theoretical conditions necessary for studying the behaviour of released activity with CaSO₄:Mn thermoluminescence dosimeters are considered. A method is derived for calculating exposure distributions from drifting volume activity. The correlation between exposure distributions and concentration distributions is discussed. One of thirty experiments, in which Br-82 was released into water through a nozzle some metres above the bottom, is described. The resulting exposure distribution was measured in a vertical plane at distances of 10, 50 and 200 metres by CaSO₄:Mn thermoluminescence dosimeters. The measured exposures are described and discussed, and the advantages and disadvantages of the technique are compared with other methods. The method of using exposure measurements for the study of activity released into water has given satisfactory results in practice. The measurements have been made at concentration levels considerably below those permissible for drinking water according to the recommendations of the ICRP. A special advantage of this method is that measurements can be made simultaneously at a large number of places and that integration is possible over sufficiently long periods of time. An experiment is described in which Ar-41 was released in free air at a height of one metre above ground and the resulting exposure distribution was measured in a vertical plane at 100 and 250 metres distance by CaSO₄:Mn thermoluminescence dosimeters. Shielding problems in connection with the experiments have been small, since the method permits the measurement of very small doses. An account is given of the possibility of using the beta-emitting isotope Kr-85 instead of the gamma-emitting Ar-41 for diffusion experiments in air. The results obtained from some experiments are presented and discussed. At the same concentration and exposure time, the thermoluminescent signal from the dosimeters is 2.5 times greater for Kr-85 than for Ar-41.

  2. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus

    2017-01-01

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.
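The central data structure described above, a per-pixel histogram of fragment normals that is shaded by integrating the lighting model over the histogram, can be sketched on the CPU in a few lines. This toy uses Lambert shading and a coarse hemisphere binning; the actual S-NDF/S-MIP pipeline (GPU caching, mipmapping, filtering) is far more involved.

```python
import math

N_THETA, N_PHI = 8, 16   # coarse hemisphere binning for the toy S-NDF

def bin_index(normal):
    """Map a unit normal to a (theta, phi) histogram bin over the hemisphere."""
    x, y, z = normal
    theta = math.acos(max(-1.0, min(1.0, z)))
    phi = math.atan2(y, x) % (2.0 * math.pi)
    ti = min(int(theta / (math.pi / 2.0) * N_THETA), N_THETA - 1)
    pi_ = min(int(phi / (2.0 * math.pi) * N_PHI), N_PHI - 1)
    return ti * N_PHI + pi_

def bin_center(i):
    """Representative unit normal at the center of histogram bin i."""
    ti, pi_ = divmod(i, N_PHI)
    theta = (ti + 0.5) * (math.pi / 2.0) / N_THETA
    phi = (pi_ + 0.5) * 2.0 * math.pi / N_PHI
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def accumulate_sndf(normals):
    """The per-pixel S-NDF: a normalized histogram of fragment normals."""
    hist = [0.0] * (N_THETA * N_PHI)
    for n in normals:
        hist[bin_index(n)] += 1.0
    total = sum(hist)
    return [h / total for h in hist]

def shade(sndf, light_dir):
    """Lambert shading integrated over the normal histogram: the pixel can
    be re-lit for any light direction without touching the particles again."""
    lx, ly, lz = light_dir
    out = 0.0
    for i, w in enumerate(sndf):
        if w:
            nx, ny, nz = bin_center(i)
            out += w * max(0.0, nx * lx + ny * ly + nz * lz)
    return out

# 100 fragments covering one pixel: mostly upward-facing, a few side-facing.
normals = [(0.0, 0.0, 1.0)] * 90 + [(1.0, 0.0, 0.0)] * 10
sndf = accumulate_sndf(normals)
top_lit = shade(sndf, (0.0, 0.0, 1.0))
grazing = shade(sndf, (1.0, 0.0, 0.0))
```

Once the histogram exists, changing the light only requires re-evaluating `shade`, which is the re-lighting-without-re-rendering property the abstract highlights.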

  3. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  4. Empirical phylogenies and species abundance distributions are consistent with pre-equilibrium dynamics of neutral community models with gene flow

    KAUST Repository

    Bonnet-Lebrun, Anne-Sophie

    2017-03-17

    Community characteristics reflect past ecological and evolutionary dynamics. Here, we investigate whether it is possible to obtain realistically shaped modelled communities - i.e., with phylogenetic trees and species abundance distributions shaped similarly to typical empirical bird and mammal communities - from neutral community models. To test the effect of gene flow, we contrasted two spatially explicit individual-based neutral models: one with protracted speciation, delayed by gene flow, and one with point mutation speciation, unaffected by gene flow. The former produced more realistic communities (shape of phylogenetic tree and species-abundance distribution), consistent with gene flow being a key process in macro-evolutionary dynamics. Earlier models struggled to capture the empirically observed branching tempo in phylogenetic trees, as measured by the gamma statistic. We show that the low gamma values typical of empirical trees can be obtained in models with protracted speciation, in pre-equilibrium communities developing from an initially abundant and widespread species. This was even more so in communities sampled incompletely, particularly if the unknown species are the youngest. Overall, our results demonstrate that the characteristics of empirical communities that we have studied can, to a large extent, be explained through a purely neutral model under pre-equilibrium conditions. This article is protected by copyright. All rights reserved.
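The simplest variant referenced above, zero-sum neutral dynamics with point-mutation speciation, can be simulated in a few lines. This is the classic non-spatial Hubbell-style toy model (community size, speciation rate and step count below are arbitrary), not the spatially explicit protracted-speciation model of the paper.

```python
import random
from collections import Counter

random.seed(7)

def neutral_community(community_size=500, speciation_rate=0.01, steps=200_000):
    """Zero-sum neutral dynamics with point-mutation speciation: at each
    step a random individual dies and is replaced either by the offspring
    of a random individual (prob 1 - nu) or by a brand-new species
    (prob nu). The community starts from one abundant, widespread species,
    i.e. the pre-equilibrium initial condition discussed in the abstract."""
    community = [0] * community_size
    next_species = 1
    for _ in range(steps):
        dead = random.randrange(community_size)
        if random.random() < speciation_rate:
            community[dead] = next_species   # point-mutation speciation
            next_species += 1
        else:
            community[dead] = community[random.randrange(community_size)]
    return community

# Ranked species-abundance distribution of the final community.
abundances = sorted(Counter(neutral_community()).values(), reverse=True)
```

Running such a model to different time horizons is what distinguishes pre-equilibrium from equilibrium communities; the paper's protracted-speciation variant additionally delays speciation while gene flow persists.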

  5. Empirical phylogenies and species abundance distributions are consistent with pre-equilibrium dynamics of neutral community models with gene flow

    KAUST Repository

    Bonnet-Lebrun, Anne-Sophie; Manica, Andrea; Eriksson, Anders; Rodrigues, Ana S.L.

    2017-01-01

    Community characteristics reflect past ecological and evolutionary dynamics. Here, we investigate whether it is possible to obtain realistically shaped modelled communities - i.e., with phylogenetic trees and species abundance distributions shaped similarly to typical empirical bird and mammal communities - from neutral community models. To test the effect of gene flow, we contrasted two spatially explicit individual-based neutral models: one with protracted speciation, delayed by gene flow, and one with point mutation speciation, unaffected by gene flow. The former produced more realistic communities (shape of phylogenetic tree and species-abundance distribution), consistent with gene flow being a key process in macro-evolutionary dynamics. Earlier models struggled to capture the empirically observed branching tempo in phylogenetic trees, as measured by the gamma statistic. We show that the low gamma values typical of empirical trees can be obtained in models with protracted speciation, in pre-equilibrium communities developing from an initially abundant and widespread species. This was even more so in communities sampled incompletely, particularly if the unknown species are the youngest. Overall, our results demonstrate that the characteristics of empirical communities that we have studied can, to a large extent, be explained through a purely neutral model under pre-equilibrium conditions. This article is protected by copyright. All rights reserved.

  6. MCNP(TM) Release 6.1.1 beta: Creating and Testing the Code Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Cox, Lawrence J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Casswell, Laura [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-06-12

This report documents the preparations for and testing of the production release of MCNP6™ 1.1 beta through RSICC at ORNL. It addresses tests on supported operating systems (Linux, MacOSX, Windows) with the supported compilers (Intel, Portland Group and gfortran). Verification and validation test results are documented elsewhere. This report does not address in detail the overall packaging of the distribution. Specifically, it does not address the nuclear and atomic data collection, the other included software packages (MCNP5, MCNPX and MCNP6) and the collection of reference documents.

  7. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    International Nuclear Information System (INIS)

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M and S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactor and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years, however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting into place formal licenses and contributor agreements for NEAMS software which doesn't already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate would be associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  8. Direct releases to the surface and associated complementary cumulative distribution functions in the 1996 performance assessment for the Waste Isolation Pilot Plant: direct brine release

    International Nuclear Information System (INIS)

    Stoelzel, D.M.; O'Brien, D.G.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; Smith, L.N.

    2000-01-01

    The following topics related to the treatment of direct brine releases to the surface environment in the 1996 performance assessment for the Waste Isolation Pilot Plant (WIPP) are presented: (i) mathematical description of models; (ii) uncertainty and sensitivity analysis results arising from subjective (i.e. epistemic) uncertainty for individual releases; (iii) construction of complementary cumulative distribution functions (CCDFs) arising from stochastic (i.e. aleatory) uncertainty; and (iv) uncertainty and sensitivity analysis results for CCDFs. The presented analyses indicate that direct brine releases do not constitute a serious threat to the effectiveness of the WIPP as a disposal facility for transuranic waste. Even when the effects of uncertain analysis inputs are taken into account, the CCDFs for direct brine releases fall substantially to the left of the boundary line specified in the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, 40 CFR 194)
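An empirical CCDF of the kind used in the WIPP analyses is simply P(R > r) estimated over the sampled futures. A minimal sketch with hypothetical normalized release samples:

```python
def ccdf(samples):
    """Empirical complementary CDF: (value, P(R > value)) for each sample,
    with P estimated as the fraction of samples strictly greater."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, sum(1 for s in xs if s > x) / n) for x in xs]

# Hypothetical normalized brine-release values sampled over stochastic
# futures (NOT actual WIPP results). The regulatory comparison checks that
# this stepwise-decreasing curve lies to the left of (below) the boundary
# line specified in 40 CFR 191 at each probability level.
releases = [0.0, 0.0, 0.001, 0.004, 0.01, 0.02, 0.05, 0.05, 0.1, 0.3]
curve = ccdf(releases)
```

In the full analysis one such curve is produced per sampled set of epistemically uncertain inputs, giving a family of CCDFs whose spread expresses the subjective uncertainty.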

  9. Resurgence of Minimal Stimulation In Vitro Fertilization with A Protocol Consisting of Gonadotropin Releasing Hormone-Agonist Trigger and Vitrified-Thawed Embryo Transfer

    Directory of Open Access Journals (Sweden)

    Zhang John

    2016-07-01

Minimal stimulation in vitro fertilization (mini-IVF) consists of a gentle controlled ovarian stimulation that aims to produce a maximum of five to six oocytes. There is a misbelief that mini-IVF severely compromises pregnancy and live birth rates. An appraisal of the literature pertaining to studies on mini-IVF protocols was performed. The advantages of minimal stimulation protocols are reported here with a focus on the use of clomiphene citrate (CC), gonadotropin releasing hormone (GnRH) agonist trigger for oocyte maturation, and the freeze-all embryo strategy. The literature review and the author's own center data suggest that minimal ovarian stimulation protocols with GnRH agonist trigger and a freeze-all embryo strategy, along with single embryo transfer, produce reasonable clinical pregnancy and live birth rates in both good and poor responders. Additionally, mini-IVF offers numerous advantages, such as: i. reduction in cost and stress, with fewer office visits, needle sticks, and ultrasounds; and ii. reduction in the incidence of ovarian hyperstimulation syndrome (OHSS). Mini-IVF is re-emerging as a solution for some of the problems associated with conventional IVF, such as OHSS, cost, and patient discomfort.

  10. The configuration of the auroral distribution for interplanetary magnetic field Bz northward. 2. Ionospheric convection consistent with Viking observations

    International Nuclear Information System (INIS)

    Jankowska, K.; Elphinstone, R.D.; Murphree, J.S.; Cogger, L.L.; Hearn, D.; Marklund, G.

    1990-01-01

Views of the northern hemisphere auroral distribution obtained by the Viking satellite present a qualitative means of inferring the convective patterns which occur during interplanetary magnetic field (IMF) Bz northward. The approach is taken whereby upward field-aligned currents are assumed to be coincident with large-scale discrete auroral features, and on this basis possible convective patterns are deduced. While the patterns are not unique solutions, they are found to be consistent with merging theory predictions. That is, for Bz northward the auroral observations support the possibility of three- and/or four-cell patterns. When the IMF azimuthal angle is 90 degrees (270 degrees), a clockwise (anticlockwise) cell is found to be located in the polar region between the two standard viscous cells. When IMF Bx dominates and is in a toward orientation, convection stagnates, whereas if Bx is negative, a four-cell pattern may form with sunward flow at very high latitudes. The concept of using global auroral images as an additional tool when developing convection models could prove necessary in order to extend beyond the few isolated measurements taken in situ by satellites.

  11. Consistent empirical physical formula construction for recoil energy distribution in HPGe detectors by using artificial neural networks

    International Nuclear Information System (INIS)

    Akkoyun, Serkan; Yildiz, Nihat

    2012-01-01

The gamma-ray tracking technique is a highly efficient detection method in experimental nuclear structure physics. On the basis of this method, two gamma-ray tracking arrays, AGATA in Europe and GRETA in the USA, are currently being tested. The interactions of neutrons in these detectors lead to an unwanted background in the gamma-ray spectra. Thus, the interaction points of neutrons in these detectors have to be determined in the gamma-ray tracking process in order to improve the photo-peak efficiencies and peak-to-total ratios of the gamma-ray peaks. In this paper, the recoil energy distributions of germanium nuclei due to inelastic scattering of 1–5 MeV neutrons were first obtained by simulation experiments. Secondly, as a novel approach, consistent empirical physical formulas (EPFs) were constructed for these highly nonlinear detector responses of recoiling germanium nuclei by appropriate layered feedforward neural networks (LFNNs). The LFNN-EPFs are of explicit mathematical functional form and can therefore be used to derive further physical functions which could be potentially relevant for the determination of neutron interactions in the gamma-ray tracking process.
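The EPF idea, fitting a closed-form layered feedforward network to a highly nonlinear simulated response, can be illustrated with a tiny network trained by batch gradient descent on a toy target (here y = x², standing in for the germanium recoil response; the architecture, learning rate and data are illustrative only).

```python
import math
import random

random.seed(0)

H, LR, EPOCHS = 8, 0.02, 3000            # illustrative hyperparameters

# One input, H tanh hidden units, one linear output.
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

xs = [i / 10 - 1.0 for i in range(21)]   # 21 points in [-1, 1]
ys = [x * x for x in xs]                 # toy nonlinear "detector response"

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

initial_error = mse()
for _ in range(EPOCHS):                  # plain batch gradient descent
    g_w1, g_b1 = [0.0] * H, [0.0] * H
    g_w2, g_b2 = [0.0] * H, 0.0
    for x, y in zip(xs, ys):
        h, out = forward(x)
        d = 2.0 * (out - y) / len(xs)    # d(MSE)/d(out)
        g_b2 += d
        for j in range(H):
            g_w2[j] += d * h[j]
            dh = d * w2[j] * (1.0 - h[j] ** 2)   # back through tanh
            g_w1[j] += dh * x
            g_b1[j] += dh
    for j in range(H):
        w1[j] -= LR * g_w1[j]
        b1[j] -= LR * g_b1[j]
        w2[j] -= LR * g_w2[j]
    b2 -= LR * g_b2

final_error = mse()
```

The fitted weights define an explicit mathematical function, the sense in which an LFNN-EPF is "closed-form" and can be differentiated or composed to derive further physical quantities.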

  12. Towards a multiscale description of microvascular flow regulation: O2-dependent release of ATP from human erythrocytes and the distribution of ATP in capillary networks

    Directory of Open Access Journals (Sweden)

    Daniel eGoldman

    2012-07-01

Integration of the numerous mechanisms that have been suggested to contribute to optimization of O2 supply to meet O2 need in skeletal muscle requires a systems biology approach which permits quantification of these physiological processes over a wide range of length scales. Here we describe two individual computational models based on in vivo and in vitro studies which, when incorporated into a single robust multiscale model, will provide information on the role of erythrocyte-released ATP in perfusion distribution in skeletal muscle under both physiological and pathophysiological conditions. Healthy human erythrocytes exposed to low O2 tension release ATP via a well characterized signaling pathway requiring activation of the G-protein Gi and adenylyl cyclase, leading to increases in cAMP. This cAMP then activates PKA and subsequently CFTR, culminating in ATP release via pannexin 1. A critical control point in this pathway is the level of cAMP, which is regulated by pathway-specific phosphodiesterases. Using time constants (~100 ms) that are consistent with measured erythrocyte ATP release, we have constructed a dynamic model of this pathway. The model predicts levels of ATP release consistent with measurements obtained over a wide range of hemoglobin O2 saturations (sO2). The model further predicts how insulin, at concentrations found in prediabetes, enhances the activity of PDE3 and reduces intracellular cAMP levels, leading to decreased low-O2-induced ATP release from erythrocytes. The second model, which couples O2 and ATP transport in capillary networks, shows how intravascular ATP and the resulting conducted vasodilation are affected by local sO2, convection and ATP degradation. This model also predicts network-level effects of decreased ATP release resulting from elevated insulin levels. Taken together, these models lay the groundwork for investigating the systems biology of the regulation of microvascular perfusion distribution.
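The control-point role of cAMP described above can be caricatured by a one-compartment ODE in which synthesis rises as O2 saturation falls and phosphodiesterase activity degrades cAMP; raising the PDE rate (the modeled insulin effect) lowers the steady state. All rate constants below are illustrative placeholders, not the fitted values of the published model.

```python
def camp_steady_state(s_o2, k_syn=10.0, k_pde=10.0, dt=0.001, t_end=2.0):
    """Euler integration of a one-compartment toy model:
        d[cAMP]/dt = k_syn * (1 - sO2) - k_pde * [cAMP]
    Synthesis rises as hemoglobin O2 saturation (sO2) falls; pathway-specific
    phosphodiesterase activity degrades cAMP. Rate constants are in 1/s,
    chosen to give ~100 ms time constants as in the abstract."""
    camp = 0.0
    for _ in range(int(t_end / dt)):
        camp += dt * (k_syn * (1.0 - s_o2) - k_pde * camp)
    return camp

low_o2  = camp_steady_state(s_o2=0.30)               # hypoxic: high cAMP
high_o2 = camp_steady_state(s_o2=0.95)               # oxygenated: low cAMP
insulin = camp_steady_state(s_o2=0.30, k_pde=25.0)   # insulin-boosted PDE3
```

Since downstream ATP release tracks cAMP, the insulin-boosted PDE case reproduces qualitatively the predicted blunting of low-O2-induced ATP release in prediabetes.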

  13. Distributed emergency response system to model dispersion and deposition of atmospheric releases

    International Nuclear Information System (INIS)

    Taylor, S.S.

    1985-04-01

    Aging hardware and software and increasing commitments by the Departments of Energy and Defense have led us to develop a new, expanded system to replace the existing Atmospheric Release Advisory Capability (ARAC) system. This distributed, computer-based, emergency response system is used by state and federal agencies to assess the environmental health hazards resulting from an accidental release of radioactive material into the atmosphere. Like its predecessor, the expanded system uses local meteorology (e.g., wind speed and wind direction), as well as terrain information, to simulate the transport and dispersion of the airborne material. The system also calculates deposition and dose and displays them graphically over base maps of the local geography for use by on-site authorities. This paper discusses the limitations of the existing ARAC system. It also discusses the components and functionality of the new system, the technical difficulties encountered and resolved in its design and implementation, and the software methodologies and tools employed in its development

  14. Noninvasive visualization of in vivo release and intratumoral distribution of surrogate MR contrast agent using the dual MR contrast technique.

    Science.gov (United States)

    Onuki, Yoshinori; Jacobs, Igor; Artemov, Dmitri; Kato, Yoshinori

    2010-09-01

    A direct evaluation of the in vivo release profile of drugs from carriers is a clinical demand in drug delivery systems, because drug release characterized in vitro correlates poorly with in vivo release. The purpose of this study is to demonstrate the in vivo applicability of the dual MR contrast technique as a useful tool for noninvasive monitoring of the stability and the release profile of drug carriers, by visualizing in vivo release of the encapsulated surrogate MR contrast agent from carriers and its subsequent intratumoral distribution profile. The important aspect of this technique is that it incorporates both positive and negative contrast agents within a single carrier. GdDTPA, superparamagnetic iron oxide nanoparticles, and 5-fluorouracil were encapsulated in nano- and microspheres composed of poly(D,L-lactide-co-glycolide), which was used as a model carrier. In vivo studies were performed with orthotopic xenograft of human breast cancer. The MR-based technique demonstrated here has enabled visualization of the delivery of carriers, and release and intratumoral distribution of the encapsulated positive contrast agent. This study demonstrated proof-of-principle results for the noninvasive monitoring of in vivo release and distribution profiles of MR contrast agents, and thus, this technique will make a great contribution to the field. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  15. Boltzmann-distribution-equivalent for Lévy noise and how it leads to thermodynamically consistent epicatalysis

    Science.gov (United States)

    Bier, Martin

    2018-02-01

    Nonequilibrium systems commonly exhibit Lévy noise. This means that the distribution for the size of the Brownian fluctuations has a "fat" power-law tail. Large Brownian kicks are then more common as compared to the ordinary Gaussian distribution. We consider a two-state system, i.e., two wells and a barrier in between. The barrier is sufficiently high for a barrier crossing to be a rare event. When the noise is Lévy, we do not get a Boltzmann distribution between the two wells. Instead we get a situation where the distribution between the two wells also depends on the height of the barrier that is in between. Ordinarily, a catalyst, by lowering the barrier between two states, speeds up the relaxation to an equilibrium, but does not change the equilibrium distribution. In an environment with Lévy noise, on the other hand, we have the possibility of epicatalysis, i.e., a catalyst effectively altering the distribution between two states through the changing of the barrier height. After deriving formulas to quantitatively describe this effect, we discuss how this idea may apply in nuclear reactors and in the biochemistry of a living cell.
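The fat-tailed character of Lévy noise described above can be made concrete numerically. The sketch below is not from the paper; it uses the standard Chambers-Mallows-Stuck sampler for symmetric α-stable variates to show that large kicks (here, |x| > 5) are orders of magnitude more frequent under Lévy noise with α = 1.5 than under Gaussian noise:

```python
import numpy as np

def symmetric_stable(alpha, size, rng):
    """Standard symmetric alpha-stable samples
    (Chambers-Mallows-Stuck method, skewness beta = 0)."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
n = 200_000
levy = symmetric_stable(1.5, n, rng)   # power-law tail ~ |x|^-1.5
gauss = rng.normal(0.0, 1.0, n)        # Gaussian tail decays much faster

# Fraction of "large kicks" beyond 5 noise units under each distribution.
p_levy = np.mean(np.abs(levy) > 5)
p_gauss = np.mean(np.abs(gauss) > 5)
print(p_levy, p_gauss)
```

For these parameters the Lévy tail fraction is a few percent while the Gaussian one is essentially zero, which is the asymmetry that makes barrier height matter for the steady-state distribution between the wells.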

  16. Difference in metallic wear distribution released from commercially pure titanium compared with stainless steel plates.

    Science.gov (United States)

    Krischak, G D; Gebhard, F; Mohr, W; Krivan, V; Ignatius, A; Beck, A; Wachter, N J; Reuter, P; Arand, M; Kinzl, L; Claes, L E

    2004-03-01

    Stainless steel and commercially pure titanium are widely used materials in orthopedic implants. However, it remains controversial whether there are significant differences in tissue reaction and metallic release that should result in a recommendation for preferred use in clinical practice. A comparative study was performed using 14 stainless steel and 8 commercially pure titanium plates retrieved after a 12-month implantation period. To avoid contamination of the tissue with the elements under investigation, surgical instruments made of zirconium dioxide were used. The tissue samples were analyzed histologically and by inductively coupled plasma atomic emission spectrometry (ICP-AES) for accumulation of the metals Fe, Cr, Mo, Ni, and Ti in the local tissues. Implant corrosion was determined by scanning electron microscopy (SEM). With grades 2 or higher in 9 implants, steel plates revealed a greater extent of corrosion under SEM than titanium plates, of which only one showed corrosion grade 2. Uptake of all measured ions (Fe, Cr, Mo, Ni) was significantly increased after stainless steel implantation, whereas titanium revealed high concentrations only of Ti. Histological examination found a different distribution of the accumulated metals for the two implant materials: whereas specimens after steel implantation revealed diffuse siderosis of connective tissue cells, those after titanium occasionally exhibited focal siderosis due to implantation-associated bleeding. Neither titanium- nor stainless steel-loaded tissues revealed any signs of foreign-body reaction. We conclude from the increased release of toxic, allergic, and potentially carcinogenic ions adjacent to stainless steel that commercially pure Ti should be treated as the preferred material for osteosyntheses if removal of the implant is not intended. 
However, neither material provoked a foreign-body reaction in the local tissues, thus cpTi cannot be

  17. Silver distribution and release from an antimicrobial denture base resin containing silver colloidal nanoparticles.

    Science.gov (United States)

    Monteiro, Douglas Roberto; Gorup, Luiz Fernando; Takamiya, Aline Satie; de Camargo, Emerson Rodrigues; Filho, Adhemar Colla Ruvolo; Barbosa, Debora Barros

    2012-01-01

    The aim of this study was to evaluate a denture base resin containing colloidal silver nanoparticles, through morphological analysis to check the distribution and dispersion of these particles in the polymer and by testing silver release in deionized water at different time periods. A Lucitone 550 denture resin was used, and silver nanoparticles were synthesized by reduction of silver nitrate with sodium citrate. The acrylic resin was prepared in accordance with the manufacturer's instructions, and the silver nanoparticle suspension was added to the acrylic resin monomer in different concentrations (0.05, 0.5, and 5 vol% silver colloid). Controls devoid of silver nanoparticles were included. The specimens were stored in deionized water at 37°C for 7, 15, 30, 60, and 120 days, and each solution was analyzed using atomic absorption spectroscopy. Silver was not detected in the deionized water, regardless of the amount of silver nanoparticles added to the resin and of the storage period. Micrographs showed that at lower concentrations the distribution of silver nanoparticles was reduced, whereas their dispersion in the polymer was improved. Moreover, after 120 days of storage, nanoparticles were mainly located on the surface of the nanocomposite specimens. Incorporation of silver nanoparticles into the acrylic resin was thus evidenced, yet silver release remained below the detection limit of the atomic absorption spectrophotometer used in this study, even after 120 days of storage in deionized water. Incorporating silver nanoparticles in PMMA denture resin may yield an effective antimicrobial material to help control common infections involving oral mucosal tissues in complete denture wearers. © 2011 by the American College of Prosthodontists.

  18. Validation of temporal and spatial consistency of facility- and speed-specific vehicle-specific power distributions for emission estimation: A case study in Beijing, China.

    Science.gov (United States)

    Zhai, Zhiqiang; Song, Guohua; Lu, Hongyu; He, Weinan; Yu, Lei

    2017-09-01

    Vehicle-specific power (VSP) has been found to be highly correlated with vehicle emissions. It is used in many studies on emission modeling, such as the MOVES (Motor Vehicle Emissions Simulator) model. Existing studies develop specific VSP distributions (or OpMode distributions in MOVES) for different road types and various average speeds to represent vehicle operating modes on the road. However, it is still not clear whether these facility- and speed-specific VSP distributions are consistent temporally and spatially. For instance, is it necessary to update the database of VSP distributions in the emission model periodically? Are VSP distributions developed in the city central business district (CBD) area applicable to its suburbs? In this context, this study examined the temporal and spatial consistency of the facility- and speed-specific VSP distributions in Beijing. VSP distributions in different years and in different areas were developed, based on real-world vehicle activity data. The root mean square error (RMSE) is employed to quantify the difference between VSP distributions. The maximum differences of the VSP distributions between different years and between different areas are approximately 20% of those between different road types. Analysis of the carbon dioxide (CO2) emission factor indicates that the temporal and spatial differences of the VSP distributions have no significant impact on vehicle emission estimation, with a relative error of less than 3%. The temporal and spatial differences therefore have no significant impact on the development of the facility- and speed-specific VSP distributions for vehicle emission estimation: the database of specific VSP distributions in VSP-based emission models can be maintained over time, it is unnecessary to update it regularly, and historical vehicle activity data can reliably be used to forecast future emissions. 
In one city, the areas with less data can still
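As a concrete illustration of the RMSE metric used above to compare two binned VSP (operating-mode) distributions, here is a minimal sketch; the bin fractions are invented for illustration, not Beijing data:

```python
import math

# Hypothetical VSP-bin fractions (each distribution sums to 1.0), e.g.
# the same road type and speed bin observed in two different years.
dist_a = [0.05, 0.15, 0.30, 0.25, 0.15, 0.10]
dist_b = [0.06, 0.14, 0.28, 0.26, 0.16, 0.10]

def rmse(p, q):
    """Root mean square error between two binned distributions."""
    assert len(p) == len(q)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)) / len(p))

print(round(rmse(dist_a, dist_b), 4))  # → 0.0115
```

A small RMSE relative to the RMSE between different road types is what the study takes as evidence that one set of distributions can be reused across years and areas.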

  19. Maternal-fetal distribution of mercury (203Hg) released from dental amalgam fillings

    Energy Technology Data Exchange (ETDEWEB)

    Vimy, M.J.; Takahashi, Y.; Lorscheider, F.L. (Univ. of Calgary, Alberta (Canada))

    1990-04-01

    In humans, the continuous release of Hg vapor from dental amalgam tooth restorations is markedly increased for prolonged periods after chewing. The present study establishes a time-course distribution for amalgam Hg in body tissues of adult and fetal sheep. Under general anesthesia, five pregnant ewes had twelve occlusal amalgam fillings containing radioactive 203Hg placed in teeth at 112 days gestation. Blood, amniotic fluid, feces, and urine specimens were collected at 1- to 3-day intervals for 16 days. From days 16-140 after amalgam placement (16-41 days for fetal lambs), tissue specimens were analyzed for radioactivity, and total Hg concentrations were calculated. Results demonstrate that Hg from dental amalgam will appear in maternal and fetal blood and amniotic fluid within 2 days after placement of amalgam tooth restorations. Excretion of some of this Hg will also commence within 2 days. All tissues examined displayed Hg accumulation. Highest concentrations of Hg from amalgam in the adult occurred in kidney and liver, whereas in the fetus the highest amalgam Hg concentrations appeared in liver and pituitary gland. The placenta progressively concentrated Hg as gestation advanced to term, and milk concentration of amalgam Hg postpartum provides a potential source of Hg exposure to the newborn. It is concluded that accumulation of amalgam Hg progresses in maternal and fetal tissues to a steady state with advancing gestation and is maintained. Dental amalgam usage as a tooth restorative material in pregnant women and children should be reconsidered.

  20. Maternal-fetal distribution of mercury (203Hg) released from dental amalgam fillings

    International Nuclear Information System (INIS)

    Vimy, M.J.; Takahashi, Y.; Lorscheider, F.L.

    1990-01-01

    In humans, the continuous release of Hg vapor from dental amalgam tooth restorations is markedly increased for prolonged periods after chewing. The present study establishes a time-course distribution for amalgam Hg in body tissues of adult and fetal sheep. Under general anesthesia, five pregnant ewes had twelve occlusal amalgam fillings containing radioactive 203Hg placed in teeth at 112 days gestation. Blood, amniotic fluid, feces, and urine specimens were collected at 1- to 3-day intervals for 16 days. From days 16-140 after amalgam placement (16-41 days for fetal lambs), tissue specimens were analyzed for radioactivity, and total Hg concentrations were calculated. Results demonstrate that Hg from dental amalgam will appear in maternal and fetal blood and amniotic fluid within 2 days after placement of amalgam tooth restorations. Excretion of some of this Hg will also commence within 2 days. All tissues examined displayed Hg accumulation. Highest concentrations of Hg from amalgam in the adult occurred in kidney and liver, whereas in the fetus the highest amalgam Hg concentrations appeared in liver and pituitary gland. The placenta progressively concentrated Hg as gestation advanced to term, and milk concentration of amalgam Hg postpartum provides a potential source of Hg exposure to the newborn. It is concluded that accumulation of amalgam Hg progresses in maternal and fetal tissues to a steady state with advancing gestation and is maintained. Dental amalgam usage as a tooth restorative material in pregnant women and children should be reconsidered

  1. Accidental hazardous material releases with human impacts in the United States: exploration of geographical distribution and temporal trends.

    Science.gov (United States)

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Chermak, Christina

    2010-09-01

    To investigate the circumstances and geographic and temporal distributions of hazardous material releases and resulting human impacts in the United States. Releases with fatalities, injuries, and evacuations were identified from reports to the National Response Center between 1990 and 2008, correcting for data quality issues identified in previous studies. From more than 550,000 reports, 861 deaths, 16,348 injuries and 741,427 evacuations were identified. Injuries from releases of chemicals at fixed facilities and natural gas from pipelines have decreased whereas evacuations from petroleum releases at fixed facilities have increased. Results confirm recent advances in chemical and pipeline safety and suggest directions for further improvement including targeted training and inspections and adoption of inherently safer design principles.

  2. Angular distributions of evaporated particles, fission and intermediate-mass fragments: on the search for consistent models

    International Nuclear Information System (INIS)

    Alexander, J.M.

    1987-01-01

    During the last two years there has been a true cacophony concerning the meaning of experimental angular distributions for fission and fission-like fragments. The heavily used saddle-point transition-state model has been shown to be of limited value for high-spin systems, and a wide variety of proposals has appeared, often with mutual inconsistencies and conflicting views. Even though equilibrium statistical models for fragment emission and particle evaporation must have a very close kinship, this relationship, often left murky, has now come onto center stage for understanding reactions at ≥ 100 MeV. Basic questions concern the nature of the decision-point configurations, their degrees of freedom, the role of deformation and the relevant moments of inertia. This paper points out serious inconsistencies in several recent scission-point models and discusses conditions for applicability of saddle-point and scission-point approaches

  3. Interaction between seed dormancy-release mechanism, environment and seed bank strategy for a widely distributed perennial legume, Parkinsonia aculeata (Caesalpinaceae).

    Science.gov (United States)

    Van Klinken, Rieks D; Lukitsch, Bert; Cook, Carly

    2008-08-01

    Parkinsonia aculeata (Caesalpinaceae) is a perennial legume with seeds that have hard-seeded (physical) dormancy and are potentially very long-lived. Seed dormancy is a characteristic that can both help maximize the probability of seedling establishment and spread the risk of recruitment failure across years (bet-hedging). In this study, dormancy-release patterns are described across the diverse environments in which this species occurs in order to test whether wet heat (incubation under wet, warm-to-hot, conditions) alone can explain those patterns, and in order to determine the likely ecological role of physical dormancy across this species distribution. A seed burial trial was conducted across the full environmental distribution of P. aculeata in Australia (arid to wet-dry tropics, uplands to wetlands, soil surface to 10 cm deep). Wet heat explained the pattern of dormancy release across all environments. Most seeds stored in the laboratory remained dormant throughout the trial (at least 84 %). Dormancy release was quickest for seeds buried during the wet season at relatively high rainfall, upland sites (only 3 % of seeds remained dormant after 35 d). The longest-lived seeds were in wetlands (9 % remained dormant after almost 4 years) and on the soil surface (57 % after 2 years). There was no consistent correlation between increased aridity and rate of dormancy release. The results suggest that physical dormancy in P. aculeata is a mechanism for maximizing seedling establishment rather than a bet-hedging strategy. However, seed persistence can occur in environmental refuges where dormancy-release cues are weak and conditions for germination and establishment are poor (e.g. under dense vegetation or in more arid micro-environments) or unsuitable (e.g. when seeds are inundated or on the soil surface). Risks of recruitment failure in suboptimal environments could therefore be reduced by inter-year fluctuations in microclimate or seed movement.

  4. A comparison of alternative methods of calculating complementary cumulative distribution functions of health effects following an atmospheric radioactive release

    International Nuclear Information System (INIS)

    Ponting, A.C.; Nair, S.

    1984-04-01

    A concept extensively used in studying the consequences of accidental atmospheric radioactive releases is that of the complementary cumulative distribution function, CCDF. Various methods of calculating CCDFs have been developed, with particular applications in putting degraded-core accidents in perspective and in identifying release sequences leading to high risks. This note compares three methods with specific reference to their accuracy and computational efficiency. For two of the methods (that used in the US Reactor Safety Study code CRAC2 and an extended version of that method), the effects of varying the sector width and considering site-specific population distributions have been determined. For the third method it is only necessary to consider the effects of site-specific population distributions. (author)
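The CCDF itself is simple to compute once consequence values are in hand: for each level x it gives the probability that the consequence equals or exceeds x. A minimal sketch with invented values (the methods compared in the note differ in how sector widths and population data feed into these values, not in this final step):

```python
def empirical_ccdf(samples):
    """Return (x, P(X >= x)) pairs for the sorted unique sample values."""
    n = len(samples)
    xs = sorted(set(samples))
    return [(x, sum(1 for s in samples if s >= x) / n) for x in xs]

# Hypothetical consequence values (e.g. health-effect counts) from
# 8 simulated release sequences.
consequences = [3, 7, 7, 12, 20, 20, 20, 45]
for x, p in empirical_ccdf(consequences):
    print(x, p)
# prints: 3 1.0 / 7 0.875 / 12 0.625 / 20 0.5 / 45 0.125
```

Plotting P(X >= x) against x on log axes is the usual way such curves are presented when comparing accident consequence codes.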

  5. Nanoscale distribution of presynaptic Ca(2+) channels and its impact on vesicular release during development.

    Science.gov (United States)

    Nakamura, Yukihiro; Harada, Harumi; Kamasawa, Naomi; Matsui, Ko; Rothman, Jason S; Shigemoto, Ryuichi; Silver, R Angus; DiGregorio, David A; Takahashi, Tomoyuki

    2015-01-07

    Synaptic efficacy and precision are influenced by the coupling of voltage-gated Ca(2+) channels (VGCCs) to vesicles. But because the topography of VGCCs and their proximity to vesicles is unknown, a quantitative understanding of the determinants of vesicular release at nanometer scale is lacking. To investigate this, we combined freeze-fracture replica immunogold labeling of Cav2.1 channels, local [Ca(2+)] imaging, and patch pipette perfusion of EGTA at the calyx of Held. Between postnatal day 7 and 21, VGCCs formed variable sized clusters and vesicular release became less sensitive to EGTA, whereas fixed Ca(2+) buffer properties remained constant. Experimentally constrained reaction-diffusion simulations suggest that Ca(2+) sensors for vesicular release are located at the perimeter of VGCC clusters (<30 nm) and predict that VGCC number per cluster determines vesicular release probability without altering release time course. This "perimeter release model" provides a unifying framework accounting for developmental changes in both synaptic efficacy and time course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  6. New polyvinyl chloride (PVC) nanocomposite consisting of aromatic polyamide and chitosan modified ZnO nanoparticles with enhanced thermal stability, low heat release rate and improved mechanical properties

    Science.gov (United States)

    Hajibeygi, Mohsen; Maleki, Mahdiye; Shabanian, Meisam; Ducos, Franck; Vahabi, Henri

    2018-05-01

    New ternary nanocomposite systems containing polyvinyl chloride (PVC), chitosan-modified ZnO (CMZN) nanoparticles, and a newly synthesized polyamide (PA) were designed and prepared by the solution casting method. As a potential reinforcement, CMZN was used in the PVC system both with and without PA. Morphology and the mechanical, thermal, and combustion properties of all the PVC systems were studied. In the presence of the CMZN, PA showed a synergistic effect on the improvement of all investigated properties of PVC. The 5 mass% loss temperature (T5) increased from 195 °C to 243 °C in the PVC/CMZN-PA nanocomposite containing 1 mass% each of PA and CMZN (PZP 2). The peak heat release rate decreased from 131 W/g for PVC to 104 W/g for the PVC/CMZN-PA nanocomposite containing 3 mass% each of PA and CMZN (PZP 6). According to the tensile tests, compared with neat PVC, the tensile strength increased from 35.4 to 53.4 MPa for PZP 6.

  7. CONTAINER DISTRIBUTION AND SLOW RELEASE FERTILIZERS APPLICATION ALONG THE PRE-NURSERY INFLUENCING OIL PALM SEEDLINGS GROWTH

    Directory of Open Access Journals (Sweden)

    Paulo César Teixeira

    2009-09-01

    This research aimed to determine how the type and dosage of slow-release fertilizers (SRF) and the percentage of tray occupation by plastic containers during the pre-nursery stage influence growth, nutrition, and dry-matter partitioning in oil palm seedlings. The experiment consisted of 16 treatments in a factorial scheme: two types of SRF (Osmocote® and Basacote mini), two dosages (0 and 3 kg/m3), and four container-distribution schemes giving 100%, 66%, 50%, and 25% tray occupation. An additional treatment composed of 15 x 15 cm plastic bags filled with soil was added. Pre-germinated oil palm seeds were placed in 120 cm3 plastic containers with substratum and in plastic bags containing soil. After three months, the seedlings were transplanted to 40 x 40 cm plastic bags containing soil; at this time, height, diameter, dry matter, and concentrations of N, P, K, Ca, and Mg were evaluated. Seedlings were evaluated again for height and diameter after 10 months, and for height, diameter, and dry-matter weight after 16 months. Addition of SRF was fundamental for seedling development. The percentage of tray occupation by containers during the pre-nursery did not influence the height or diameter of oil palm seedlings at 10 or 16 months. At 10 months, plants fertilized with Osmocote® were taller than those fertilized with Basacote mini. At 16 months, plants fertilized during the pre-nursery had greater height, diameter, and leaflet, leaf, aboveground, and total dry matter than unfertilized plants.

  8. A consistent NPMLE of the joint distribution function with competing risks data under the dependent masking and right-censoring model.

    Science.gov (United States)

    Li, Jiahui; Yu, Qiqing

    2016-01-01

    Dinse (Biometrics, 38:417-431, 1982) provides a special type of right-censored and masked competing risks data and proposes a non-parametric maximum likelihood estimator (NPMLE) and a pseudo MLE of the joint distribution function [Formula: see text] with such data. However, their asymptotic properties have not been studied so far. Under the extension of either the conditional masking probability (CMP) model or the random partition masking (RPM) model (Yu and Li, J Nonparametr Stat 24:753-764, 2012), we show that (1) Dinse's estimators are consistent if [Formula: see text] takes on finitely many values and each point in the support set of [Formula: see text] can be observed; (2) if the failure time is continuous, the NPMLE is not uniquely determined, and the standard approach (which puts weights only on one element in each observed set) leads to an inconsistent NPMLE; (3) in general, Dinse's estimators are not consistent even under the discrete assumption; (4) we construct a consistent NPMLE. The consistency is given under a new model called the dependent masking and right-censoring model. The CMP model and the RPM model are indeed special cases of the new model. We compare our estimator to Dinse's estimators through simulation and real data. The simulation study indicates that the consistent NPMLE is a good approximation to the underlying distribution for moderate sample sizes.

  9. Electric ignition energy evaluation and the energy distribution structure of energy released in electrostatic discharge process

    International Nuclear Information System (INIS)

    Liu Qingming; Huang Jinxiang; Shao Huige; Zhang Yunming

    2017-01-01

    Ignition energy is one of the important parameters of flammable materials, and evaluating it precisely is essential to process-industry safety and to combustion science and technology. Using an electric spark discharge test system, a series of electric spark discharge experiments was conducted with capacitor-stored energies on the order of 10 J, 100 J, and 1000 J. Methods for evaluating the energy consumed by the electric spark, the wire, and the switch during the capacitor discharge process were studied. The resistances of the wire, the switch, and the plasma between the electrodes were evaluated by different methods, and an optimized evaluation method was obtained. The electric energy consumed by the wire, the switch, and the spark-induced plasma between the electrodes was obtained, and the structure of the capacitor-released energy was analyzed. The dynamics and characteristic parameters (maximum power, duration) of the discharge process were analyzed. Experimental results showed that the energy consumed by the electric spark accounts for only 8%-14% of the capacitor-released energy. As the capacitor-released energy increases, the discharge process lasts longer and the plasma accounts for a larger share of the released energy. The power of the electric spark varies with time as a damped sinusoid, whose period and maximum value increase with the capacitor-released energy. (paper)
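The reported energy partition can be rationalized with the series-circuit picture of the discharge loop: each resistance in the loop dissipates energy in proportion to its share of the total loop resistance. The sketch below uses invented component values, and it treats the spark's plasma resistance as a fixed lumped element, which a real time-varying plasma is not:

```python
# Illustrative values only; real loop resistances must be measured,
# and the plasma resistance varies during the discharge.
C = 10e-6          # capacitor, farads
V0 = 5000.0        # charging voltage, volts
R_wire = 0.50      # ohms
R_switch = 0.40    # ohms
R_spark = 0.10     # ohms (spark plasma, treated as constant here)

E_total = 0.5 * C * V0 ** 2            # capacitor-stored energy, joules
R_loop = R_wire + R_switch + R_spark
E_spark = E_total * R_spark / R_loop   # share dissipated in the spark
print(E_total, E_spark, E_spark / E_total)
```

With these assumed values the spark receives 10% of the stored energy, in the same range as the 8%-14% the experiments report, which is why capacitor-stored energy alone overstates the true ignition energy.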

  10. Spatiotemporal distribution of radioactive cesium released from Fukushima Daiichi Nuclear Power Station in the sediment of Tokyo Bay, Japan

    International Nuclear Information System (INIS)

    Nakagawa, Ryota; Ishida, Masanobu; Baba, Daisuke; Tanimoto, Satomi; Okamoto, Yuichi; Yamazaki, Hideo

    2013-01-01

    The spatial and temporal distributions of 134Cs and 137Cs released from Fukushima Daiichi Nuclear Power Station in Tokyo Bay sediments were investigated. The total radioactivity of 134Cs and 137Cs detected in the Tokyo Bay sediment ranged from 240 to 870 Bq/kg-dry in the estuary of the Arakawa River, whereas the activities detected at other sites were about 90 Bq/kg-dry or less. These results suggested that radioactive cesium deposited on the ground was carried to the river along with clay particles by rainfall and transported to the estuary. The vertical distribution of radioactive cesium showed that it penetrated deeper than estimated from the sediment accumulation rate, apparently because the vertical distribution was affected by physical mixing of the sediments by tidal currents and floods and by bioturbation by benthos. (author)

  11. Short- and Long-Term Lead Release after Partial Lead Service Line Replacements in a Metropolitan Water Distribution System.

    Science.gov (United States)

    Deshommes, Elise; Laroche, Laurent; Deveau, Dominique; Nour, Shokoufeh; Prévost, Michèle

    2017-09-05

    Thirty-three households were monitored in a full-scale water distribution system to investigate the impact of recent and old partial lead service line replacements (PLSLRs), with sampling over a period of 1-20 months. Point-of-entry filters were installed to capture sporadic release of particulate lead from the lead service lines (LSLs). Mean concentrations increased immediately after PLSLRs, and erratic particulate lead spikes were observed over the 18-month post-PLSLR monitoring period. The mass of lead released during this time frame indicates the occurrence of galvanic corrosion and scale destabilization. System-wide, however, lead concentrations were lower in households with PLSLRs than in those with no replacement, especially for old PLSLRs. Nonetheless, 61% of PLSLR samples still exceeded 10 μg/L, reflecting the importance of implementing full LSL replacement and efficient risk communication. Acute concentrations measured immediately after PLSLRs demonstrate the need for appropriate flushing procedures to prevent lead poisoning.

  12. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.

  13. Distribution of sulfur aerosol precursors in the SPCZ released by continuous volcanic degassing at Ambrym, Vanuatu

    Science.gov (United States)

    Lefèvre, Jérôme; Menkes, Christophe; Bani, Philipson; Marchesiello, Patrick; Curci, Gabriele; Grell, Georg A.; Frouin, Robert

    2016-08-01

    The Melanesian Volcanic Arc (MVA) emits about 12 kT d-1 of sulfur dioxide (SO2) to the atmosphere from continuous passive (non-explosive) volcanic degassing, which contributes 20% of the global SO2 emission from volcanoes. Here we assess, from up-to-date and long-term observations, the SO2 emission of the Ambrym volcano, one of the dominant volcanoes in the MVA, and we investigate its role as a sulfate precursor in the regional distribution of aerosols, using both satellite observations and model results at 1° × 1° spatial resolution from WRF-Chem/GOCART. Without considering aerosol forcing on clouds, our model parameterizations for convection, vertical mixing and cloud properties provide a reliable chemical weather representation, making possible a cross-examination of model solution and observations. This preliminary work enables the identification of biases and limitations affecting both the model (missing sources) and satellite sensors and algorithms (for aerosol detection and classification) and leads to the implementation of improved transport and aerosol processes in the modeling system. On the one hand, the model confirms a 50% underestimation of SO2 emissions due to satellite swath sampling of the Ozone Monitoring Instrument (OMI), consistent with field studies. The OMI irregular sampling also produces a level of noise that impairs its monitoring capacity during short-term volcanic events. On the other hand, the model reveals a large sensitivity of aerosol composition and Aerosol Optical Depth (AOD) to choices of both the source function in WRF-Chem and size parameters for sea-salt in FlexAOD, the post-processor used to compute offline the simulated AOD. We then proceed to diagnosing the role of SO2 volcanic emission in the regional aerosol composition. The model shows that both dynamics and cloud properties associated with the South Pacific Convergence Zone (SPCZ) have a large influence on the oxidation of SO2 and on the transport pathways of

  14. Using self-consistent Gibbs free energy surfaces to calculate size distributions of neutral and charged clusters for the sulfuric acid-water binary system

    Science.gov (United States)

    Smith, J. A.; Froyd, K. D.; Toon, O. B.

    2012-12-01

    We construct tables of reaction enthalpies and entropies for the association reactions involving sulfuric acid vapor, water vapor, and the bisulfate ion. These tables are created from experimental measurements and quantum chemical calculations for molecular clusters and a classical thermodynamic model for larger clusters. These initial tables are not thermodynamically consistent. For example, the Gibbs free energy of associating a cluster consisting of one acid molecule and two water molecules depends on the order in which the cluster was assembled: add two waters and then the acid or add an acid and a water and then the second water. We adjust the values within the tables using the method of Lagrange multipliers to minimize the adjustments and produce self-consistent Gibbs free energy surfaces for the neutral clusters and the charged clusters. With the self-consistent Gibbs free energy surfaces, we calculate size distributions of neutral and charged clusters for a variety of atmospheric conditions. Depending on the conditions, nucleation can be dominated by growth along the neutral channel or growth along the ion channel followed by ion-ion recombination.
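The Lagrange-multiplier adjustment described above has a simple closed form when a single consistency constraint is imposed: the least-squares correction to the stepwise free energies is proportional to the constraint vector. A toy sketch follows; the ΔG values and the two assembly orders are invented for illustration, and the real tables couple many such thermodynamic cycles simultaneously:

```python
import numpy as np

# Hypothetical stepwise Gibbs free energies (kcal/mol) for assembling
# an acid-water cluster A·W2 by two orders:
#   path 1: A + w -> A·W,  A·W + w -> A·W2   (steps g1, g2)
#   path 2: w + w -> W2,   W2 + A -> A·W2    (steps g3, g4)
g = np.array([-2.1, -1.7, -1.0, -3.1])

# Path independence requires g1 + g2 = g3 + g4, i.e. a @ g = 0.
a = np.array([1.0, 1.0, -1.0, -1.0])

# Minimize ||dg||^2 subject to a @ (g + dg) = 0; one Lagrange
# multiplier gives dg proportional to the constraint vector a.
dg = -a * (a @ g) / (a @ a)
g_adj = g + dg
print(g_adj, a @ g_adj)
```

After the correction both assembly orders yield the same total Gibbs free energy, so cluster populations computed from the adjusted surface no longer depend on the assumed assembly sequence.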

  15. Investigation of protein distribution in solid lipid particles and its impact on protein release using coherent anti-Stokes Raman scattering microscopy

    DEFF Research Database (Denmark)

    Christophersen, Philip C.; Birch, Ditlev; Saarinen, Jukka

    2015-01-01

    The aim of this study was to gain new insights into protein distribution in solid lipid microparticles (SLMs) and subsequent release mechanisms using a novel label-free chemical imaging method, coherent anti-Stokes Raman scattering (CARS) microscopy. Lysozyme-loaded SLMs were prepared using...... in the solid lipid matrix, which required full lipolysis of the entire matrix to release lysozyme completely. Therefore, SLMs with lysozyme incorporated in an aqueous solution released lysozyme much faster than with lysozyme incorporated as a solid. In conclusion, CARS microscopy was an efficient and non......-destructive method for elucidating the distribution of lysozyme in SLMs. The interpretation of protein distribution and release during lipolysis enabled elucidation of protein release mechanisms. In future, CARS microscopy analysis could facilitate development of a wide range of protein-lipid matrices with tailor...

  16. Distribution of hydrocarbons released during the 2010 MC252 oil spill in deep offshore waters

    International Nuclear Information System (INIS)

    Spier, Chelsea; Stringfellow, William T.; Hazen, Terry C.; Conrad, Mark

    2013-01-01

    The explosion of the Deepwater Horizon oil platform on April 20th, 2010 resulted in the second largest oil spill in history. The distribution and chemical composition of hydrocarbons within a 45 km radius of the blowout was investigated. All available certified hydrocarbon data were acquired from NOAA and BP. The distribution of hydrocarbons was found to be dispersed over a wider area in subsurface waters than previously predicted or reported. A deepwater hydrocarbon plume predicted by models was verified and additional plumes were identified. Because the samples were not collected systematically, there is still some question about the presence and persistence of an 865 m depth plume predicted by models. Water soluble compounds were extracted from the rising oil in deepwater, and were found at potentially toxic levels outside of areas previously reported to contain hydrocarbons. Application of subsurface dispersants was found to increase hydrocarbon concentration in subsurface waters. - Highlights: ► The hydrocarbon distribution was more widely spread than previously predicted or reported. ► 4 subsurface plumes were identified. ► More soluble compounds were preferentially extracted in the deepwater. ► Percentage of detectable results is a useful data analysis technique. ► Subsurface dispersants application increased hydrocarbons in subsurface waters. - All available certified Deepwater Horizon data was used to determine the spatial, temporal, and chemical distribution of hydrocarbons in subsurface of the Gulf of Mexico.

  17. Diet, abundance and distribution as indices of turbot ( Psetta maxima L.) release habitat suitability

    DEFF Research Database (Denmark)

    Sparrevohn, Claus Reedtz; Støttrup, Josianne

    2008-01-01

    , natural abundance, and depth distribution within the habitats. A marked difference was found among habitats in the timing of the diet change from the suboptimal exoskeleton carrying prey items such as crustaceans to fish. The habitat where the wild turbot had the lowest occurrence of fish in their diet...

  18. Release the BEESTS: Bayesian Estimation of Ex-Gaussian STop-Signal Reaction Time Distributions

    Directory of Open Access Journals (Sweden)

    Dora eMatzke

    2013-12-01

    Full Text Available The stop-signal paradigm is frequently used to study response inhibition. In this paradigm, participants perform a two-choice response time task where the primary task is occasionally interrupted by a stop-signal that prompts participants to withhold their response. The primary goal is to estimate the latency of the unobservable stop response (stop signal reaction time or SSRT). Recently, Matzke, Dolan, Logan, Brown, and Wagenmakers (in press) have developed a Bayesian parametric approach that allows for the estimation of the entire distribution of SSRTs. The Bayesian parametric approach assumes that SSRTs are ex-Gaussian distributed and uses Markov chain Monte Carlo sampling to estimate the parameters of the SSRT distribution. Here we present an efficient and user-friendly software implementation of the Bayesian parametric approach —BEESTS— that can be applied to individual as well as hierarchical stop-signal data. BEESTS comes with an easy-to-use graphical user interface and provides users with summary statistics of the posterior distribution of the parameters as well as various diagnostic tools to assess the quality of the parameter estimates. The software is open source and runs on Windows and OS X operating systems. In sum, BEESTS allows experimental and clinical psychologists to estimate entire distributions of SSRTs and hence facilitates the more rigorous analysis of stop-signal data.
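
    The ex-Gaussian assumption can be illustrated with a quick simulation (this is not BEESTS itself, which uses Bayesian MCMC): an ex-Gaussian variate is the sum of a Gaussian and an independent exponential, and the parameters can be recovered from the first three moments. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative ex-Gaussian SSRT parameters (ms): Gaussian (mu, sigma) plus
# an independent exponential (tau).
mu, sigma, tau = 200.0, 30.0, 80.0
ssrt = rng.normal(mu, sigma, 200_000) + rng.exponential(tau, 200_000)

# Method-of-moments recovery: the skewness of an ex-Gaussian comes entirely
# from the exponential component (third central moment = 2 * tau**3).
m1 = ssrt.mean()
m2 = ssrt.var()
m3 = ((ssrt - m1) ** 3).mean()
tau_hat = np.cbrt(m3 / 2.0)
mu_hat = m1 - tau_hat
sigma_hat = np.sqrt(m2 - tau_hat**2)

print(mu_hat, sigma_hat, tau_hat)   # roughly (200, 30, 80)
```

A Bayesian treatment replaces the moment equations with a likelihood over (mu, sigma, tau) and samples the posterior, which is what allows hierarchical estimation across participants.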

  19. Monte carlo simulation of vesicular release, spatiotemporal distribution of glutamate in synaptic cleft and generation of postsynaptic currents.

    Science.gov (United States)

    Glavinovíc, M I

    1999-02-01

    The release of vesicular glutamate, spatiotemporal changes in glutamate concentration in the synaptic cleft and the subsequent generation of fast excitatory postsynaptic currents at a hippocampal synapse were modeled using the Monte Carlo method. It is assumed that glutamate is released from a spherical vesicle through a cylindrical fusion pore into the synaptic cleft and that S-alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) receptors are uniformly distributed postsynaptically. The time course of change in vesicular concentration can be described by a single exponential, but a slow tail is also observed though only following the release of most of the glutamate. The time constant of decay increases with vesicular size and a lower diffusion constant, and is independent of the initial concentration, becoming markedly shorter for wider fusion pores. The cleft concentration at the fusion pore mouth is not negligible compared to vesicular concentration, especially for wider fusion pores. Lateral equilibration of glutamate is rapid, and within approximately 50 μs all AMPA receptors on average see the same concentration of glutamate. Nevertheless the single-channel current and the number of channels estimated from mean-variance plots are unreliable and different when estimated from rise- and decay-current segments. Greater saturation of AMPA receptor channels provides higher but not more accurate estimates. Two factors contribute to the variability of postsynaptic currents and render the mean-variance nonstationary analysis unreliable, even when all receptors see on average the same glutamate concentration. Firstly, the variability of the instantaneous cleft concentration of glutamate, unlike the mean concentration, first rapidly decreases before slowly increasing; the variability is greater for fewer molecules in the cleft and is spatially nonuniform. Secondly, the efficacy with which glutamate produces a response changes with time. Understanding
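
    The rapid lateral equilibration reported above (on the order of 50 μs) can be sketched with a toy Monte Carlo random walk in the cleft plane. The diffusion constant and time scales below are rough literature magnitudes, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Monte Carlo of lateral glutamate diffusion in the synaptic cleft plane.
# D ~ 0.3 um^2/ms is a rough textbook magnitude, not the paper's value.
D = 3e-4                       # um^2 per us
n, dt, steps = 5000, 1.0, 50   # molecules, time step (us), number of steps

pos = np.zeros((n, 2))         # all molecules start at the release site
for _ in range(steps):
    # independent Gaussian displacements per axis: std = sqrt(2 * D * dt)
    pos += rng.normal(0.0, np.sqrt(2.0 * D * dt), size=pos.shape)

# Einstein relation in two dimensions: MSD = 4 * D * t
msd = (pos ** 2).sum(axis=1).mean()
print(msd, 4.0 * D * steps * dt)   # ~0.06 um^2 after 50 us, rms ~0.25 um
```

An rms spread of a quarter micron after 50 μs is comparable to the size of a postsynaptic density, consistent with the abstract's observation that all receptors soon see the same mean concentration.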

  20. Reduction of spatial distribution of risk factors for transportation of contaminants released by coal mining activities.

    Science.gov (United States)

    Karan, Shivesh Kishore; Samadder, Sukha Ranjan

    2016-09-15

    It is reported that the water-energy nexus composes two of the biggest development and human health challenges. In the present study we present a Risk Potential Index (RPI) model which encapsulates Source, Vector (Transport), and Target risks for forecasting surface water contamination. The main aim of the model is to identify critical surface water risk zones for an open cast mining environment, taking Jharia Coalfield, India as the study area. The model also helps in feasible sampling design. Based on spatial analysis, various risk zones were successfully delineated. Monthly RPI distribution revealed that the risk of surface water contamination was highest during the monsoon months. Surface water samples were analysed to validate the model. A GIS-based alternative management option was proposed to reduce surface water contamination risk, yielding decreases of 96% and 86% in the spatial distribution of very high risk areas for the months of June and July, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Levels, distribution and bioavailability of transuranic elements released in the Palomares accident (Spain).

    Science.gov (United States)

    Jiménez-Ramos, M C; Vioque, I; García-Tenorio, R; García León, M

    2008-11-01

    The current levels and distribution of the remaining transuranic contamination present in the terrestrial area affected by the nuclear Palomares accident have been evaluated through the determination of the Pu-isotopes and (241)Am concentrations in soils collected 35 years after the accident. In addition, after confirming that most of the contamination is present in particulate form, some bioavailability laboratory-based experiments, based on the use of single extractants, were performed as an essential step in order to study the behaviour of the Pu contamination in the soils from the affected areas.

  2. Expression and distribution of octopus gonadotropin-releasing hormone in the central nervous system and peripheral organs of the octopus (Octopus vulgaris) by in situ hybridization and immunohistochemistry.

    Science.gov (United States)

    Iwakoshi-Ukena, Eiko; Ukena, Kazuyoshi; Takuwa-Kuroda, Kyoko; Kanda, Atshuhiro; Tsutsui, Kazuyoshi; Minakata, Hiroyuki

    2004-09-20

    We recently purified a peptide with structural features similar to vertebrate gonadotropin-releasing hormone (GnRH) from the brain of Octopus vulgaris, cloned a cDNA encoding the precursor protein, and named it oct-GnRH. In the current study, we investigated the expression and distribution of oct-GnRH throughout the central nervous system (CNS) and peripheral organs of Octopus by in situ hybridization on the basis of the cDNA sequence and by immunohistochemistry using a specific antiserum against oct-GnRH. Oct-GnRH mRNA-expressing cell bodies were located in 10 of 19 lobes in the supraesophageal and subesophageal parts of the CNS. Several oct-GnRH-like immunoreactive fibers were seen in all the neuropils of the CNS lobes. The sites of oct-GnRH mRNA expression and the mature peptide distribution were consistent with each other as judged by in situ hybridization and immunohistochemistry. In addition, many immunoreactive fibers were distributed in peripheral organs such as the heart, the oviduct, and the oviducal gland. Modulatory effects of oct-GnRH on the contractions of the heart and the oviduct were demonstrated. The results suggested that, in the context of reproduction, oct-GnRH is a key peptide in the subpedunculate lobe and/or posterior olfactory lobe-optic gland-gonadal axis, an octopus analogue of the hypothalamo-hypophysial-gonadal axis. It may also act as a modulatory factor in controlling higher brain functions such as feeding, memory, movement, maturation, and autonomic functions

  3. Geographical distribution of radioactive nuclides released from the Fukushima Daiichi Nuclear Power Station accident in eastern Japan

    International Nuclear Information System (INIS)

    Ishida, Masanobu; Umetsu, Kohei; Sugimoto, Miyabi; Yamaguchi, Yuta; Yamazaki, Hideo; Nakagawa, Ryota

    2013-01-01

    The geographical distribution of radioactive nuclides released from the Fukushima Daiichi Nuclear Power Station accident in metropolitan areas located in eastern Japan was investigated. The radioactive contamination of environmental samples, including soil and biological materials, was analyzed. The concentrations of 131I, 134Cs, and 137Cs in the soil samples collected from Fukushima City were 122000, 11500 and 14000 Bq/kg on 19th March 2011 and 129000, 11000 and 13700 Bq/kg on 26th March 2011, for the three nuclides respectively. The concentrations of 131I, 134Cs and 137Cs in the soil samples collected from March-June 2011 from study sites ranged from 240 to 101000, 28 to 26200, and 14 to 33700 Bq/kg, respectively. In Higashiosaka City, these radioactive nuclides began to be detected in atmospheric airborne dust from 25th March. The radioactive fission products 95Zr-95Nb were detected on 18th April 2011. Biological samples collected from Tokyo Bay were studied. The maximum concentrations of 134Cs and 137Cs detected in the biological samples were 12.2 and 19.2 Bq/kg, which were measured in goby. 131I was not detected in the biological samples; however, trace amounts of the short half-life nuclide 110mAg were found in the shellfish samples. (author)

  4. Direct releases to the surface and associated complementary cumulative distribution functions in the 1996 performance assessment for the Waste Isolation Pilot Plant: cuttings, cavings and spallings

    International Nuclear Information System (INIS)

    Berglund, J.W.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; Smith, L.N.

    2000-01-01

    The following topics related to the treatment of cuttings, cavings and spallings releases to the surface environment in the 1996 performance assessment for the Waste Isolation Pilot Plant (WIPP) are presented: (i) mathematical description of models; (ii) uncertainty and sensitivity analysis results arising from subjective (i.e. epistemic) uncertainty for individual releases; (iii) construction of complementary cumulative distribution functions (CCDFs) arising from stochastic (i.e. aleatory) uncertainty; and (iv) uncertainty and sensitivity analysis results for CCDFs. The presented results indicate that direct releases due to cuttings, cavings and spallings do not constitute a serious threat to the effectiveness of the WIPP as a disposal facility for transuranic waste. Even when the effects of uncertain analysis inputs are taken into account, the CCDFs for cuttings, cavings and spallings releases fall substantially to the left of the boundary line specified in the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, 40 CFR 194)
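
    The comparison against the regulatory boundary can be sketched numerically: sample normalized releases from a Monte Carlo model, form the empirical complementary cumulative distribution function, and check it against the 40 CFR 191 containment boundary points (P[R > 1] <= 0.1 and P[R > 10] <= 0.001, in EPA normalized release units). The lognormal release model below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Monte Carlo sample of normalized cumulative releases R.
releases = rng.lognormal(mean=-6.0, sigma=1.5, size=100_000)

def ccdf(samples, r):
    """Estimate P(R > r) from the Monte Carlo sample."""
    return float((samples > r).mean())

# Check the empirical CCDF against the 40 CFR 191 boundary points.
for r, limit in [(1.0, 0.1), (10.0, 0.001)]:
    p = ccdf(releases, r)
    print(f"P(R > {r}) = {p:.2e}  (limit {limit})")
```

A release model whose CCDF passes below both boundary points "falls to the left of the boundary line" in the sense used in the abstract.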

  5. Direct releases to the surface and associated complementary cumulative distribution functions in the 1996 performance assessment for the Waste Isolation Pilot Plant: Cuttings, cavings and spallings

    International Nuclear Information System (INIS)

    Berglund, J.W.; Garner, J.W.; Helton, Jon Craig; Johnson, J.D.; Smith, L.N.; Anderson, R.P.

    2000-01-01

    The following topics related to the treatment of cuttings, cavings and spallings releases to the surface environment in the 1996 performance assessment for the Waste Isolation Pilot Plant (WIPP) are presented: (1) mathematical description of models, (2) uncertainty and sensitivity analysis results arising from subjective (i.e., epistemic) uncertainty for individual releases, (3) construction of complementary cumulative distribution functions (CCDFs) arising from stochastic (i.e., aleatory) uncertainty, and (4) uncertainty and sensitivity analysis results for CCDFs. The presented results indicate that direct releases due to cuttings, cavings and spallings do not constitute a serious threat to the effectiveness of the WIPP as a disposal facility for transuranic waste. Even when the effects of uncertain analysis inputs are taken into account, the CCDFs for cuttings, cavings and spallings releases fall substantially to the left of the boundary line specified in the US Environmental Protection Agency standard for the geologic disposal of radioactive waste (40 CFR 191, 40 CFR 194).

  6. Distribution-based estimates of minimum clinically important difference in cognition, arm function and lower body function after slow release-fampridine treatment of patients with multiple sclerosis

    DEFF Research Database (Denmark)

    Jensen, H B; Mamoei, Sepehr; Ravnborg, M.

    2016-01-01

    OBJECTIVE: To provide distribution-based estimates of the minimal clinical important difference (MCID) after slow release fampridine treatment on cognition and functional capacity in people with MS (PwMS). METHOD: MCID values were determined after SR-Fampridine treatment in 105 PwMS. Testing...

  7. Reaction dynamics of the four-centered elimination CH2OH+ --> CHO+ + H2: Measurement of kinetic energy release distribution and classical trajectory calculation

    Science.gov (United States)

    Lee, Tae Geol; Park, Seung C.; Kim, Myung Soo

    1996-03-01

    Mass-analyzed ion kinetic energy (MIKE) spectrum of CHO+ generated in the unimolecular dissociation of CH2OH+ was measured. Kinetic energy release distribution (KERD) was evaluated by analyzing the spectrum according to the algorithm developed previously. The average kinetic energy release evaluated from the distribution was extraordinarily large, 1.63 eV, corresponding to 75% of the reverse barrier of the reaction. A global analytical potential energy surface was constructed such that the experimental energetics was represented and that various features in the ab initio potential energy surface were closely reproduced. Classical trajectory calculation was carried out with the global analytical potential energy surface to investigate the causes for the extraordinarily large kinetic energy release. Based on the detailed dynamical calculations, it was found that the strained bending forces at the transition state and strengthening of the CO bond from double to triple bond character were mainly responsible for such a significant kinetic energy release. In addition, the dissociation products H2 and CHO+ ion were found to be rotationally excited in the trajectory calculations. This was attributed to the asymmetry of the transition state and the release of asymmetric bending forces. Also, the bending vibrational modes of CHO+ and the H2 stretching mode, which are coupled with the bending coordinates, were found to be moderately excited.

  8. Glass consistency and glass performance

    International Nuclear Information System (INIS)

    Plodinec, M.J.; Ramsey, W.G.

    1994-01-01

    Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has on long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not glass durability

  9. Releasing Pattern of Applied Phosphorus and Distribution Change of Phosphorus Fractions in the Acid Upland Soils with Successive Resin Extraction

    Directory of Open Access Journals (Sweden)

    Arief Hartono

    2008-05-01

    Full Text Available The releasing pattern of applied P in the acid upland soils and the soil properties influencing the pattern were studied. Surface horizons of six acid upland soils from Sumatra, Java and Kalimantan were used in this study. The releasing pattern of applied P (300 mg P kg-1) of these soils was studied by successive resin extraction. P fractionation was conducted to evaluate which fractions released P to the soil solution after successive resin extraction. The cumulative resin-P inorganic (Pi) release of the soils was fitted to first-order kinetics. Regression analyses using factor scores obtained from the previous principal components analyses were applied to determine the soil properties influencing the P releasing pattern. The results suggested that the maximum P release was significantly (P < 0.05) increased by the acidity plus 1.4 nm mineral-related factor (PC2, i.e. exchangeable Al and 1.4 nm minerals (smectite and vermiculite)) and decreased by the oxide-related factor (PC1, i.e. aluminum (Al) plus 1/2 iron (Fe) (by ammonium oxalate), crystalline Al and Fe oxides, cation exchange capacity, and clay content). P fractionation analysis after successive resin extraction showed that both labile and less labile fractions, in the form of NaHCO3-Pi and NaOH-Pi respectively, can be transformed into resin-Pi when the most labile resin-Pi is depleted. Most of the P released in high-oxide soils was from the NaOH-Pi fraction while in low-oxide soils it was from NaHCO3-Pi. P release from the former fraction resulted in a lower maximum P release than that of the latter. When NaHCO3-Pi was high, NaOH-Pi was relatively more stable than NaHCO3-Pi despite resin-Pi removal. NaHCO3-Pi and NaOH-Pi are very important P fractions in replenishing resin-Pi in these acid upland soils.
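
    The first-order kinetic fit mentioned above, P(t) = Pmax * (1 - exp(-k t)), can be sketched on synthetic data; the true Pmax and k below are invented, not the paper's estimates, and a numpy-only grid search stands in for a nonlinear fitting routine.

```python
import numpy as np

rng = np.random.default_rng(3)

def first_order(t, p_max, k):
    """Cumulative first-order release: P(t) = Pmax * (1 - exp(-k t))."""
    return p_max * (1.0 - np.exp(-k * t))

t = np.arange(1.0, 11.0)                       # successive extraction steps
y = first_order(t, 180.0, 0.35) + rng.normal(0.0, 2.0, t.size)  # synthetic data

# Brute-force least squares over a (Pmax, k) grid.
pmax_grid = np.linspace(100.0, 260.0, 321)
k_grid = np.linspace(0.05, 0.80, 301)
P, K = np.meshgrid(pmax_grid, k_grid, indexing="ij")
sse = ((P[..., None] * (1.0 - np.exp(-K[..., None] * t)) - y) ** 2).sum(axis=-1)
i, j = np.unravel_index(sse.argmin(), sse.shape)
p_max_hat, k_hat = pmax_grid[i], k_grid[j]

print(p_max_hat, k_hat)   # close to the true (180, 0.35)
```

In practice a dedicated nonlinear least-squares routine (e.g. Levenberg-Marquardt) would replace the grid search; the grid keeps the sketch dependency-free.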

  10. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1991-11-01

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." The WIPP system's overall probability distribution was calculated for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events or features.
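
    The scenario-combination idea can be sketched as a probability-weighted mixture: by the law of total probability, the overall CCDF is the sum of each scenario's CCDF weighted by that scenario's probability. The scenario names, probabilities, and lognormal release models below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical scenario classes: P(scenario) and a lognormal (mu, sigma)
# model of the normalized release under that scenario.
scenarios = {
    "undisturbed":        (0.90, -8.0, 1.0),
    "borehole intrusion": (0.09, -4.0, 1.2),
    "borehole + brine":   (0.01, -2.5, 1.5),
}

def overall_ccdf(r, n=200_000):
    """P(release > r) combined across mutually exclusive scenarios."""
    return sum(p * float((rng.lognormal(mu, s, n) > r).mean())
               for p, mu, s in scenarios.values())

print(overall_ccdf(1.0))   # dominated by the rare intrusion scenarios
```

Note how a low-probability scenario with a heavy release tail can dominate the overall exceedance probability, which is why the intrusion scenarios drive the shape of the combined distribution.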

  11. Lexical processing and distributional knowledge in sound-spelling mapping in a consistent orthography: A longitudinal study of reading and spelling in dyslexic and typically developing children.

    Science.gov (United States)

    Marinelli, Chiara Valeria; Cellini, Pamela; Zoccolotti, Pierluigi; Angelelli, Paola

    This study examined the ability to master lexical processing and use knowledge of the relative frequency of sound-spelling mappings in both reading and spelling. Twenty-four dyslexic and dysgraphic children and 86 typically developing readers were followed longitudinally in 3rd and 5th grades. Effects of word regularity, word frequency, and probability of sound-spelling mappings were examined in two experimental tasks: (a) spelling to dictation; and (b) orthographic judgment. Dyslexic children showed larger regularity and frequency effects than controls in both tasks. Sensitivity to distributional information of sound-spelling mappings was already detected by third grade, indicating early acquisition even in children with dyslexia. Although with notable differences, knowledge of the relative frequencies of sound-spelling mapping influenced both reading and spelling. Results are discussed in terms of their theoretical and empirical implications.

  12. Investigation of Blade-row Flow Distributions in Axial-flow-compressor Stage Consisting of Guide Vanes and Rotor-blade Row

    Science.gov (United States)

    Mahoney, John J; Dugan, Paul D; Budinger, Raymond E; Goelzer, H Fred

    1950-01-01

    A 30-inch tip-diameter axial-flow compressor stage was investigated with and without rotor to determine individual blade-row performance, interblade-row effects, and outer-wall boundary-layer conditions. Velocity gradients at guide-vane outlet without rotor approximated design assumptions, when the measured variation of leaving angle was considered. With rotor in operation, Mach number and rotor-blade effects changed flow distribution leaving guide vanes and invalidated design assumption of radial equilibrium. Rotor-blade performance correlated interpolated two-dimensional results within 2 degrees, although tip stall was indicated in experimental and not two-dimensional results. Boundary-displacement thickness was less than 1.0 and 1.5 percent of passage height after guide vanes and after rotor, respectively, but increased rapidly after rotor when tip stall occurred.

  13. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with large-scale data management systems is to ensure consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka Dark Data). This system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain the internals and give some results.
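
    The core of such a consistency check is a set comparison between the catalog's view and a storage dump. A minimal sketch in the spirit of the Rucio consistency service (file names and actions are invented for illustration):

```python
# What the file catalog records vs. what is physically on the storage element.
catalog = {"data/f1", "data/f2", "data/f3"}
storage = {"data/f2", "data/f3", "data/f4"}

lost_files = catalog - storage   # registered but missing -> declare lost, trigger recovery
dark_data = storage - catalog    # present but unregistered ("Dark Data") -> delete

print(sorted(lost_files))   # ['data/f1']
print(sorted(dark_data))    # ['data/f4']
```

At production scale the two sides are periodic storage dumps and catalog snapshots, diffed per storage element, with the resulting lists fed to central recovery and deletion workflows.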

  14. Preparation and characterization of ibuprofen-loaded microspheres consisting of poly(3-hydroxybutyrate) and methoxy poly (ethylene glycol)-b-poly (D,L-lactide) blends or poly(3-hydroxybutyrate) and gelatin composites for controlled drug release

    Energy Technology Data Exchange (ETDEWEB)

    Bidone, Juliana; Melo, Ana Paula P. [Laboratorio de Farmacotecnica, Departamento de Ciencias Farmaceuticas, Universidade Federal de Santa Catarina, Florianopolis (Brazil); Bazzo, Giovana C. [Grupo de Estudos em Materiais Polimericos (POLIMAT), Departamento de Quimica, Universidade Federal de Santa Catarina, Florianopolis (Brazil); Carmignan, Francoise [Laboratorio de Farmacotecnica, Departamento de Ciencias Farmaceuticas, Universidade Federal de Santa Catarina, Florianopolis (Brazil); Soldi, Marli S.; Pires, Alfredo T.N. [Grupo de Estudos em Materiais Polimericos (POLIMAT), Departamento de Quimica, Universidade Federal de Santa Catarina, Florianopolis (Brazil); Lemos-Senna, Elenara [Laboratorio de Farmacotecnica, Departamento de Ciencias Farmaceuticas, Universidade Federal de Santa Catarina, Florianopolis (Brazil)], E-mail: lemos@ccs.ufsc.br

    2009-03-01

    Poly-(3-hydroxybutyrate) (P(3HB)) is a biodegradable and biocompatible polymer that has been used to obtain polymer-based drug carriers. However, due to the high crystallinity degree of this polymer, drug release from P(3HB) microspheres frequently occurs at excessive rates. In this study, two strategies for prolonging ibuprofen release from P(3HB)-based microspheres were tested: blending with poly(D,L-lactide)-b-polyethylene glycol (mPEG-PLA); and obtaining composite particles with gelatin (GEL). SEM micrographs showed particles that were spherical and had a rough surface. A slight decrease of the crystallinity degree of P(3HB) was observed only in the DSC thermogram obtained from unloaded microspheres prepared from the 1:1 P(3HB):mPEG-PLA blend. For IBF-loaded microspheres, a reduction of around 10 deg. C in the melting temperature of P(3HB) was observed, indicating that the crystalline structure of the polymer was affected in the presence of the drug. DSC studies also yielded evidence of the presence of a molecular dispersion coexisting with a crystalline dispersion of the drug in the matrix. Similar results were obtained from X-ray diffractograms. In spite of the 1:1 mPEG-PLA:P(3HB) blend having contributed to the reduction of the burst effect, a more controlled drug release was provided by the use of the 3:1 P(3HB):mPEG-PLA blend. This result indicated that particle hydration played an important role in the drug release. On the other hand, the preparation of P(3HB):GEL composite microspheres did not allow control of the IBF release.

  15. Preparation and characterization of ibuprofen-loaded microspheres consisting of poly(3-hydroxybutyrate) and methoxy poly (ethylene glycol)-b-poly (D,L-lactide) blends or poly(3-hydroxybutyrate) and gelatin composites for controlled drug release

    International Nuclear Information System (INIS)

    Bidone, Juliana; Melo, Ana Paula P.; Bazzo, Giovana C.; Carmignan, Francoise; Soldi, Marli S.; Pires, Alfredo T.N.; Lemos-Senna, Elenara

    2009-01-01

    Poly-(3-hydroxybutyrate) (P(3HB)) is a biodegradable and biocompatible polymer that has been used to obtain polymer-based drug carriers. However, due to the high crystallinity degree of this polymer, drug release from P(3HB) microspheres frequently occurs at excessive rates. In this study, two strategies for prolonging ibuprofen release from P(3HB)-based microspheres were tested: blending with poly(D,L-lactide)-b-polyethylene glycol (mPEG-PLA); and obtaining composite particles with gelatin (GEL). SEM micrographs showed particles that were spherical and had a rough surface. A slight decrease of the crystallinity degree of P(3HB) was observed only in the DSC thermogram obtained from unloaded microspheres prepared from the 1:1 P(3HB):mPEG-PLA blend. For IBF-loaded microspheres, a reduction of around 10 deg. C in the melting temperature of P(3HB) was observed, indicating that the crystalline structure of the polymer was affected in the presence of the drug. DSC studies also yielded evidence of the presence of a molecular dispersion coexisting with a crystalline dispersion of the drug in the matrix. Similar results were obtained from X-ray diffractograms. In spite of the 1:1 mPEG-PLA:P(3HB) blend having contributed to the reduction of the burst effect, a more controlled drug release was provided by the use of the 3:1 P(3HB):mPEG-PLA blend. This result indicated that particle hydration played an important role in the drug release. On the other hand, the preparation of P(3HB):GEL composite microspheres did not allow control of the IBF release

  16. Carbon 14 distribution in irradiated BWR fuel cladding and released carbon 14 after aqueous immersion of 6.5 years

    Energy Technology Data Exchange (ETDEWEB)

    Sakuragi, T. [Radioactive Waste Management Funding and Research Center, Tsukishima 1-15-7, Chuo City, Tokyo, 104-0052 (Japan); Yamashita, Y.; Akagi, M.; Takahashi, R. [TOSHIBA Corporation, Ukishima Cho 4-1, Kawasaki Ward, Kawasaki, 210-0862 (Japan)

    2016-07-01

    Spent fuel cladding which is highly activated and strongly contaminated is expected to be disposed of in an underground repository. A typical activation product in the activated metal waste is carbon 14 ({sup 14}C), which is mainly generated by the {sup 14}N(n,p){sup 14}C reaction and produces a significant exposure dose due to the large inventory, long half-life (5730 years), rapid release rate, and the speciation and consequent migration parameters. In the preliminary Japanese safety case, the release of radionuclides from the metal matrix is regarded as the corrosion-related congruent release, and the cladding oxide layer is regarded as a source of instant release fraction (IRF). In the present work, specific activity of {sup 14}C was measured using an irradiated BWR fuel cladding (Zircaloy-2, average rod burnup of 41.6 GWd/tU) which has an external oxide film having a thickness of 25.3 μm. The {sup 14}C specific activity of the base metal was 1.49*10{sup 4} Bq/g, which in the corresponding burnup is comparable to values in the existing literature, which were obtained from various irradiated claddings. Although the specific activity in oxide was 2.8 times the base metal activity due to the additive generation by the {sup 17}O(n,α){sup 14}C reaction, the {sup 14}C abundance in oxide was less than 10% of total inventory. A static leaching test using the cladding tube was carried out in an air-tight vessel filled with a deoxygenated dilute NaOH solution (pH of 12.5) at room temperature. After 6.5 years, {sup 14}C was found in each leachate fraction of gas phase and dissolved organics and inorganics, the total of which was less than 0.01% of the {sup 14}C inventory of the immersed cladding tube. A simple calculation based on the congruent release with Zircaloy corrosion has suggested that the 96.7% of released {sup 14}C was from the external oxide layer and 3.3% was from the base Zircaloy metal. However, both the {sup 14}C abundance and the low leaching rate

  17. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  18. 3D finite element analysis of stress distributions and strain energy release rates for adhesive bonded flat composite lap shear joints having pre-existing delaminations

    Energy Technology Data Exchange (ETDEWEB)

    Parida, S. K.; Pradhan, A. K. [Indian Institute of Technology, Bhubaneswar (India)

    2014-02-15

    The rate of propagation of embedded delamination in the strap adherend of a lap shear joint (LSJ) made of carbon/epoxy composites has been evaluated employing three-dimensional non-linear finite elements. The delamination has been presumed to pre-exist in the thin resin layer between the first and second plies of the strap adherend. The inter-laminar peel and shear stress distributions have been studied in detail and are seen to be predominantly three-dimensional in nature. The components of strain energy release rate (SERR) corresponding to the opening, sliding and cross-sliding modes of delamination are significantly different at the two fronts of the embedded delamination. The sequential release of multi-point constraint (MPC) finite elements in the vicinity of the delamination fronts makes it possible to simulate the growth of the delamination at either end. This simulation procedure can be utilized effectively to evaluate the structural integrity of bonded joints.

  19. A Monte Carlo procedure for the construction of complementary cumulative distribution functions for comparison with the EPA release limits for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.; Shiver, A.W.

    1994-10-01

    A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the US Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with stratified sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over stratified sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction.
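
    The sampling-based CCDF construction described in this record can be sketched in a few lines: draw scenario outcomes, sort them, and read exceedance probabilities off the empirical distribution. In the sketch below the lognormal release distribution and the limit R = 1 are illustrative assumptions, not values from the WIPP assessment.

```python
import random

def empirical_ccdf(samples):
    """Sorted (value, exceedance probability) pairs: P(X > value)."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (n - i - 1) / n) for i, x in enumerate(xs)]

random.seed(42)
# Hypothetical normalized releases from 10,000 sampled futures; the
# lognormal is an illustrative stand-in for mechanistic model output.
releases = [random.lognormvariate(-3.0, 1.5) for _ in range(10_000)]

ccdf = empirical_ccdf(releases)

# Probability that the normalized release exceeds the regulatory limit R = 1
p_exceed = sum(1 for r in releases if r > 1.0) / len(releases)
print(f"P(release > 1) ~ {p_exceed:.4f}")
```

    With more samples, the CCDF gains resolution in the low-probability tail, which is the advantage over stratified sampling that the abstract highlights.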

  20. A Monte Carlo procedure for the construction of complementary cumulative distribution functions for comparison with the EPA release limits for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.; Shiver, A.W.

    1994-10-01

    A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the US Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with stratified sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over stratified sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction

  1. Managing the consistency of distributed documents

    OpenAIRE

    Nentwich, C.

    2005-01-01

    Many businesses produce documents as part of their daily activities: software engineers produce requirements specifications, design models, source code, build scripts and more; business analysts produce glossaries, use cases, organisation charts, and domain ontology models; service providers and retailers produce catalogues, customer data, purchase orders, invoices and web pages. What these examples have in common is that the content of documents is often semantically relate...

  2. On the phenomenon of the fast release of energy in irradiated solid methane: discussion of models considering the local space distribution of energy

    International Nuclear Information System (INIS)

    Shabalin, E.P.

    1995-01-01

    A general idea of the phenomenon, the so-called 'burp', is provided. It is concluded that no justified and consistent theory of the burp phenomenon has yet been proposed. The paper criticizes the application of the classic theory of homogeneous chemical reactions and of thermal explosion to analyze burps, as has been common practice up to now. Two hypotheses to explain the features burps display are presented instead. Both concern the mechanism of storing and releasing energy, accounting for local non-uniformity of temperature and energy deposition. 11 refs., 4 figs

  3. Sorption-desorption processes of radioisotopes with solid materials from liquid releases and atmosphere deposits. The distribution coefficient (Ksub(d)), its uses, limitations, and practical applications

    International Nuclear Information System (INIS)

    Saas, Arsene

    1979-03-01

    The various sorption-desorption processes of radionuclides with environmental materials are presented. The parameters governing the distribution coefficient are reviewed in the light of various examples. The factors affecting equilibria between the different phases are: reaction time, concentration of the solid phase, water quality, salinity, competition between ions, concentration of radioisotopes or stable isotopes, pH of the mobile phase, particle diameter, chemical form of the radioisotopes, nature of the solid phase, temperature. The effects of the biological parameters on the distribution coefficient are discussed. Biological processes affect the main chemical transformations: mineralization, insolubilization, oxidation-reduction, complexation, ... The importance of these processes is demonstrated by a number of examples in various media. Finally, the practical use of Ksub(d) in the assessment of the environmental impact of radioactive releases is developed, with special emphasis on the limits of its use in siting studies and its essential interest in specifying pathways and capacity of a river system [fr

  4. Sensitivity Analysis of Flow and Temperature Distributions of Density Currents in a River-Reservoir System under Upstream Releases with Different Durations

    Directory of Open Access Journals (Sweden)

    Gang Chen

    2015-11-01

    Full Text Available A calibrated three-dimensional Environmental Fluid Dynamics Code model was applied to simulate unsteady flow patterns and temperature distributions in the Bankhead river-reservoir system in Alabama, USA. A series of sensitivity model runs were performed under daily repeated large releases (DRLRs) with different durations (2, 4 and 6 h) from Smith Dam Tailrace (SDT) when other model input variables were kept unchanged. The density currents in the river-reservoir system form at different reaches, are destroyed at upstream locations due to the flow momentum of the releases, and form again due to solar heating. DRLRs (140 m3/s) with longer durations push the bottom cold water further downstream and maintain a cooler bottom water temperature. For the 6-h DRLR, the momentum effect definitely reaches Cordova (~43.7 km from SDT). Positive bottom velocity (density currents moving downstream) is achieved 48.4%, 69.0% and 91.1% of the time with an average velocity of 0.017, 0.042 and 0.053 m/s at Cordova for the 2-h, 4-h and 6-h DRLR, respectively. Results show that DRLRs lasting for at least 4 h maintain lower water temperatures at Cordova. When the 4-h and 6-h DRLRs repeat for more than 6 and 10 days, respectively, bottom temperatures at Cordova become lower than those for the constant small release (2.83 m3/s). These large releases overwhelm the mixing effects due to inflow momentum and maintain temperature stratification at Cordova.

  5. Lateral hypothalamic thyrotropin-releasing hormone neurons: distribution and relationship to histochemically defined cell populations in the rat.

    Science.gov (United States)

    Horjales-Araujo, E; Hellysaz, A; Broberger, C

    2014-09-26

    The lateral hypothalamic area (LHA) constitutes a large component of the hypothalamus, and has been implicated in several aspects of motivated behavior. The LHA is of particular relevance to behavioral state control and the maintenance of arousal. Due to the cellular heterogeneity of this region, however, only some subpopulations of LHA cells have been properly anatomically characterized. Here, we have focused on cells expressing thyrotropin-releasing hormone (TRH), a peptide found in the LHA that has been implicated as a promoter of arousal. Immunofluorescence and in situ hybridization were used to map the LHA TRH population in the rat, and cells were observed to form a large ventral cluster that extended throughout almost the entire rostro-caudal axis of the hypothalamus. Almost no examples of coexistence were seen when sections were double-stained for TRH and markers of other LHA populations, including the peptides hypocretin/orexin, melanin-concentrating hormone and neurotensin. In the juxtaparaventricular area, however, a discrete group of TRH-immunoreactive cells were also stained with antisera against enkephalin and urocortin 3. Innervation from the metabolically sensitive hypothalamic arcuate nucleus was investigated by double-staining for peptide markers of the two centrally projecting groups of arcuate neurons, agouti gene-related peptide and α-melanocyte-stimulating hormone, respectively; both populations of terminals were observed forming close appositions on TRH cells in the LHA. The present study indicates that TRH-expressing cells form a unique population in the LHA that may serve as a link between metabolic signals and the generation of arousal. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  6. A Bayesian method for characterizing distributed micro-releases: II. inference under model uncertainty with short time-series data.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef; Fast P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M. (Peterson AFB, CO); Ray, J. P.

    2006-01-01

    Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error: situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
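
    A heavily simplified sketch of this style of inverse problem: assume the release time is known, pick an illustrative lognormal incubation-period model (the parameters below are made up, not calibrated anthrax values), and evaluate a binomial likelihood for the daily symptomatic counts over a grid of candidate infected-population sizes. The report's actual model is richer (it also infers infection time and dose).

```python
import math
from math import comb, erf, log, sqrt

# Lognormal incubation-period CDF (days); MU and SIGMA are illustrative.
MU, SIGMA = 2.3, 0.4   # median exp(2.3) ~ 10 days

def lognorm_cdf(x):
    if x <= 0:
        return 0.0
    return 0.5 * (1.0 + erf((log(x) - MU) / (SIGMA * sqrt(2.0))))

# Hypothetical counts of newly symptomatic patients on days 1..5 after a
# known release time.
observed = [0, 0, 1, 3, 6]

def log_likelihood(n_infected):
    """Binomial log-likelihood of the daily counts given N infected people."""
    ll = 0.0
    for day, c in enumerate(observed, start=1):
        p = lognorm_cdf(day) - lognorm_cdf(day - 1)  # P(symptoms on this day)
        if c > n_infected:
            return float("-inf")
        ll += log(comb(n_infected, c)) + c * log(p) + (n_infected - c) * math.log1p(-p)
    return ll

# Grid posterior over N under a flat prior; the mode is the point estimate.
grid = range(10, 2001)
logpost = {n: log_likelihood(n) for n in grid}
n_mode = max(logpost, key=logpost.get)
print("posterior mode for the number infected:", n_mode)
```

    Even with only five days of counts the posterior concentrates, which mirrors the abstract's finding that 3-5 days of data are usually sufficient when the outbreak model is accurate.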

  7. Time-dependent density functional theory (TD-DFT) coupled with reference interaction site model self-consistent field explicitly including spatial electron density distribution (RISM-SCF-SEDD)

    Energy Technology Data Exchange (ETDEWEB)

    Yokogawa, D., E-mail: d.yokogawa@chem.nagoya-u.ac.jp [Department of Chemistry, Graduate School of Science, Nagoya University, Chikusa, Nagoya 464-8602 (Japan); Institute of Transformative Bio-Molecules (WPI-ITbM), Nagoya University, Chikusa, Nagoya 464-8602 (Japan)

    2016-09-07

    A theoretical approach to designing bright bio-imaging molecules is one of the most rapidly progressing areas. However, because of the system size and the required computational accuracy, the number of theoretical studies is, to our knowledge, limited. To overcome these difficulties, we developed a new method based on the reference interaction site model self-consistent field explicitly including spatial electron density distribution and time-dependent density functional theory. We applied it to the calculation of indole and 5-cyanoindole in the ground and excited states in gas and solution phases. The changes in the optimized geometries were clearly explained with resonance structures, and the Stokes shift was correctly reproduced.

  8. Ultraviolet B irradiation induces changes in the distribution and release of arachidonic acid, dihomo-gamma-linolenic acid, and eicosapentaenoic acid in human keratinocytes in culture

    International Nuclear Information System (INIS)

    Punnonen, K.; Puustinen, T.; Jansen, C.T.

    1987-01-01

    There is increasing evidence that derivatives of 20-carbon polyunsaturated fatty acids, the eicosanoids, play an important role in the inflammatory responses of the human skin. To better understand the metabolic fate of fatty acids in the skin, the effect of ultraviolet B (UVB) irradiation (280-320 nm) on the distribution and release of {sup 14}C-labeled arachidonic acid, dihomo-gamma-linolenic acid, and eicosapentaenoic acid in human keratinocytes in culture was investigated. Ultraviolet B irradiation induced the release of all three {sup 14}C-labeled fatty acids from the phospholipids, especially from phosphatidylethanolamine, and this was accompanied by increased labeling of the nonphosphorus lipids. This finding suggests that UVB induces a significant liberation of eicosanoid precursor fatty acids from cellular phospholipids, but the liberated fatty acids are largely reincorporated into the nonphosphorus lipids. In conclusion, the present study suggests that not only arachidonic acid but also dihomo-gamma-linolenic acid and eicosapentaenoic acid might be involved in the UVB irradiation-induced inflammatory reactions of human skin

  9. Fission Fragment Mass Distributions and Total Kinetic Energy Release of 235-Uranium and 238-Uranium in Neutron-Induced Fission at Intermediate and Fast Neutron Energies

    Energy Technology Data Exchange (ETDEWEB)

    Duke, Dana Lynn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-11-12

    This Ph.D. dissertation describes a measurement of the change in mass distributions and average total kinetic energy (TKE) release with increasing incident neutron energy for fission of 235U and 238U. Although fission was discovered over seventy-five years ago, open questions remain about the physics of the fission process. The energy of the incident neutron, En, changes the division of energy release in the resulting fission fragments; however, the details of energy partitioning remain ambiguous because the nucleus is a many-body quantum system. Creating a full theoretical model is difficult, and experimental data to validate existing models are lacking. Additional fission measurements will lead to higher-quality models of the fission process, therefore improving applications such as the development of next-generation nuclear reactors and defense. This work also paves the way for precision experiments such as the Time Projection Chamber (TPC) for fission cross section measurements and the Spectrometer for Ion Determination in Fission (SPIDER) for precision mass yields.

  10. Effect of radiative transfer of heat released from combustion reaction on temperature distribution: A numerical study for a 2-D system

    International Nuclear Information System (INIS)

    Zhou Huaichun; Ai Yuhua

    2006-01-01

    Both light and heat are produced during a chemical reaction in a combustion process, but traditionally all the energy released is taken to be transformed into the internal energy of the combustion medium. The temperature of the medium thus increases, and the thermal radiation emitted from it increases too. Chemiluminescence is generated during a chemical reaction, is independent of the temperature, and has been used widely for combustion diagnostics. It was assumed in this paper that the total energy released in a combustion reaction is divided into two parts: one part is self-absorbed heat, and the other is directly emitted heat. The former is absorbed immediately by the products, becomes internal energy and then increases the temperature of the products as treated in the traditional way. The latter is emitted directly as radiation into the combustion domain and should be included in the radiative transfer equation (RTE) as a part of the radiation source. For a simple, 2-D, gray, emitting-absorbing, rectangular system, the numerical study showed that the temperatures in reaction zones depend on the fraction of directly emitted energy, and the smaller the gas absorption coefficient, the stronger the dependence. Because the effect of the fraction of directly emitted heat on the temperature distribution in the reacting zones of gas combustion is significant, experimental measurements are required to determine the fraction of self-absorbed heat for different combustion processes

  11. Modeling benthic–pelagic nutrient exchange processes and porewater distributions in a seasonally hypoxic sediment: evidence for massive phosphate release by Beggiatoa?

    Directory of Open Access Journals (Sweden)

    K. Wallmann

    2013-02-01

    Full Text Available This study presents benthic data from 12 samplings from February to December 2010 in a 28 m deep channel in the southwest Baltic Sea. In winter, the distribution of solutes in the porewater was strongly modulated by bioirrigation which efficiently flushed the upper 10 cm of sediment, leading to concentrations which varied little from bottom water values. Solute pumping by bioirrigation fell sharply in the summer as the bottom waters became severely hypoxic. At this point the giant sulfide-oxidizing bacteria Beggiatoa was visible on surface sediments. Despite an increase in O2 following mixing of the water column in November, macrofauna remained absent until the end of the sampling. Contrary to expectations, metabolites such as dissolved inorganic carbon, ammonium and hydrogen sulfide did not accumulate in the upper 10 cm during the hypoxic period when bioirrigation was absent, but instead tended toward bottom water values. This was taken as evidence for episodic bubbling of methane gas out of the sediment acting as an abiogenic irrigation process. Porewater–seawater mixing by escaping bubbles provides a pathway for enhanced nutrient release to the bottom water and may exacerbate the feedback with hypoxia. Subsurface dissolved phosphate (TPO4) peaks in excess of 400 μM developed in autumn, resulting in a very large diffusive TPO4 flux to the water column of 0.7 ± 0.2 mmol m−2 d−1. The model was not able to simulate this TPO4 source as release of iron-bound P (Fe–P) or organic P. As an alternative hypothesis, the TPO4 peak was reproduced using new kinetic expressions that allow Beggiatoa to take up porewater TPO4 and accumulate an intracellular P pool during periods with oxic bottom waters. TPO4 is then released during hypoxia, as previously published results with sulfide-oxidizing bacteria indicate. The TPO4 added to the porewater over the year by organic P and Fe–P is recycled through Beggiatoa, meaning that no additional source of
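
    A diffusive flux of the kind quoted above follows from Fick's first law applied to the porewater gradient. The sketch below reproduces the order of magnitude with illustrative values for porosity, effective diffusivity, and the gradient; these are assumptions, not the study's data.

```python
# Fick's first law estimate of a diffusive TPO4 flux across the
# sediment-water interface (illustrative numbers, not the study's data).
phi = 0.9        # porosity (dimensionless)
D_sed = 5e-10    # effective diffusivity of phosphate in sediment, m^2/s
dC = 400e-3      # concentration difference, mol/m^3 (400 uM peak vs ~0 above)
dz = 0.05        # depth of the subsurface peak below the interface, m

flux = phi * D_sed * dC / dz           # mol m^-2 s^-1
flux_mmol_per_day = flux * 1e3 * 86400  # convert to mmol m^-2 d^-1
print(f"diffusive flux ~ {flux_mmol_per_day:.2f} mmol m^-2 d^-1")
```

    With these assumed values the estimate lands within a factor of a few of the 0.7 mmol m−2 d−1 reported, showing why a 400 μM subsurface peak implies a very large benthic phosphate release.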

  12. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  13. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  14. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  15. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering Based on the Newly Developed Self-consistent RC/EMIC Waves Model by Khazanov et al. [2006]

    Science.gov (United States)

    Khazanov, G. V.; Gallagher, D. L.; Gamayunov, K.

    2007-01-01

    It is well known that the effects of EMIC waves on RC ion and RB electron dynamics strongly depend on such particle/wave characteristics as the phase-space distribution function, frequency, wave-normal angle, wave energy, and the form of the wave spectral energy density. Therefore, realistic characteristics of EMIC waves should be properly determined by modeling the RC-EMIC wave evolution self-consistently. Such a self-consistent model has been progressively developed by Khazanov et al. [2002-2006]. It solves a system of two coupled kinetic equations: one equation describes the RC ion dynamics and the other describes the energy density evolution of EMIC waves. Using this model, we present the effectiveness of relativistic electron scattering and compare our results with previous work in this area of research.

  16. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of their argument to more complex systems.

  17. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are found in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)

  18. Distribution

    Science.gov (United States)

    John R. Jones

    1985-01-01

    Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....

  19. Expectation Consistent Approximate Inference

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2005-01-01

    We propose a novel framework for approximations to intractable probabilistic models which is based on a free energy formulation. The approximation can be understood from replacing an average over the original intractable distribution with a tractable one. It requires two tractable probability dis...

  20. Modelling isothermal fission gas release

    International Nuclear Information System (INIS)

    Uffelen, P. van

    2002-01-01

    The present paper presents a new fission gas release model consisting of two coupled modules. The first module treats the behaviour of the fission gas atoms in spherical grains with a distribution of grain sizes. This module considers single atom diffusion, trapping and fission induced re-solution of gas atoms associated with intragranular bubbles, and re-solution from the grain boundary into a few layers adjacent to the grain face. The second module considers the transport of the fission gas atoms along the grain boundaries. Four mechanisms are incorporated: diffusion controlled precipitation of gas atoms into bubbles, grain boundary bubble sweeping, re-solution of gas atoms into the adjacent grains and gas flow through open porosity when grain boundary bubbles are interconnected. The interconnection of the intergranular bubbles is affected both by the fraction of the grain face occupied by the cavities and by the balance between the bubble internal pressure and the hydrostatic pressure surrounding the bubbles. The model is under validation. In a first step, some numerical routines have been tested by means of analytic solutions. In a second step, the fission gas release model has been coupled with the FTEMP2 code of the Halden Reactor Project for the temperature distribution in the pellets. A parametric study of some steady-state irradiations and one power ramp have been simulated successfully. In particular, the Halden threshold for fission gas release and two simplified FUMEX cases have been computed and are summarised. (author)
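
    The single-atom diffusion in spherical grains described in the first module is in the spirit of the classic Booth equivalent-sphere model. The sketch below is an assumption-laden simplification, ignoring trapping, re-solution, and grain-boundary transport, with invented diffusivity and grain-size values; it shows only how a release fraction can be averaged over a grain-size distribution.

```python
import math

def booth_release_fraction(D, t, a):
    """Booth approximation for the fractional fission-gas release from a
    spherical grain of radius a (valid for small release fractions)."""
    tau = D * t / a**2                       # dimensionless diffusion time
    f = 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau
    return min(f, 1.0)

# Hypothetical values: diffusivity in m^2/s, grain radii in m, time in s
D = 1.0e-21
t = 3.15e7                       # about one year
radii = [4e-6, 5e-6, 6e-6]       # a simple three-bin grain-size distribution
weights = [0.25, 0.5, 0.25]      # volume fractions of each bin

# Release fraction averaged over the grain-size distribution
f_avg = sum(w * booth_release_fraction(D, t, a) for a, w in zip(radii, weights))
print(f"average release fraction: {f_avg:.4f}")
```

    Smaller grains release a larger fraction for the same diffusivity and time, which is why a distribution of grain sizes, rather than a single equivalent radius, matters in the model described above.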

  1. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
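
    The Shapley rule can be illustrated with a generic permutation-based computation. The max-delay cost game below is a toy stand-in chosen for illustration, not the paper's exact PERT model: a coalition of delayed activities is assumed to delay the project by the maximum of its members' delays.

```python
from itertools import permutations

def shapley(players, cost):
    """Shapley values via averaging marginal costs over all orderings."""
    totals = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            totals[p] += cost(coalition | {p}) - cost(coalition)
            coalition = coalition | {p}
    return {p: v / len(orders) for p, v in totals.items()}

# Toy delay-cost game (hypothetical): the delay a coalition of activities
# imposes on the project is the maximum of its members' individual delays.
delays = {"A": 2, "B": 3, "C": 5}

def cost(coalition):
    return max((delays[p] for p in coalition), default=0)

phi = shapley(list(delays), cost)
print(phi)
```

    The allocations sum to the full project delay cost of 5, and the activity with the largest delay bears the largest share, as one would expect from a fair cost-sharing rule.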

  2. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  3. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  4. Molecular cloning and distribution of oxytocin/vasopressin-like mRNA in the blue swimming crab, Portunus pelagicus, and its inhibitory effect on ovarian steroid release.

    Science.gov (United States)

    Saetan, Jirawat; Kruangkum, Thanapong; Phanthong, Phetcharat; Tipbunjong, Chittipong; Udomuksorn, Wandee; Sobhon, Prasert; Sretarugsa, Prapee

    2018-04-01

    This study aimed to characterize the full-length oxytocin/vasopressin (OT/VP)-like mRNA in female Portunus pelagicus (PpelOT/VP-like mRNA), using a partial PpelOT/VP-like sequence obtained previously in our transcriptome analysis (Saetan, 2014) to construct the primers. The PpelOT/VP-like mRNA was 626 bp long and encoded a preprohormone of 158 amino acids. This preprohormone consisted of a signal peptide, an active nonapeptide (CFITNCPPG) followed by the dibasic cleavage site (GKR), and the neurophysin domain. Sequence alignment of the PpelOT/VP-like peptide with those of other animals revealed strong molecular conservation. Phylogenetic analysis of encoded proteins revealed that the PpelOT/VP-like peptide clustered within the group of crustacean OT/VP-like peptides. Analysis by RT-PCR revealed the expression of mRNA transcripts in the eyestalk, brain, ventral nerve cord (VNC), ovary, intestine and gill. In situ hybridization demonstrated the cellular localization of the transcripts in the central nervous system (CNS) and ovary tissues. In the eyestalk, the mRNA expression was observed in the neuronal clusters 1-5 but not in the sinus gland complex. In the brain and the VNC, the transcripts were detected in all neuronal clusters but not in the glial cells. In the ovary, the transcripts were found in all stages of oocytes (Oc1, Oc2, Oc3, and Oc4). In addition, synthetic PpelOT/VP-like peptide could inhibit steroid release from the ovary. The knowledge gained from this study will provide more understanding of neuro-endocrinological controls in this crab species. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models

  6. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account Written by one of the main figures in the field Paperback edition of successful work on philosophy of quantum mechanics

  7. Release the Prisoners Game

    Science.gov (United States)

    Van Hecke, Tanja

    2011-01-01

    This article presents the mathematical approach of the optimal strategy to win the "Release the prisoners" game and the integration of this analysis in a math class. Outline lesson plans at three different levels are given, where simulations are suggested as well as theoretical findings about the probability distribution function and its mean…

  8. Potential application of the consistency approach for vaccine potency testing.

    Science.gov (United States)

    Arciniega, J; Sirota, L A

    2012-01-01

    The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, or the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination challenge test in mice that gives a quantitative, although highly variable estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows if the vaccine meets the specification has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
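
The capability indices discussed above can be sketched numerically; a minimal illustration under the normality assumption the abstract questions (the potency values and specification limits below are invented for illustration):

```python
import statistics

def capability_indices(values, lsl, usl):
    """Classical Cp/Cpk indices; valid only when the output
    (here, potency values) is approximately normally distributed."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)             # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability
    return cp, cpk

# Hypothetical relative-potency results from a quantitative assay
potencies = [2.6, 3.1, 2.8, 3.4, 2.9, 3.0, 3.2, 2.7]
cp, cpk = capability_indices(potencies, lsl=2.0, usl=4.0)
```

For a simplified pass/fail test the outcomes are binomial rather than normal, so these formulas do not carry over directly; that is the statistical issue the abstract raises.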

  9. Technology of stable, prolonged-release eye-drops containing Cyclosporine A, distributed between lipid matrix and surface of the solid lipid microspheres (SLM).

    Science.gov (United States)

    Wolska, Eliza; Sznitowska, Małgorzata

    2013-01-30

    The aim of this study was to prepare solid lipid microspheres (SLM) with incorporated Cyclosporine A (Cs), suitable for ocular application. For this purpose, SLM were formulated using different lipids and three different nonionic surfactants. The SLM were produced using a hot emulsification method. The SLM dispersions contained 10, 20 or 30% of lipid (w/w) and up to 2% (w/w) of Cs. The size of the microspheres with Cs ranged from 1 to 15 μm. Physically stable SLM with Cs were prepared using Compritol as the lipid matrix and Tween 80 as the surfactant. In contrast, dispersions with Precirol alone formed semi-solid gels during storage, while in formulations with Precirol and Miglyol, crystals of Cs were observed. The in vitro release profile of the Compritol formulations showed that 40% of the Cs is released within 1 h, while release of the following 40% takes longer, depending on the lipid content of the formulation. A large part of the Cs added to the SLM formulations (45 to 80%) was found on the surface of the microparticles, but no drug crystallization occurred during long-term storage. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Distribution of MT1 melatonin receptor immunoreactivity in the human hypothalamus and pituitary gland: colocalization of MT1 with vasopressin, oxytocin, and corticotropin-releasing hormone.

    NARCIS (Netherlands)

    Wu, Y.-H.; Zhou, J.-N.; Balesar, R.; Unmehopa, U.; Bao, A.; Jockers, R.; Heerikhuize, J.; Swaab, D.F.

    2006-01-01

    Melatonin is implicated in numerous physiological processes, including circadian rhythms, stress, and reproduction, many of which are mediated by the hypothalamus and pituitary. The physiological actions of melatonin are mainly mediated by melatonin receptors. We here describe the distribution of

  11. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
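
The anti-blackhole ordering intuition behind such consistent updates can be illustrated in a few lines; this is a generic downstream-first sketch, not ez-Segway's actual decentralized, message-passing mechanism:

```python
def downstream_first_order(new_path):
    """Return the order in which to install new forwarding rules:
    starting from the switch nearest the destination, so that every
    switch updated earlier already has a valid downstream next hop
    and no packet is ever forwarded into a blackhole."""
    return list(reversed(new_path))

# Hypothetical new path s1 -> s2 -> s3 -> destination
order = downstream_first_order(["s1", "s2", "s3"])
```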

  12. Modifications of imaging spectroscopy methods for increased spatial and temporal consistency: A case study of change in leafy spurge distribution between 1999 and 2001 in Theodore Roosevelt National Park, North Dakota

    Science.gov (United States)

    Dudek, Kathleen Burke

    The noxious weed leafy spurge (Euphorbia esula L.) has spread throughout the northern Great Plains of North America since it was introduced in the early 1800s, and it is currently a significant management concern. Accurate, rapid location and repeatable measurements are critical for successful temporal monitoring of infestations. Imaging spectroscopy is well suited for identification of spurge; however, the development and dissemination of standardized hyperspectral mapping procedures that produce consistent multi-temporal maps has been absent. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data, collected in 1999 and 2001 over Theodore Roosevelt National Park, North Dakota, were used to locate leafy spurge. Published image-processing methods were tested to determine the most successful for consistent maps. Best results were obtained using: (1) NDVI masking; (2) cross-track illumination correction; (3) image-derived spectral libraries; and (4) mixture-tuned matched filtering algorithm. Application of the algorithm was modified to standardize processing and eliminate threshold decisions; the image-derived library was refined to eliminate additional variability. Primary (spurge dominant), secondary (spurge non-dominant), abundance, and area-wide vegetation maps were produced. Map accuracies were analyzed with point, polygon, and grid reference sets, using confusion matrices and regression between field-measured and image-derived abundances. Accuracies were recalculated after applying a majority filter, and buffers ranging from 1-5 pixels wide around classified pixels, to accommodate poor reference-image alignment. Overall accuracy varied from 39% to 82%, however, regression analyses yielded r2 = 0.725, indicating a strong relationship between field and image-derived densities. Accuracy was sensitive to: (1) registration offsets between field and image locations; (2) modification of analytical methods; and (3) reference data quality. Sensor viewing angle

  13. Unique distributions of hydrocarbons and sulphur compounds released by flash pyrolysis from the fossilised alga Gloeocapsomorpha prisca, a major constituent in one of four Ordovician kerogens

    NARCIS (Netherlands)

    Sinninghe Damsté, J.S.; Douglas, A.G.; Fowler, M.G.; Eglinton, T.I.

    1991-01-01

    Kerogens isolated from four rocks of Ordovician age from North America have been analysed by combined pyrolysis-gas chromatography-mass spectrometry to compare and contrast the type and distribution of sulphur-containing compounds and aromatic and aliphatic hydrocarbons present in the

  14. The Dark Energy Survey First Data Release

    Science.gov (United States)

    Carrasco Kind, Matias

    2018-01-01

    In this talk I will announce and highlight the main components of the first public data release (DR1) from the Dark Energy Survey (DES). In January 2016, the DES survey made available to the astronomical community, in a simple unofficial release, the first set of products. These data were taken and studied during the DES Science Verification period, covering roughly 250 sq. degrees and 25 million objects at a mean depth of i=23.7, and led to over 80 publications from DES scientists. DR1 is the first official release from the main survey and consists of the observations taken during the first three seasons, from August 2013 to February 2016 (about 100 nights each season), which cover the entire DES footprint. All of the Single Epoch Images and the Year 3 Coadded images, distributed in 10223 tiles, are available for download in this release. The catalogs provide astrometry, photometry and basic classification for nearly 400M objects in roughly 5000 sq. degrees of the southern hemisphere, with an approximate mean depth of i=23.3. Complementary footprint, masking and depth information is also available. All of the software used to generate these products is open source and has been made available through the GitHub DES Organization. Images, data and other sub-products have been made possible through the international and collaborative effort of all 25 institutions involved in DES and are available for exploration and download through the interfaces provided by a partnership between NCSA, NOAO and LIneA.

  15. Methane release

    International Nuclear Information System (INIS)

    Seifert, M.

    1999-01-01

    The Swiss Gas Industry has carried out a systematic, technical estimate of methane release from the complete supply chain from production to consumption for the years 1992/1993. The result of this survey provided a conservative value, amounting to 0.9% of the Swiss domestic output. A continuation of the study taking into account new findings with regard to emission factors and the effect of the climate is now available, which provides a value of 0.8% for the target year of 1996. These results show that the renovation of the network has brought about lower losses in the local gas supplies, particularly for the grey cast iron pipelines. (author)

  16. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...

  17. Testing the enemy release hypothesis: abundance and distribution patterns of helminth communities in grey mullets (Teleostei: Mugilidae) reveal the success of invasive species.

    Science.gov (United States)

    Sarabeev, Volodimir; Balbuena, Juan Antonio; Morand, Serge

    2017-09-01

    The abundance and aggregation patterns of helminth communities of two grey mullet hosts, Liza haematocheilus and Mugil cephalus, were studied across 14 localities in Atlantic and Pacific marine areas. The analysis matched parasite communities of (i) L. haematocheilus across its native and introduced populations (Sea of Japan and Sea of Azov, respectively) and (ii) the introduced population of L. haematocheilus with native populations of M. cephalus (Mediterranean, Azov-Black and Japan Seas). The total mean abundance (TMA), as a feature of the infection level in helminth communities, and slope b of the Taylor's power law, as a measure of parasite aggregation at the infra and component-community levels, were estimated and compared between host species and localities using ANOVA. The TMA of the whole helminth community in the introduced population of L. haematocheilus was over 15 times lower than that of the native population, but the difference was less pronounced for carried (monogeneans) than for acquired (adult and larval digeneans) parasite communities. Similar to the abundance pattern, the species distribution in communities from the invasive population of L. haematocheilus was less aggregated than from its native population for endoparasitic helminths, including adult and larval digeneans, while monogeneans showed a similar pattern of distribution in the compared populations of L. haematocheilus. The aggregation level of the whole helminth community, endoparasitic helminths, adult and larval digeneans was lower in the invasive host species in comparison with native ones as shown by differences in the slope b. An important theoretical implication from this study is that the pattern of parasite aggregation may explain the success of invasive species in ecosystems. 
Because the effects of parasites on host mortality are likely dose-dependent, the proportion of susceptible host individuals in invasive species is expected to be lower, as the helminth distribution in

  18. Environmental transformation and distribution of mercury released from gold mining and its implications on human health in Tanzania, studied by nuclear techniques

    International Nuclear Information System (INIS)

    Ikingura, Justinian R.

    2001-01-01

    The catchment areas of Lake Victoria in Tanzania are impacted by mercury contamination from small-scale gold mining activities. A preliminary survey of the mercury contamination has indicated in some cases mercury concentrations that are higher than background levels in soil and river sediment downstream of the mining areas. Average mercury concentration in contaminated soil is in the order of 3.4 mg/kg whereas in river sediment the concentration is about 4.9 mg/kg. Mercury concentrations in fish from a few areas of the Lake Victoria close to gold mining areas are in the range of 2-20 ppb. These fish mercury concentrations are surprisingly low considering the extent of mercury contamination in the Lake Victoria catchment. The dynamics of mercury cycling and their long-term impact on mercury levels in fish and other aquatic organisms in the Lake Victoria gold fields still need to be clarified. Research activities for the first year (2000) will concentrate on the determination of total mercury distribution patterns among soil, river water, sediment, and biota (fish, and other aquatic biota) in two areas (Mugusu-Nungwe Bay and Imweru-Bukombe Bay) of the Lake Victoria gold fields. The relationships between local tropical soil-sediment- and water-chemistry and the distribution of mercury in the contaminated areas will be investigated. Data from this work will be used in the identification and selection of suitable bio-monitors for mercury contamination and human health risk assessment in the study areas. In the second year, the project will focus mainly on methylmercury production and partition between sediment, water and biota in contaminated local tropical sediments. The main factors influencing the methylation and distribution of mercury species will be evaluated in laboratory experiments and extrapolated to environmental conditions. 
The results of the project will have important implications in mercury pollution monitoring, mitigation, and health risk assessment not

  19. Stochastic multicomponent reactive transport analysis of low quality drainage release from waste rock piles: Controls of the spatial distribution of acid generating and neutralizing minerals.

    Science.gov (United States)

    Pedretti, Daniele; Mayer, K Ulrich; Beckie, Roger D

    2017-06-01

    In mining environmental applications, it is important to assess water quality from waste rock piles (WRPs) and estimate the likelihood of acid rock drainage (ARD) over time. The mineralogical heterogeneity of WRPs is a source of uncertainty in this assessment, undermining the reliability of traditional bulk indicators used in the industry. We focused in this work on the bulk neutralizing potential ratio (NPR), which is defined as the ratio of the content of non-acid-generating minerals (typically reactive carbonates such as calcite) to the content of potentially acid-generating minerals (typically sulfides such as pyrite). We used a streamtube-based Monte-Carlo method to show why and to what extent bulk NPR can be a poor indicator of ARD occurrence. We simulated ensembles of WRPs identical in their geometry and bulk NPR, which differed only in their initial distribution of the acid-generating and acid-neutralizing minerals that control NPR. All models simulated the same principal acid-producing, acid-neutralizing and secondary mineral-forming processes. We show that small differences in the distribution of local NPR values or the number of flow paths that generate acidity strongly influence drainage pH. The results indicate that the likelihood of ARD (epitomized by the probability of occurrence of pH<4 in a mixing boundary) within the first 100 years can be as high as 75% for NPR=2 and 40% for NPR=4. The latter is traditionally considered a "universally safe" threshold to ensure non-acidic waters in practical applications. Our results suggest that new methods that explicitly account for mineralogical heterogeneity must be sought when computing effective (upscaled) NPR values at the scale of the piles. Copyright © 2017 Elsevier B.V. All rights reserved.
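
The streamtube-based Monte-Carlo idea can be sketched as follows; the lognormal local-NPR model, the 25% acid-path threshold and all parameter values are illustrative assumptions, not the paper's actual model:

```python
import math
import random

def prob_acidic(bulk_npr, sigma, n_tubes=100, n_piles=2000,
                acid_fraction=0.25, seed=1):
    """Fraction of simulated piles that generate acidity. Each pile has
    the same bulk NPR, but local NPR varies lognormally across its
    streamtubes with spread `sigma`; a pile counts as acid-generating
    when more than `acid_fraction` of its tubes have local NPR < 1."""
    rng = random.Random(seed)
    mu = math.log(bulk_npr) - sigma**2 / 2   # keep the mean at bulk_npr
    acidic = 0
    for _ in range(n_piles):
        low = sum(rng.lognormvariate(mu, sigma) < 1.0
                  for _ in range(n_tubes))
        if low / n_tubes > acid_fraction:
            acidic += 1
    return acidic / n_piles

# The same bulk NPR=2 looks safe when mineralogy is near-homogeneous
# but risky when local NPR values are widely spread
p_homogeneous = prob_acidic(2.0, sigma=0.3)
p_heterogeneous = prob_acidic(2.0, sigma=1.5)
```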

  20. Histological organization of the central nervous system and distribution of a gonadotropin-releasing hormone-like peptide in the blue crab, Portunus pelagicus.

    Science.gov (United States)

    Saetan, Jirawat; Senarai, Thanyaporn; Tamtin, Montakan; Weerachatyanukul, Wattana; Chavadej, Jittipan; Hanna, Peter J; Parhar, Ishwar; Sobhon, Prasert; Sretarugsa, Prapee

    2013-09-01

    We present a detailed histological description of the central nervous system (CNS: brain, subesophageal ganglion, thoracic ganglia, abdominal ganglia) of the blue crab, Portunus pelagicus. Because the presence of gonadotropin-releasing hormone (GnRH) in crustaceans has been disputed, we examine the presence and localization of a GnRH-like peptide in the CNS of the blue crab by using antibodies against lamprey GnRH (lGnRH)-III, octopus GnRH (octGnRH) and tunicate GnRH (tGnRH)-I. These antibodies showed no cross-reactivity with red-pigment-concentrating hormone, adipokinetic hormone, or corazonin. In the brain, strong lGnRH-III immunoreactivity (-ir) was detected in small (7-17 μm diameter) neurons of clusters 8, 9 and 10, in medium-sized (21-36 μm diameter) neurons of clusters 6, 7 and 11 and in the anterior and posterior median protocerebral neuropils, olfactory neuropil, median and lateral antenna I neuropils, tegumentary neuropil and antenna II neuropil. In the subesophageal ganglion, lGnRH-III-ir was detected in medium-sized neurons and in the subesophageal neuropil. In the thoracic and abdominal ganglia, lGnRH-III-ir was detected in medium-sized and small neurons and in the neuropils. OctGnRH-ir was observed in neurons of the same clusters with moderate staining, particularly in the deutocerebrum, whereas tGnRH-I-ir was only detected in medium-sized neurons of cluster 11 in the brain. Thus, anti-lGnRH-III shows greater immunoreactivity in the crab CNS than anti-octGnRH and anti-tGnRH-I. Moreover, our functional bioassay demonstrates that only lGnRH-III has significant stimulatory effects on ovarian growth and maturation. We therefore conclude that, although the true identity of the crab GnRH eludes us, crabs possess a putative GnRH hormone similar to lGnRH-III. The identification and characterization of this molecule is part of our ongoing research.

  1. Sustained release of radioprotective agents

    International Nuclear Information System (INIS)

    Shani, J.

    1980-11-01

    New pharmaceutical formulations for the sustained release of radioprotective agents into the G.I. tract have been developed by the authors. The experimental method initially consisted of producing methylcellulose microcapsules. This method failed, apparently because of the premature 'explosion' of the microcapsules and the consequent premature release of massive amounts of the drug. A new method has been developed which consists of drying and pulverising cysteamine and cysteine preparations and mixing them in various proportions with stearic acid and ethylcellulose as carriers. The mixture is then compressed into cylindrical tablets at several pressure values and the leaching rate of the radioprotective agents is measured by spectrophotometry. The relation between the concentration of the active drug and its rate of release, and the effect on the release rate of the pressure applied to the tablet during its formation, were also investigated. Results indicating that the release rate was linearly related to the square root of t seem to be in agreement with what is predicted by Higuchi's equation, save for the very initial and terminal phases. A clear correlation was also established between the stearic acid/ethylcellulose ratios and the release of 20% cysteine, namely a marked decrease in the rate of cysteine release was observed with increasing concentrations of stearic acid. Finally, it was observed that a higher formation pressure results in quicker release of the drug.
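
The square-root-of-time behaviour reported above is the Higuchi relation Q(t) = k·√t; a minimal sketch of estimating k from leaching data by a least-squares fit through the origin (the data points below are invented for illustration):

```python
import math

# Hypothetical leaching measurements: time (h) and cumulative
# fraction of drug released
times = [1, 4, 9, 16]
released = [0.10, 0.21, 0.29, 0.41]

# Higuchi model Q = k*sqrt(t); with x = sqrt(t), the least-squares
# slope through the origin is k = sum(Q*x)/sum(x^2) = sum(Q*sqrt(t))/sum(t)
k = (sum(q * math.sqrt(t) for t, q in zip(times, released))
     / sum(times))

def higuchi_q(t):
    """Predicted cumulative fraction released at time t (hours)."""
    return k * math.sqrt(t)
```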

  2. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving for physics concepts through consistent argumentation can improve students' thinking skills and is important in science. The study aims to assess the consistency of students' argumentation on fluid material. The population of this study comprises college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Cluster random sampling yielded a sample of 145 students. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results showed average argumentation consistency of 4.85% correct and consistent, 29.93% consistent but wrong, and 65.23% inconsistent. These data point to a lack of understanding of the fluid material, since fully consistent argumentation would ideally support an expanded understanding of the concept. The results serve as a reference for improvements in future studies aimed at obtaining a positive change in the consistency of argumentation.

  3. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  4. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  5. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

    After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the trivial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de

  6. Environmental transformation and distribution of mercury released from gold mining and its implications on human health in Tanzania, studied by nuclear techniques

    International Nuclear Information System (INIS)

    Ikingura, Justinian R.

    2002-01-01

    The dispersion and transformation of mercury in the southwest Lake Victoria gold fields was investigated through field and laboratory studies in order to evaluate the environmental impact and human health risks due to mercury pollution from small-scale gold mining in Tanzania. River sediment, gold-ore tailings, fish, and lichens were analyzed for their mercury content to determine mercury contamination levels. Mercury concentrations in the tailings from the Rwamagaza mine were in the range of 165 to 232 mg/kg, while at the Mugusu mine the maximum concentration was 6 mg/kg in the river sediment contaminated by the tailings. The dispersion of mercury along the Mabubi River downstream of the gold-ore processing site at the Mugusu mine decreased rapidly to less than 0.5 mg/kg at a distance of 4 km, and less than 0.1 mg/kg at 9 km. Granulometric analysis of the mercury distribution indicated the highest mercury concentrations to be associated with the grain-size fraction <212 μm in the sediment. Total mercury concentrations in eight fish species from Lake Victoria at Nungwe Bay were generally very low and varied from 2 to 34 μg/kg (w.w.). The lowest concentrations were found in Tilapia and the highest in Nile perch. The percentage of methylmercury in the fish muscle ranged from 65 to 97%. These results suggest that mercury contamination from gold mining operations in the southwest Lake Victoria goldfields has not led to any significant increase in environmental methylmercury levels that could be reflected in high mercury concentrations in the fish. Based on these results, fish consumption from the Nungwe Bay area of Lake Victoria does not pose any human health risks on account of the very low mercury levels in the fish at present. Mercury concentrations in two lichen species, Parmelia and Usnea, in the Geita Forest Reserve around the Mugusu mine ranged from 0.10 to 3.10 μg/g (d.w.). 
The mercury concentration in the lichens decreased away from the mine village, indicating the

  7. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Cloud computing represents one of the latest emerging trends in distributed computing, enabling hardware infrastructure and software applications to exist as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study shows how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  8. The ALICE Software Release Validation cluster

    International Nuclear Information System (INIS)

    Berzano, D; Krzewicki, M

    2015-01-01

    One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service: in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are stored externally on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any snapshot of the operating system in time: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future. (paper)

  9. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  10. Derived release limits for airborne effluents at TRIGA - INR Reactor

    International Nuclear Information System (INIS)

    Toma, A.; Dulama, C.; Hirica, O.; Mihai, S.; Oprea, I.

    2008-01-01

    Starting from the dose limitation system recommended by the ICRP and now accepted in radiation protection, this paper presents an environmental transfer model to calculate derived release limits for airborne and gaseous radioactive effluents at TRIGA-INR, a 14 MW steady-state reactor in operation on the INR-Pitesti site. The methodology consists in determining the principal exposure pathways for different population groups and calculating doses for each radionuclide. The transfer of radionuclides to the environment was characterized using a compartmental model. The transfer-parameter concept was used to describe the distribution of radionuclides between the different compartments. Atmospheric dispersion was treated very carefully, because it is the primary mechanism of radionuclide transfer in the environment and it determines all exposure pathways. The atmospheric dispersion was calculated using the ORION-II computer code, based on the Gaussian plume model, which takes account of the site's specific climate and relief conditions. Default values recommended in the literature were used for some parameters when site-specific values were not available. After identifying all transfer parameters characterizing the most important exposure pathways, the release rate corresponding to the individual dose rate limit was calculated. This maximum release rate is the derived release limit for each radionuclide and source. In the paper, derived release limits are calculated for noble gases, radioiodine and other airborne particulate radionuclides that can be released through the TRIGA-INR reactor stack and are important to radiation protection. (authors)
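The dispersion step described in this record rests on the standard Gaussian plume formula. As a minimal illustrative sketch (not the ORION-II code itself; the function name and the choice to pass the dispersion coefficients directly rather than derive them from stability class and downwind distance are assumptions), the ground-reflected plume concentration can be computed as:

```python
import math

def gaussian_plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration with ground reflection.

    Q: release rate (g/s), u: wind speed (m/s), y: crosswind offset (m),
    z: receptor height (m), H: effective release height (m),
    sigma_y, sigma_z: dispersion coefficients (m) at the downwind distance.
    Returns concentration in g/m^3.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Reflection at the ground is modelled by an image source at -H.
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

On the plume centerline at ground level with a ground-level release, the two reflection terms coincide and the expression reduces to Q / (pi * u * sigma_y * sigma_z).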

  11. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  12. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of

  13. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  14. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

    Full Text Available In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ (“E contradictione quodlibet”) rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  15. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x......-anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  16. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an
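The backward-iteration construction mentioned in this record can be illustrated with a minimal sketch on a recombining binomial tree, using the standard-deviation premium principle as the one-period valuation (the function names and the binomial setting are illustrative assumptions; the paper itself studies continuous-time limits of such iterations):

```python
import math

def std_dev_principle(up, down, p, alpha):
    """One-period standard-deviation premium: mean + alpha * std deviation."""
    mean = p * up + (1 - p) * down
    var = p * (up - mean) ** 2 + (1 - p) * (down - mean) ** 2
    return mean + alpha * math.sqrt(var)

def time_consistent_value(payoffs, p=0.5, alpha=0.1):
    """Backward-iterate the one-period valuation over a recombining binomial
    tree whose terminal payoffs are given from the lowest node upward."""
    level = list(payoffs)
    while len(level) > 1:
        # Each interior node values the two-outcome distribution of its children.
        level = [std_dev_principle(level[i + 1], level[i], p, alpha)
                 for i in range(len(level) - 1)]
    return level[0]
```

With alpha = 0 the iteration collapses to the expected value; a positive alpha loads a risk margin at every period, which is what makes the resulting valuation time-consistent by construction.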

  17. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

    The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and it was found that the resulting tariff rises at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized, and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs

  18. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations

  19. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    Inconsistency among firewall/VPN (Virtual Private Network) rules imposes a huge maintenance cost. With the growth of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for intelligent management of rule tables. In this paper, a formalization of host and network rules for automatic rule validation, based on set theory, is proposed and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
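The set-theoretic consistency check this record describes can be sketched in miniature: model each rule's match set as intervals, and flag two rules as inconsistent when their match sets intersect but their actions differ (the rule representation as source/destination ranges plus an action is a simplification assumed for illustration, not the paper's formalism):

```python
def ranges_overlap(a, b):
    """True if closed intervals a = (lo, hi) and b = (lo, hi) intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def find_conflicts(rules):
    """Return index pairs (i, j) of rules whose match sets intersect but
    whose actions differ. Each rule is a dict with 'src' and 'dst' ranges
    and an 'action' ('allow' or 'deny')."""
    conflicts = []
    for i in range(len(rules)):
        for j in range(i + 1, len(rules)):
            r, s = rules[i], rules[j]
            if (ranges_overlap(r['src'], s['src'])
                    and ranges_overlap(r['dst'], s['dst'])
                    and r['action'] != s['action']):
                conflicts.append((i, j))
    return conflicts
```

A real validator would also rank rules by priority and distinguish shadowing from contradiction, but the core test — non-empty intersection of match sets with divergent actions — is the same.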

  20. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

    The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E /times/ B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  1. Lagrangian multiforms and multidimensional consistency

    Energy Technology Data Exchange (ETDEWEB)

    Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2009-10-30

    We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.

  2. Consistency and Communication in Committees

    OpenAIRE

    Inga Deimen; Felix Ketelaar; Mark T. Le Quement

    2013-01-01

    This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...

  3. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
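The feature-consistency loss this record describes can be sketched independently of any deep-learning framework: given activations of the input and of the VAE reconstruction at several layers of a fixed pretrained network (e.g. a VGG-style net), the loss is a weighted sum of per-layer mean squared errors. Representing features as flat lists of floats is an assumption made here purely for illustration:

```python
def feature_consistency_loss(feats_x, feats_xhat, weights=None):
    """Deep feature consistency loss: weighted sum of per-layer MSEs between
    feature activations of the input (feats_x) and of the reconstruction
    (feats_xhat). Each element of feats_* is one layer's activations,
    assumed already extracted by a fixed pretrained network."""
    weights = weights or [1.0] * len(feats_x)
    loss = 0.0
    for w, fx, fy in zip(weights, feats_x, feats_xhat):
        loss += w * sum((a - b) ** 2 for a, b in zip(fx, fy)) / len(fx)
    return loss
```

Replacing the usual pixel-wise reconstruction term with this loss is what pushes the VAE output to match the input's perceptual structure rather than its exact pixel values.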

  4. Self-consistent modelling of ICRH

    International Nuclear Information System (INIS)

    Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.

    2001-01-01

    The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)

  5. Consistent biokinetic models for the actinide elements

    International Nuclear Information System (INIS)

    Leggett, R.W.

    2001-01-01

    The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depicts one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)

  6. TOXRISK, Toxic Gas Release Accident Analysis

    International Nuclear Information System (INIS)

    Bennett, D.E.; Chanin, D.I.; Shiver, A.W.

    1993-01-01

    1 - Description of program or function: TOXRISK is an interactive program developed to aid in the evaluation of nuclear power plant control room habitability in the event of a nearby toxic material release. The program uses a model which is consistent with the approach described in the NRC Regulatory Guide 1.78. Release of the gas is treated as an initial puff followed by a continuous plume. The relative proportions of these as well as the plume release rate are supplied by the user. Transport of the gas is modeled as a Gaussian distribution and occurs through the action of a constant velocity, constant direction wind. Great flexibility is afforded the user in specifying the release description, meteorological conditions, relative geometry of the accident and plant, and the plant ventilation system characteristics. Two types of simulation can be performed: multiple case (parametric) studies and probabilistic analyses. Upon execution, TOXRISK presents a menu, and the user chooses between the Data Base Manager, the Multiple Case program, and the Probabilistic Study Program. The Data Base Manager provides a convenient means of storing, retrieving, and modifying blocks of data required by the analysis programs. The Multiple Case program calculates resultant gas concentrations inside the control room and presents a summary of information that describes the event for each set of conditions given. Optionally, a time history profile of inside and outside concentrations can also be produced. The Probabilistic Study program provides a means for estimating the annual probability of operator incapacitation due to toxic gas accidents on surrounding transportation routes and storage sites. 2 - Method of solution: Dispersion or diffusion of the gas during transport is described by modified Pasquill-Gifford dispersion coefficients
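The control-room side of such a calculation — an outdoor concentration history driving the indoor one through the ventilation system — can be sketched as a first-order mixing model integrated with forward Euler. This is an illustrative approximation under an assumed well-mixed single-zone room, not TOXRISK's actual numerical scheme:

```python
def indoor_concentration(c_out, air_changes_per_hour, dt_hours, c0=0.0):
    """Integrate dC_in/dt = lambda * (C_out(t) - C_in) with forward Euler.

    c_out: sequence of outdoor concentrations sampled every dt_hours,
    air_changes_per_hour: ventilation rate lambda (1/h),
    c0: initial indoor concentration. Returns the indoor history."""
    lam = air_changes_per_hour
    c_in, history = c0, []
    for c in c_out:
        c_in += lam * (c - c_in) * dt_hours  # exchange with outdoor air
        history.append(c_in)
    return history
```

With a constant outdoor concentration the indoor value relaxes exponentially toward it, which is why low air-exchange rates (or filtered intake) buy the operators time during a passing toxic plume.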

  7. News/Press Releases

    Data.gov (United States)

    Office of Personnel Management — A press release, news release, media release, press statement is written communication directed at members of the news media for the purpose of announcing programs...

  8. Umatilla Satellite and Release Sites Project : Final Siting Report.

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, James M.

    1992-04-01

    This report presents the results of site analysis for the Umatilla Satellite and Release Sites Project. The purpose of this project is to provide engineering services for the siting and conceptual design of satellite and release facilities for the Umatilla Basin hatchery program. The Umatilla Basin hatchery program consists of artificial production facilities for salmon and steelhead to enhance production in the Umatilla River as defined in the Umatilla master plan approved in 1989 by the Northwest Power Planning Council. Facilities identified in the master plan include adult salmon broodstock holding and spawning facilities, facilities for recovery, acclimation, and/or extended rearing of salmon juveniles, and development of river sites for release of hatchery salmon and steelhead. The historic and current distribution of fall chinook, summer chinook, and coho salmon and steelhead trout was summarized for the Umatilla River basin. Current and future production and release objectives were reviewed. Twenty-seven sites were evaluated for potential development of facilities. Engineering and environmental attributes of the sites were evaluated and compared to facility requirements for water and space. Site screening was conducted to identify the sites with the most potential for facility development. Alternative sites were selected for conceptual design of each facility type. A proposed program for adult holding facilities, final rearing/acclimation, and direct release facilities was developed.

  9. Riola release report

    Energy Technology Data Exchange (ETDEWEB)

    Woodward, E.C.

    1983-08-04

    Eleven hours after execution of the Riola Event (at 0826 PDT on 25 September 1980) in hole U2eq of the Nevada Test Site (NTS), a release of radioactivity began. When the seepage stopped at about noon the following day, up to some 3200 Ci of activity had been dispersed by light variable winds. On 26 September, examination of the geophone records showed six hours of low-level, but fairly continuous, activity before the release. Electrical measurements indicated that most cables were still intact to a depth below the stemming platform. A survey of the ground zero area showed that the seepage came through cracks between the surface conductor and the pad, through cracks in the pad, and through a crack adjacent to the pad around the mousehole (a small hole adjacent to the emplacement hole). To preclude undue radiation exposure or injury from a surprise subsidence, safety measures were instituted. Tritium seepage was sufficient to postpone site activities until a box and pipeline were emplaced to contain and remove the gas. Radiation release modeling and calculations were generally consistent with observations. Plug-hole interaction calculations showed that the alluvium near the bottom of the plug may have been overstressed and that improvements in the design of the plug-medium interface can be made. Experimental studies verified that the surface appearance of the plug core was caused by erosion, but, assuming a normal strength for the plug material, that erosion alone could not account for the disappearance of such a large portion of the stemming platform. Samples from downhole plug experiments show that the plug may have been considerably weaker than had been indicated by quality assurance (QA) samples. 19 references, 32 figures, 10 tables.

  11. Sociality and the telencephalic distribution of corticotrophin-releasing factor, urocortin 3, and binding sites for CRF type 1 and type 2 receptors: A comparative study of eusocial naked mole-rats and solitary Cape mole-rats.

    Science.gov (United States)

    Coen, Clive W; Kalamatianos, Theodosis; Oosthuizen, Maria K; Poorun, Ravi; Faulkes, Christopher G; Bennett, Nigel C

    2015-11-01

    Various aspects of social behavior are influenced by the highly conserved corticotrophin-releasing factor (CRF) family of peptides and receptors in the mammalian telencephalon. This study has mapped and compared the telencephalic distribution of the CRF receptors, CRF1 and CRF2, and two of their ligands, CRF and urocortin 3, respectively, in African mole-rat species with diametrically opposed social behavior. Naked mole-rats live in large eusocial colonies that are characterized by exceptional levels of social cohesion, tolerance, and cooperation in burrowing, foraging, defense, and alloparental care for the offspring of the single reproductive female. Cape mole-rats are solitary; they tolerate conspecifics only fleetingly during the breeding season. The telencephalic sites at which the level of CRF1 binding in naked mole-rats exceeds that in Cape mole-rats include the basolateral amygdaloid nucleus, hippocampal CA3 subfield, and dentate gyrus; in contrast, the level is greater in Cape mole-rats in the shell of the nucleus accumbens and medial habenular nucleus. For CRF2 binding, the sites with a greater level in naked mole-rats include the basolateral amygdaloid nucleus and dentate gyrus, but the septohippocampal nucleus, lateral septal nuclei, amygdalostriatal transition area, bed nucleus of the stria terminalis, and medial habenular nucleus display a greater level in Cape mole-rats. The results are discussed with reference to neuroanatomical and behavioral studies of various species, including monogamous and promiscuous voles. By analogy with findings in those species, we speculate that the abundance of CRF1 binding in the nucleus accumbens of Cape mole-rats reflects their lack of affiliative behavior. © 2015 Wiley Periodicals, Inc.

  12. Designing apps for success developing consistent app design practices

    CERN Document Server

    David, Matthew

    2014-01-01

    In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: Apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development, in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently

  13. Consistency based correlations for tailings consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering

    2010-07-01

    The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. There are many unique challenges pertaining to the management of the containment facilities for several decades beyond mine closure that are a result of the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases causing public concern over the conventional practice of tailings disposal. Therefore, in order to reduce and minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency that captured physicochemical interactions. The paper discussed the large strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility whereas liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.

  14. Self-consistent nuclear energy systems

    International Nuclear Information System (INIS)

    Shimizu, A.; Fujiie, Y.

    1995-01-01

    A concept of self-consistent energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with the breeding ratio greater than one and complete burning of transuranium through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both ''neutron balance'' and ''energy balance'' of the system are introduced as the necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning the system safety, two safety concepts: ''self-controllability'' and ''self-terminability'' are introduced to eliminate the criticality-related safety issues in fast reactors. (author)

  15. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
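The record's hotspot-consistency evaluation can be sketched in two steps: compute a diversity metric (here Shannon's index) per grid cell and year, then report the fraction of years in which each cell meets the hotspot threshold. The function names and the data layout (a dict of per-cell yearly values) are assumptions for illustration, not the study's actual pipeline:

```python
import math

def shannon_index(counts):
    """Shannon's diversity index H' = -sum(p_i * ln p_i) over species counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def hotspot_consistency(diversity_by_year, threshold):
    """diversity_by_year: {cell_id: [diversity value per year]}. A cell is a
    hotspot in a year when its value meets the threshold; return the fraction
    of years each cell qualifies (1.0 = consistently a hotspot)."""
    return {cell: sum(v >= threshold for v in vals) / len(vals)
            for cell, vals in diversity_by_year.items()}
```

A result in which no cell approaches a fraction of 1.0 is exactly the pattern the study reports: no grid cell stayed above the hotspot threshold for even half of the eight-year time series.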

  16. Sterile insect supply, emergence, and release

    International Nuclear Information System (INIS)

    Dowell, R.V.; Worley, J.; Gomes, P.J.

    2005-01-01

    Insect mass-rearing for a sterile insect technique (SIT) programme is designed to move beyond the large-scale rearing of insects in a laboratory to the industrial production of consistently high-quality insects for sterilization and release. Each facility reflects the unique biology of the insect reared within it, but there are some generalities for all rearing facilities. Rearing insects in self-contained modules offers flexibility, and increased safety from catastrophic occurrences, compared with using a single building which houses all facets of the rearing process. Although mechanizing certain aspects of the rearing steps helps provide a consistently high-quality insect, successful mass-rearing and delivery depends largely upon the human component. Besides production in centralized facilities, insects can be produced from purchased eggs, or nowadays, adult insects are often obtained from specialized satellite emergence/collection facilities. Interest in commercializing insect production and release is increasing. Shipping sterile insects, sometimes over long distances, is now common practice. Procedures for handling and chilling adult insects, and providing food and water prior to release, are continually being improved. Sterile insects are released via static-release receptacles, ground-release systems, or most commonly from the air. The aerial release of chilled sterile insects is the most efficient method of release, especially when aircraft flight paths are guided by a Global Positioning System (GPS) linked to a computer-controlled release mechanism. (author)

  17. Investigation of activity release during light water reactor core meltdown

    International Nuclear Information System (INIS)

    Albrecht, H.; Matschoss, V.; Wild, H.

    1978-01-01

    A test facility was developed for the determination of activity release and of aerosol characteristics under realistic light water reactor core melting conditions. It is composed of a high-frequency induction furnace, a ThO₂ crucible system, and a collection apparatus consisting of membrane and particulate filters. Thirty-gram samples of a representative core material mixture (corium) were melted under air, argon, or steam at 0.8 to 2.2 bar. In air at 2700 °C, for example, the relative release was 0.4 to 0.7% for iron, chromium, and cobalt and 4 to 11% for tin, antimony, and manganese. Higher release values of 20 to 40% at lower temperatures (2150 °C, air) were found for selenium, cadmium, tellurium, and cesium. The size distribution of the aerosol particles was trimodal with maxima at diameters of 0.17, 0.30, and 0.73 μm. The result of a qualitative x-ray microanalysis was that the main elements of the melt were contained in each aerosol particle. Further investigations will include larger melt masses and the additional influence of concrete on the release and aerosol behavior

  18. Toxics Release Inventory (TRI)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Toxics Release Inventory (TRI) is a dataset compiled by the U.S. Environmental Protection Agency (EPA). It contains information on the release and waste...

  19. Probabilistic consequence assessment of hydrogen sulphide releases from a heavy water plant

    International Nuclear Information System (INIS)

    1983-06-01

    This report is the second in a series concerned with the evaluation of the consequences to the public of an accidental release of hydrogen sulphide (H₂S) to the atmosphere following a pipe or pressure envelope failure, or some other process upset, at a heavy water plant. It consists of documentation of the code GASPROB, which has been developed to provide consequence probabilities for a range of postulated releases. The code includes mathematical simulations of initial gas behaviour upon release to the atmosphere, such as gravitational settling of a cold release and the rise of jets and flares, subsequent atmospheric dispersion under a range of weather conditions, and the toxic effects on the exposed population. The code makes use of the site-specific dispersion climatology, topography and population distribution, as well as the probabilistic lethal dose data for the released gas. Output for a given postulated release can be provided in terms of the concentration of the gas at ground level around the point of release, projected numbers of fatalities within specified areas and the projected total fatalities regardless of location. This report includes a general description of GASPROB, and specifics of the code structure, the function of each subroutine, input and output data, and the permanent data files established. Three appendices to the report contain a complete code listing, detailed subroutine descriptions and a sample output
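
    GASPROB's dispersion step is based on the Gaussian plume equations. A minimal sketch of the standard ground-level concentration formula for a continuous point source (with full ground reflection) is shown below; the function name and parameter list are illustrative assumptions, not GASPROB's actual interface.

```python
import math

def gaussian_plume_ground(Q, u, y, H, sigma_y, sigma_z):
    """Ground-level (z = 0) concentration from a continuous point source,
    assuming total reflection at the ground.

    Q: emission rate (g/s), u: wind speed (m/s),
    y: crosswind distance from plume centerline (m),
    H: effective release height (m),
    sigma_y, sigma_z: dispersion coefficients (m) evaluated at the
    downwind distance of interest.  Returns concentration in g/m^3.
    """
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-H**2 / (2.0 * sigma_z**2)))
```

Combining such point evaluations with a joint frequency distribution of wind direction and stability class (each class supplying its own sigma values) yields the probabilistic concentration fields the report describes.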

  20. Modelling of drug release from ensembles of aspirin microcapsules ...

    African Journals Online (AJOL)

    Purpose: In order to determine the drug release profile of an ensemble of aspirin crystals or microcapsules from its particle distribution a mathematical model that considered the individual release characteristics of the component single particles was developed. The model assumed that under sink conditions the release ...

  1. Parameters to be Considered in the Simulation of Drug Release ...

    African Journals Online (AJOL)

    Purpose: Drug microparticles may be microencapsulated with water-insoluble polymers to obtain controlled release, which may be further determined by the particle distribution. The purpose of this study was to determine the drug release parameters needed for the theoretical prediction of the release profiles of single ...

  2. CERN Press Release: CERN experiments observe particle consistent with long-sought Higgs boson

    CERN Multimedia

    2012-01-01

    Geneva, 4 July 2012. At a seminar held at CERN today as a curtain raiser to the year’s major particle physics conference, ICHEP2012 in Melbourne, the ATLAS and CMS experiments presented their latest preliminary results in the search for the long sought Higgs particle. Both experiments observe a new particle in the mass region around 125-126 GeV.   CERN physicists await the start of the Higgs seminar. “We observe in our data clear signs of a new particle, at the level of 5 sigma, in the mass region around 126 GeV. The outstanding performance of the LHC and ATLAS and the huge efforts of many people have brought us to this exciting stage,” said ATLAS experiment spokesperson Fabiola Gianotti, “but a little more time is needed to prepare these results for publication.” "The results are preliminary but the 5 sigma signal at around 125 GeV we’re seeing is dramatic. This is indeed a new particle. We know it must be a boson and it&...

  3. Consistent effects of a major QTL for thermal resistance in field-released Drosophila melanogaster

    DEFF Research Database (Denmark)

    Loeschcke, Volker; Kristensen, Torsten Nygård; Norry, Fabian M

    2011-01-01

    Molecular genetic markers can be used to identify quantitative trait loci (QTL) for thermal resistance and this has allowed characterization of a major QTL for knockdown resistance to high temperature in Drosophila melanogaster. The QTL showed trade-off associations with cold resistance under lab...... of field fitness at different environmental temperatures with genotypic variation in a QTL for thermal tolerance. Graphical abstract...

  4. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known...... to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct...... from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed....

  5. [Consistent Declarative Memory with Depressive Symptomatology].

    Science.gov (United States)

    Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez

    2012-12-01

    Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The objective was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students, male and female, between 18 and 40 years old, were evaluated and distributed in two groups, with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) with a cut-off point of 20. There were no meaningful differences in free and voluntary recall between participants with and without depressive symptomatology, despite the fact that both groups granted a higher emotional value to the audio-visual test and associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  6. Self consistent field theory of virus assembly

    Science.gov (United States)

    Li, Siyu; Orland, Henri; Zandi, Roya

    2018-04-01

    The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions when GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work. First, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present our results corresponding to the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in case of GSDA but is equal to the inverse of the energy gap when using SCFT.

  7. The inventory of sources, environmental releases and risk assessment for perfluorooctane sulfonate in China

    International Nuclear Information System (INIS)

    Zhang Lai; Liu Jianguo; Hu Jianxin; Liu Chao; Guo Weiguang; Wang Qiang; Wang Hong

    2012-01-01

    With a production volume of about 100 t/y, perfluorooctane sulfonates (PFOS) are mainly used in China for metal plating, aqueous fire-fighting foams (AFFFs) and sulfluramid, with use amounts of about 30–40 t/y, 25–35 t/y and 4–8 t/y respectively. Based on an inventory of PFOS production and uses with geographic distribution derived from statistics, an environmental risk assessment of PFOS was carried out using the EUSES model, and its environmental releases were estimated at both local and regional levels in China. While the environmental release from manufacture is significant in the Central China region, metal plating was identified as the major PFOS release source at the regional level. The East China region shows the strongest PFOS emissions. Although the predicted environmental concentrations (PECs) did not exceed the current relevant predicted no-effect concentrations (PNECs) in the risk characterization for PFOS, higher PECs were estimated around major PFOS release sources, indicating undesirable environmental risk at the local level. - Highlights: ► Inventory of production and uses of perfluorooctane sulfonate (PFOS) in China with geographical distribution. ► Characteristics of PFOS release sources and distribution consistent with the socio-economic situation in China. ► Model-predicted results of PFOS environmental risk assessment at local and regional scales compared with relevant environmental monitoring data. - Inventory of PFOS production and use with sectoral and regional distribution in China; environmental releases and risk status were indicated at both the local and regional levels of the country.

  8. Modelling vesicular release at hippocampal synapses.

    Directory of Open Access Journals (Sweden)

    Suhita Nadkarni

    2010-11-01

    Full Text Available We study local calcium dynamics leading to a vesicle fusion in a stochastic, and spatially explicit, biophysical model of the CA3-CA1 presynaptic bouton. The kinetic model for vesicle release has two calcium sensors, a sensor for fast synchronous release that lasts a few tens of milliseconds and a separate sensor for slow asynchronous release that lasts a few hundred milliseconds. A wide range of data can be accounted for consistently only when a refractory period lasting a few milliseconds between releases is included. The inclusion of a second sensor for asynchronous release with a slow unbinding site, and thereby a long memory, affects short-term plasticity by facilitating release. Our simulations also reveal a third time scale of vesicle release that is correlated with the stimulus and is distinct from the fast and the slow releases. In these detailed Monte Carlo simulations all three time scales of vesicle release are insensitive to the spatial details of the synaptic ultrastructure. Furthermore, our simulations allow us to identify features of synaptic transmission that are universal and those that are modulated by structure.

  9. Hollywood blockbusters and long-tailed distributions

    Science.gov (United States)

    Sinha, S.; Raghavendra, S.

    2004-11-01

    Numerical data for all movies released in theaters in the USA during the period 1997-2003 are examined for the distribution of their popularity in terms of (i) the number of weeks they spent in the Top 60 according to the weekend earnings, and (ii) the box-office gross during the opening week, as well as, the total duration for which they were shown in theaters. These distributions show long tails where the most popular movies are located. Like the study of Redner [S. Redner, Eur. Phys. J. B 4, 131 (1998)] on the distribution of citations to individual papers, our results appear to be consistent with a power-law dependence of the rank distribution of gross revenues for the most popular movies with a exponent close to -1/2.
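
    The power-law rank dependence cited above (gross revenue falling off as roughly rank to the power −1/2 for the most popular movies) can be checked with a simple log-log least-squares fit. The sketch below uses synthetic data constructed to follow that law exactly; the function name is an assumption for illustration.

```python
import math

def rank_power_law_exponent(values):
    """Estimate k in value ~ rank**k by least squares in log-log space.

    Values are sorted descending so the largest value has rank 1.
    """
    ranked = sorted(values, reverse=True)
    xs = [math.log(r) for r in range(1, len(ranked) + 1)]
    ys = [math.log(v) for v in ranked]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# synthetic grosses following rank**-0.5 exactly
revenues = [100.0 * r ** -0.5 for r in range(1, 201)]
print(round(rank_power_law_exponent(revenues), 3))  # prints -0.5
```

With real box-office data the fit would be restricted to the head of the distribution (the most popular movies), since the tail of minor releases need not follow the same law.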

  10. Research on Human Dynamics of Information Release of WeChat Users

    OpenAIRE

    Zhang, Juliang; Zhang, Shengtai; Duo, Fan; Wang, Feifei

    2017-01-01

    The information release behavior of WeChat users is influenced by many factors, and studying the rules of the behavior of users in WeChat can provide theoretical help for the dynamic research of mobile social network users. By crawling WeChat moments information of nine users within 5 years, we used the human behavioral dynamics system to analyze users' behavior. The results show that the information distribution behavior of WeChat users is consistent with the power-law distribution for a cer...

  11. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew

    2017-01-01

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  13. A novel fluoride anion modified gelatin nanogel system for ultrasound-triggered drug release.

    Science.gov (United States)

    Wu, Daocheng; Wan, Mingxi

    2008-01-01

    Controlled drug release, especially tumor-targeted drug release, remains a great challenge. Here, we prepare a novel fluoride anion-modified gelatin nanogel system and investigate its characteristics of ultrasound-triggered drug release. Adriamycin gelatin nanogel modified with fluoride anion (ADM-GNMF) was prepared by a modified co-precipitation method with fluoride anion and sodium sulfate. The loading and encapsulation efficiency of the anti-neoplastic agent adriamycin (ADM) were measured by high performance liquid chromatography (HPLC). The size and shape of ADM-GNMF were determined by electron microscopy and photo-correlation spectroscopy. The size distribution and drug release efficiency of ADM-GNMF, before and after sonication, were measured by two purpose-built devices, one consisting of a submicron particle size analyzer and an ultrasound generator, the other of an ultrasound generator, automatic sampler, and HPLC. The ADM-GNMF was stable in solution with an average diameter of 46 ± 12 nm; the encapsulation and loading efficiency of adriamycin were 87.2% and 6.38%, respectively. The ultrasound-triggered drug release and size change were most efficient at a frequency of 20 kHz, power density of 0.4 W/cm², and a 1–2 min duration. Under these ultrasound conditions, 51.5% of the drug in ADM-GNMF was released within 1–2 min, while the size of ADM-GNMF changed from 46 ± 12 nm to 1212 ± 35 nm within 1–2 min of sonication and returned to its previous size 2–3 min after the ultrasound stopped. In contrast, 8.2% of the drug in ADM-GNMF was released within 2–3 min without sonication, and only negligible size changes were found. The ADM-GNMF system efficiently released the encapsulated drug in response to ultrasound, offering a novel and promising controlled drug release system for targeted therapy of cancer or other diseases.

  14. Consistency Anchor Formalization and Correctness Proofs

    OpenAIRE

    Miguel, Correia; Bessani, Alysson

    2014-01-01

    This report contains the formal proofs for the techniques for increasing the consistency of cloud storage as presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...

  15. Large scientific releases

    International Nuclear Information System (INIS)

    Pongratz, M.B.

    1981-01-01

    The motivation for active experiments in space is considered, taking into account the use of active techniques to obtain a better understanding of the natural space environment, the utilization of the advantages of space as a laboratory to study fundamental plasma physics, and the employment of active techniques to determine the magnitude, degree, and consequences of artificial modification of the space environment. It is pointed out that mass-injection experiments in space plasmas began about twenty years ago with the Project Firefly releases. Attention is given to mass-release techniques and diagnostics, operational aspects of mass release active experiments, the active observation of mass release experiments, active perturbation mass release experiments, simulating an artificial modification of the space environment, and active experiments to study fundamental plasma physics

  16. Self-consistency and coherent effects in nonlinear resonances

    International Nuclear Information System (INIS)

    Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.

    2003-01-01

    The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping

  17. Computer code to assess accidental pollutant releases

    International Nuclear Information System (INIS)

    Pendergast, M.M.; Huang, J.C.

    1980-07-01

    A computer code was developed to calculate the cumulative frequency distributions of relative concentrations of an air pollutant following an accidental release from a stack or from a building penetration such as a vent. The calculations of relative concentration are based on the Gaussian plume equations. The meteorological data used for the calculation are in the form of joint frequency distributions of wind and atmospheric stability

  18. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating several equations simultaneously with respect to the same number of variables.
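
    As a concrete illustration of the scalar (one-equation, one-variable) form of hull consistency, the sketch below narrows two interval variables against a single sum constraint. The toy constraint x + y = c and the helper names are assumptions for illustration, not the poster's method.

```python
def hull_narrow_sum(x, y, c):
    """Narrow intervals x = (lo, hi) and y = (lo, hi) under x + y = c.

    Each variable's interval is intersected with the hull of values
    consistent with the constraint given the other variable's interval;
    an empty intersection proves the constraint has no solution in the box.
    """
    def intersect(a, b):
        lo, hi = max(a[0], b[0]), min(a[1], b[1])
        if lo > hi:
            raise ValueError("empty intersection: no solution in the box")
        return (lo, hi)

    x2 = intersect(x, (c - y[1], c - y[0]))  # x must lie in c - [y]
    y2 = intersect(y, (c - x2[1], c - x2[0]))  # reuse the tightened [x]
    return x2, y2
```

Iterating such narrowing over every equation/variable pair until a fixed point is the scalar scheme the abstract refers to; the proposed generalization processes several equations and variables at once.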

  19. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented

  20. The 2017 Release Cloudy

    Science.gov (United States)

    Ferland, G. J.; Chatzikos, M.; Guzmán, F.; Lykins, M. L.; van Hoof, P. A. M.; Williams, R. J. R.; Abel, N. P.; Badnell, N. R.; Keenan, F. P.; Porter, R. L.; Stancil, P. C.

    2017-10-01

    We describe the 2017 release of the spectral synthesis code Cloudy, summarizing the many improvements to the scope and accuracy of the physics which have been made since the previous release. Exporting the atomic data into external data files has enabled many new large datasets to be incorporated into the code. The use of the complete datasets is not realistic for most calculations, so we describe the limited subset of data used by default, which predicts significantly more lines than the previous release of Cloudy. This version is nevertheless faster than the previous release, as a result of code optimizations. We give examples of the accuracy limits using small models, and the performance requirements of large complete models. We summarize several advances in the H- and He-like iso-electronic sequences and use our complete collisional-radiative models to establish the densities where the coronal and local thermodynamic equilibrium approximations work.

  1. EIA new releases

    International Nuclear Information System (INIS)

    1994-09-01

    This report is a compilation of news releases from the Energy Information Administration. The September-October report includes articles on energy conservation, energy consumption in commercial buildings, and a short-term energy model for a personal computer

  2. Sellafield (release of radioactivity)

    Energy Technology Data Exchange (ETDEWEB)

    Cunningham, J; Goodlad, A; Morris, M

    1986-02-06

    A government statement is reported, about the release of plutonium nitrate at the Sellafield site of British Nuclear Fuels plc on 5 February 1986. Matters raised included: details of accident; personnel monitoring; whether radioactive material was released from the site; need for public acceptance of BNFL activities; whether plant should be closed; need to reduce level of radioactive effluent; number of incidents at the plant.

  3. INVESTIGATION OF DRUG RELEASE FROM BIODEGRADABLE PLG MICROSPHERES: EXPERIMENT AND THEORY

    Energy Technology Data Exchange (ETDEWEB)

    ANDREWS, MALCOLM J. [Los Alamos National Laboratory; BERCHANE, NADER S. [Los Alamos National Laboratory; CARSON, KENNETH H. [Los Alamos National Laboratory; RICE-FICHT, ALLISON C. [Los Alamos National Laboratory

    2007-01-30

    Piroxicam-containing PLG microspheres with different size distributions were fabricated, and in vitro release kinetics were determined for each preparation. Based on the experimental results, a suitable mathematical theory has been developed that incorporates the effect of microsphere size distribution and polymer degradation on drug release. We show from in vitro release experiments that microsphere size has a significant effect on drug release rate. The initial release rate decreased with an increase in microsphere size. In addition, the release profile changed from first order to concave-upward (sigmoidal) as the system size was increased. The mathematical model gave a good fit to the experimental release data.
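
    The ensemble-modelling idea running through these records, building the release profile of a whole preparation by weighting single-particle profiles with the particle size distribution, can be sketched as follows. The per-particle kinetics here are a deliberate simplification (first order, with rate inversely proportional to radius, so smaller spheres release faster); the actual models above also account for polymer degradation and sigmoidal profiles, which this sketch omits.

```python
import math

def ensemble_release(fractions, radii_um, k_ref, r_ref_um, times_h):
    """Cumulative fraction of drug released by an ensemble of microspheres.

    fractions: mass fraction of each size class (must sum to 1)
    radii_um:  representative radius of each size class (micrometres)
    k_ref:     first-order rate constant (1/h) at the reference radius
    r_ref_um:  reference radius for k_ref
    times_h:   time points (hours)

    Assumes each size class releases first-order with k ~ 1/r.
    """
    assert abs(sum(fractions) - 1.0) < 1e-9
    profile = []
    for t in times_h:
        released = sum(f * (1.0 - math.exp(-(k_ref * r_ref_um / r) * t))
                       for f, r in zip(fractions, radii_um))
        profile.append(released)
    return profile
```

Under this weighting, a preparation skewed toward small spheres shows a faster initial release, mirroring the size effect reported in the abstract.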

  4. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.

  5. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  6. Translationally invariant self-consistent field theories

    International Nuclear Information System (INIS)

    Shakin, C.M.; Weiss, M.S.

    1977-01-01

    We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables

  7. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...

  8. Consistent-handed individuals are more authoritarian.

    Science.gov (United States)

    Lyle, Keith B; Grillo, Michael C

    2014-01-01

    Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.

  9. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  10. Consistent spectroscopy for an extended gauge model

    International Nuclear Information System (INIS)

    Oliveira Neto, G. de.

    1990-11-01

    The consistent spectroscopy was obtained with a Lagrangian constructed with vector fields with a U(1) group extended symmetry. By consistent spectroscopy is understood the determination of quantum physical properties described by the model in a manner independent of the possible parametrizations adopted in their description. (L.C.J.A.)

  11. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. … This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers…

  12. EPICS release 3.11 specific documentation -- EPICS release notes for 3.11

    International Nuclear Information System (INIS)

    1994-01-01

    EPICS release 3.11 is now ready for user testing. A person who wants to set up a simplified application environment to boot an IOC and create databases using R3.11 should follow the directions in Appendix B, page 27, of the EPICS Source/Release Control Manual, Sept. 20, 1993. The R3.11 EPICS path at ANL/APS is /net/phebos/epics/R3.11 so the command to get the new release is /net/phebos/epics/R3.11/Unix/share/bin/getrel /net/phebos/epics/R3.11. An existing R3.8 short form report can be copied to this new directory and used to create a database. ANL/APS is currently testing an Application Developers Source/Release control system. It is not yet ready for general distribution. Attached are the EPICS R3.11 release notes

  13. The cluster bootstrap consistency in generalized estimating equations

    KAUST Repository

    Cheng, Guang

    2013-03-01

    The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
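The resampling scheme itself is straightforward to sketch. Below is a minimal illustration that resamples whole clusters with replacement; the sample mean stands in for the statistic, whereas the paper's setting uses GEE regression estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster_bootstrap(clusters, stat, B=500, rng=rng):
    """Resample whole clusters (not individual observations) with
    replacement, preserving within-cluster dependence, and return the
    bootstrap distribution of the statistic."""
    m = len(clusters)
    out = []
    for _ in range(B):
        idx = rng.integers(0, m, size=m)                  # draw m clusters
        resample = np.concatenate([clusters[i] for i in idx])
        out.append(stat(resample))
    return np.array(out)

# toy clustered data: 20 clusters of 5 observations sharing a cluster effect
clusters = [rng.normal(loc=1.0, size=5) + rng.normal(scale=0.5)
            for _ in range(20)]
boot = cluster_bootstrap(clusters, np.mean)
print(round(float(boot.std(ddof=1)), 3))  # cluster-robust SE of the mean
```

Resampling clusters rather than observations keeps the within-cluster correlation in every bootstrap sample, which is what justifies the approximation of the estimator's distribution.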

  14. ATP Release Channels

    Directory of Open Access Journals (Sweden)

    Akiyuki Taruno

    2018-03-01

    Full Text Available Adenosine triphosphate (ATP) has been well established as an important extracellular ligand of autocrine signaling, intercellular communication, and neurotransmission with numerous physiological and pathophysiological roles. In addition to the classical exocytosis, non-vesicular mechanisms of cellular ATP release have been demonstrated in many cell types. Although large and negatively charged ATP molecules cannot diffuse across the lipid bilayer of the plasma membrane, conductive ATP release from the cytosol into the extracellular space is possible through ATP-permeable channels. Such channels must possess two minimum qualifications for ATP permeation: anion permeability and a large ion-conducting pore. Currently, five groups of channels are acknowledged as ATP-release channels: connexin hemichannels, pannexin 1, calcium homeostasis modulator 1 (CALHM1), volume-regulated anion channels (VRACs, also known as volume-sensitive outwardly rectifying (VSOR) anion channels), and maxi-anion channels (MACs). Recently, major breakthroughs have been made in the field by molecular identification of CALHM1 as the action potential-dependent ATP-release channel in taste bud cells, LRRC8s as components of VRACs, and SLCO2A1 as a core subunit of MACs. Here, the function and physiological roles of these five groups of ATP-release channels are summarized, along with a discussion on the future implications of understanding these channels.

  15. Feeding Releases Endogenous Opioids in Humans.

    Science.gov (United States)

    Tuulari, Jetro J; Tuominen, Lauri; de Boer, Femke E; Hirvonen, Jussi; Helin, Semi; Nuutila, Pirjo; Nummenmaa, Lauri

    2017-08-23

    The endogenous opioid system supports a multitude of functions related to appetitive behavior in humans and animals, and it has been proposed to govern hedonic aspects of feeding, thus contributing to the development of obesity. Here we used positron emission tomography to investigate whether feeding results in hedonia-dependent endogenous opioid release in humans. Ten healthy males were recruited for the study. They were scanned with the μ-opioid-specific ligand [11C]carfentanil three times, as follows: after a palatable meal, a nonpalatable meal, and after an overnight fast. Subjective mood, satiety, and circulating hormone levels were measured. Feeding induced significant endogenous opioid release throughout the brain. This response was more pronounced following a nonpalatable meal versus a palatable meal, and independent of the subjective hedonic responses to feeding. We conclude that feeding consistently triggers cerebral opioid release even in the absence of subjective pleasure associated with feeding, suggesting that metabolic and homeostatic rather than exclusively hedonic responses play a role in the feeding-triggered cerebral opioid release. SIGNIFICANCE STATEMENT The endogenous opioid system supports both hedonic and homeostatic functions. It has been proposed that overeating and concomitant opioid release could downregulate opioid receptors and promote the development of obesity. However, it remains unresolved whether feeding leads to endogenous opioid release in humans. We used in vivo positron emission tomography to test whether feeding triggers cerebral opioid release and whether this response is associated with pleasurable sensations. We scanned volunteers using the μ-opioid receptor-specific radioligand [11C]carfentanil three times, as follows: after an overnight fast, after consuming a palatable meal, and after consuming a nonpalatable meal. Feeding led to significant endogenous opioid release, and this occurred also in the absence of subjective pleasure associated with feeding.

  16. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how … the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  17. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  18. Self-consistent areas law in QCD

    International Nuclear Information System (INIS)

    Makeenko, Yu.M.; Migdal, A.A.

    1980-01-01

    The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new manifestly gauge invariant perturbation theory in the loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution.

  19. Consistency of the MLE under mixture models

    OpenAIRE

    Chen, Jiahua

    2016-01-01

    The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...

  20. RAVEN Beta Release

    International Nuclear Information System (INIS)

    Rabiti, Cristian; Alfonsi, Andrea; Cogliati, Joshua Joseph; Mandelli, Diego; Kinoshita, Robert Arthur; Wang, Congjian; Maljovec, Daniel Patrick; Talbot, Paul William

    2016-01-01

    This documents the release of the Risk Analysis Virtual Environment (RAVEN) code. A description of the RAVEN code is provided, along with a discussion of the release process for the M2LW-16IN0704045 milestone. The RAVEN code is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is capable of investigating the system response as well as the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces separating regions of the input space leading to system failure, using dynamic supervised learning techniques. RAVEN has now increased in maturity enough for the Beta 1.0 release.
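Among the sampling schemes mentioned, Latin Hypercube sampling is easy to illustrate. The following is a generic sketch of the idea — one sample per equal-probability stratum in every dimension — and not RAVEN's actual implementation:

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """Minimal Latin hypercube sampler: each of the d dimensions is cut
    into n equal-probability strata; exactly one point falls in each
    stratum, with strata independently permuted across dimensions."""
    rng = np.random.default_rng(seed)
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (strata + rng.random((n, d))) / n  # uniform jitter inside strata

pts = latin_hypercube(10, 2, seed=42)
# each column hits every one of the 10 strata exactly once
print(sorted((pts[:, 0] * 10).astype(int).tolist()))  # → [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Compared with plain Monte Carlo, this stratification guarantees coverage of the whole input range in every dimension for the same number of samples.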

  2. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the

  3. Optimization of nonthermal fusion power consistent with energy channeling

    International Nuclear Information System (INIS)

    Snyder, P.B.; Herrmann, M.C.; Fisch, N.J.

    1995-02-01

    If the energy of charged fusion products can be diverted directly to fuel ions, non-Maxwellian fuel ion distributions and temperature differences between species will result. To determine the importance of these nonthermal effects, the fusion power density is optimized at constant-β for nonthermal distributions that are self-consistently maintained by channeling of energy from charged fusion products. For D-T and D-3He reactors, with 75% of charged fusion product power diverted to fuel ions, temperature differences between electrons and ions increase the reactivity by 40-70%, while non-Maxwellian fuel ion distributions and temperature differences between ionic species increase the reactivity by an additional 3-15%

  4. Pressure-sensitive release mechanism for radiosonde applications

    International Nuclear Information System (INIS)

    Kulhanek, F.C.

    1975-01-01

    As part of the 1975 planetary boundary layer field experimental program, miniature radiosondes attached to pilot balloons were released into the atmosphere for routine sampling of the vertical temperature distribution. A new releasing mechanism used to continue sampling during descent by parachute is described

  5. Depth preference in released juvenile turbot Psetta maxima

    DEFF Research Database (Denmark)

    Albertsen, Christoffer Moesgaard; Støttrup, Josianne; Nielsen, Anders

    2014-01-01

    Hatchery-reared juvenile turbot Psetta maxima were tagged with Passive Integrated Transponder (PIT) tags and released at three different depths in a sandy bay in Denmark. About 2–7% of the released fish were registered daily to monitor their distribution using a tag antenna mounted on a modified...

  6. Sustained Release of a Water-soluble Drug from Directly Compressed Okra Gum Matrix Tablets

    African Journals Online (AJOL)

    Sustained Release of a Water-soluble Drug from Directly Compressed Okra Gum Matrix Tablets. … in near zero-order release of aspirin from the matrix tablets. The results indicate that okra gum is … porous structure including alteration of the shape and size distribution of the pores.

  7. A new release of the S3M code

    International Nuclear Information System (INIS)

    Pavlovic, M.; Bokor, J.; Regodic, M.; Sagatova, A.

    2015-01-01

    This paper presents a new release of the code that contains some additional routines and advanced features of displaying the results. Special attention is paid to the processing of the SRIM range file, which was not included in the previous release of the code. Examples of distributions provided by the S3M code for implanted ions in thyroid and iron are presented. (authors)

  8. Nuclear energy release from fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Li, Cheng [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); Souza, S.R. [Instituto de Física, Universidade Federal do Rio de Janeiro Cidade Universitária, Caixa Postal 68528, 21945-970 Rio de Janeiro (Brazil); Tsang, M.B. [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); National Superconducting Cyclotron Laboratory and Physics and Astronomy Department, Michigan State University, East Lansing, MI 48824 (United States); Zhang, Feng-Shou, E-mail: fszhang@bnu.edu.cn [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); Center of Theoretical Nuclear Physics, National Laboratory of Heavy Ion Accelerator of Lanzhou, Lanzhou 730000 (China)

    2016-08-15

    It is well known that binary fission occurs with positive energy gain. In this article we examine the energetics of splitting uranium and thorium isotopes into various numbers of fragments (from two to eight) with nearly equal size. We find that the energy released by splitting {sup 230,232}Th and {sup 235,238}U into three equal size fragments is largest. The statistical multifragmentation model (SMM) is applied to calculate the probability of different breakup channels for excited nuclei. By weighting the probability distributions of fragment multiplicity at different excitation energies, we find the peaks of energy release for {sup 230,232}Th and {sup 235,238}U are around 0.7–0.75 MeV/u at excitation energy between 1.2 and 2 MeV/u in the primary breakup process. Taking into account the secondary de-excitation processes of primary fragments with the GEMINI code, these energy peaks fall to about 0.45 MeV/u.
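The positive energy gain from splitting heavy nuclei can be reproduced qualitatively with a simple liquid-drop (semi-empirical mass formula) estimate. This is a crude stand-in for the SMM/GEMINI treatment in the article: standard textbook coefficients, shell and pairing effects ignored:

```python
# Liquid-drop estimate of the energy released when a nucleus (A, Z)
# splits into k equal fragments. Coefficients in MeV are the standard
# textbook values; shell and pairing corrections are omitted.
aV, aS, aC, aA = 15.75, 17.8, 0.711, 23.7

def binding(A, Z):
    """Semi-empirical binding energy in MeV (volume, surface,
    Coulomb, and asymmetry terms only)."""
    return (aV * A - aS * A ** (2 / 3)
            - aC * Z * (Z - 1) / A ** (1 / 3)
            - aA * (A - 2 * Z) ** 2 / A)

def q_per_nucleon(A, Z, k):
    """Q-value per nucleon for splitting (A, Z) into k equal fragments."""
    return (k * binding(A / k, Z / k) - binding(A, Z)) / A

for k in range(2, 9):  # two to eight fragments, as in the article
    print(k, round(q_per_nucleon(238, 92, k), 2))
```

Even this crude model predicts the largest energy release per nucleon for splitting 238U into three equal fragments, in line with the abstract's finding, though the absolute values differ from the SMM/GEMINI results.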

  9. Influence of LOD variations on seismic energy release

    Science.gov (United States)

    Riguzzi, F.; Krumm, F.; Wang, K.; Kiszely, M.; Varga, P.

    2009-04-01

    Tidal friction causes significant time variations of geodynamical parameters, among them geometrical flattening. The axial despinning of the Earth due to tidal friction, through the change of flattening, generates incremental meridional and azimuthal stresses. The stress pattern in an incompressible elastic upper mantle and crust is symmetric about the equator and has its inflection points at the critical latitude close to ±45°. Consequently the distribution of seismic energy released by strong, shallow focus earthquakes should also have sharp maxima at this latitude. To investigate the influence of length of day (LOD) variations on earthquake activity, an earthquake catalogue of the strongest seismic events (M>7.0) was compiled for the period 1900-2007. Analysis shows that the catalogue is complete for the studied time interval and contains the seismic events responsible for more than 90% of the released seismic energy. Study of the catalogue for earthquakes M>7.0 shows that the seismic energy discharged by the strongest seismic events has significant maxima at ±45°, which makes it probable that the seismic activity of our planet is influenced by an external component, i.e. by the tidal friction, which acts through the variation of the hydrostatic figure of the Earth caused by it. The distribution of earthquake numbers and energies along latitude was also investigated for the case of global linear tectonic structures, such as mid ocean ridges and subduction zones. It can be shown that the number of shallow focus shocks has a latitudinal distribution similar to that of the linear tectonic structures. This means that the position of foci of seismic events is mainly controlled by the tectonic activity.

  10. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to critically review and systematize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics.

  11. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics

  12. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  13. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite networks provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and suboptimal performance. In this letter, we argue that in many cases a user's interests are stable, and thus the bidirectional mass diffusion abilities, whether originating from objects already collected or from those to be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
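The traditional unidirectional mass diffusion that the letter takes as its baseline can be sketched on a toy bipartite network. The network below is hypothetical, and this implements only the one-direction items→users→items diffusion; the letter's consistence-based variant would additionally diffuse from the candidate items in the reverse direction:

```python
import numpy as np

def mass_diffusion_scores(A, user):
    """One round of unidirectional mass diffusion on a user-item
    bipartite network. A[u, i] = 1 if user u collected item i. Unit
    resource on the target user's items spreads items -> users -> items;
    uncollected items are ranked by the resource they receive."""
    ku = A.sum(axis=1)                               # user degrees
    ki = A.sum(axis=0)                               # item degrees
    f = A[user].astype(float)                        # resource on collected items
    to_users = A @ (f / np.maximum(ki, 1))           # items spread to their users
    back = A.T @ (to_users / np.maximum(ku, 1))      # users spread back to items
    back[A[user] == 1] = -np.inf                     # never recommend collected items
    return back

# toy network: 3 users x 4 items
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
scores = mass_diffusion_scores(A, user=0)
print(int(np.argmax(scores)))  # → 2  (top recommendation for user 0)
```

Here item 2 wins because it is shared with both of user 0's neighbours, while item 3 is reached only through one weakly connected user — exactly the kind of causal asymmetry the letter's bidirectional variant is designed to correct.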

  14. Electrosprayed nanoparticle delivery system for controlled release

    Energy Technology Data Exchange (ETDEWEB)

    Eltayeb, Megdi, E-mail: megdi.eltayeb@sustech.edu [Department of Biomedical Engineering, Sudan University of Science and Technology, PO Box 407, Khartoum (Sudan); Stride, Eleanor, E-mail: eleanor.stride@eng.ox.ac.uk [Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Old Road Campus Research Building, Headington OX3 7DQ (United Kingdom); Edirisinghe, Mohan, E-mail: m.edirisinghe@ucl.ac.uk [Department of Mechanical Engineering, University College London, Torrington Place, London WC1E 7JE (United Kingdom); Harker, Anthony, E-mail: a.harker@ucl.ac.uk [London Centre for Nanotechnology, Gordon Street, London WC1H 0AH (United Kingdom); Department of Physics & Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2016-09-01

    This study utilises an electrohydrodynamic technique to prepare core-shell lipid nanoparticles with a tunable size and high active ingredient loading capacity, encapsulation efficiency and controlled release. Using stearic acid and ethylvanillin as model shell and active ingredients respectively, we identify the processing conditions and ratios of lipid:ethylvanillin required to form nanoparticles. Nanoparticles with a mean size ranging from 60 to 70 nm at the rate of 1.37 × 10{sup 9} nanoparticles per minute were prepared with different lipid:ethylvanillin ratios. The polydispersity index was ≈ 21% and the encapsulation efficiency ≈ 70%. It was found that the rate of ethylvanillin release was a function of the nanoparticle size, and lipid:ethylvanillin ratio. The internal structure of the lipid nanoparticles was studied by transmission electron microscopy which confirmed that the ethylvanillin was encapsulated within a stearic acid shell. Fourier transform infrared spectroscopy analysis indicated that the ethylvanillin had not been affected. Extensive analysis of the release of ethylvanillin was performed using several existing models and a new diffusive release model incorporating a tanh function. The results were consistent with a core-shell structure. - Highlights: • Electrohydrodynamic spraying is used to produce lipid-coated nanoparticles. • A new model is proposed for the release rates of active components from nanoparticles. • The technique has potential applications in food science and medicine. • Electrohydrodynamic processing enables controlled-release lipid nanoparticles.

  15. Renal epithelial cells can release ATP by vesicular fusion

    Directory of Open Access Journals (Sweden)

    Randi G Bjaelde

    2013-09-01

    Full Text Available Renal epithelial cells have the ability to release nucleotides as paracrine factors. In the intercalated cells of the collecting duct, ATP is released by connexin30 (cx30), which is selectively expressed in this cell type. However, ATP is released by virtually all renal epithelia, and the aim of the present study was to identify possible alternative nucleotide release pathways in a renal epithelial cell model. We used MDCK (type 1) cells to screen for various potential ATP release pathways. In these cells, inhibition of the vesicular H+-ATPases (bafilomycin) reduced both the spontaneous and hypotonically (80%)-induced nucleotide release. Interference with vesicular fusion using N-ethylmaleimide markedly reduced the spontaneous nucleotide release, as did interference with trafficking from the endoplasmic reticulum to the Golgi apparatus (brefeldin A) and vesicular transport (nocodazole). These findings were substantiated using an siRNA directed against SNAP-23, which significantly reduced spontaneous ATP release. Inhibition of pannexin and connexins did not affect the spontaneous ATP release in this cell type, which consists of ∼90% principal cells. TIRF microscopy of either fluorescently labeled ATP (MANT-ATP) or quinacrine-loaded vesicles revealed that spontaneous release of single vesicles could be promoted by either hypoosmolality (50%) or ionomycin. This vesicular release decreased the overall cellular fluorescence by 5.8% and 7.6%, respectively. In summary, this study supports the notion that spontaneous and induced ATP release can occur via exocytosis in renal epithelial cells.

  16. Hydraulic release oil tool

    International Nuclear Information System (INIS)

    Mims, M.G.; Mueller, M.D.; Ehlinger, J.C.

    1992-01-01

    This patent describes a hydraulic release tool. It comprises a setting assembly; a coupling member for coupling to drill string or petroleum production components, the coupling member having a plurality of sockets for receiving the dogs in the extended position and attaching the coupling member to the setting assembly, whereby the setting assembly couples to the coupling member by engagement of the dogs in the sockets and releases from and disengages the coupling member on movement of the piston from its setting position to its release position in response to a pressure in the body exceeding the predetermined pressure; and a relief port from outside the body into its bore, and means to prevent communication between the relief port and the bore of the body axially of the piston when the piston is in the setting position and to establish such communication upon movement of the piston from the setting position to the release position, reducing the pressure in the body bore axially of the piston, whereby the reduction of the pressure signals that the tool has released the coupling member.

  17. APASS Data Release 10

    Science.gov (United States)

    Henden, Arne A.; Levine, Stephen; Terrell, Dirk; Welch, Douglas L.; Munari, Ulisse; Kloppenborg, Brian K.

    2018-06-01

    The AAVSO Photometric All-Sky Survey (APASS) has been underway since 2010. This survey covers the entire sky from 7.5 knowledge of the optical train distortions. With these changes, DR10 includes many more stars than prior releases. We describe the survey, its remaining limitations, and prospects for the future, including a very-bright-star extension.

  18. Tritium adsorption/release behaviour of advanced EU breeder pebbles

    Energy Technology Data Exchange (ETDEWEB)

    Kolb, Matthias H.H., E-mail: matthias.kolb@kit.edu; Rolli, Rolf; Knitter, Regina

    2017-06-15

    The tritium loading of current grades of advanced ceramic breeder pebbles with three different lithium orthosilicate (LOS)/lithium metatitanate (LMT) compositions (20–30 mol% LMT in LOS) and pebbles of EU reference material, was performed in a consistent way. The temperature dependent release of the introduced tritium was subsequently investigated by temperature programmed desorption (TPD) experiments to gain insight into the desorption characteristics. The obtained TPD data was decomposed into individual release mechanisms according to well-established desorption kinetics. The analysis showed that the pebble composition of the tested samples does not severely change the release behaviour. Yet, an increased content of lithium metatitanate leads to additional desorption peaks at medium temperatures. The majority of tritium is released by high temperature release mechanisms of chemisorbed tritium, while the release of physisorbed tritium is marginal in comparison. The results allow valuable projections for the tritium release behaviour in a fusion blanket.
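    Decomposing a TPD trace into individual release mechanisms is commonly done with first-order Polanyi-Wigner desorption kinetics under a linear temperature ramp. The sketch below is illustrative only — the activation energies and pre-exponential factors are invented, not the paper's fitted values — and shows how two such components superpose into a composite trace:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def desorption_rate(T, beta, E, nu):
    """First-order Polanyi-Wigner desorption along a linear ramp (dT/dt = beta).

    rate = nu * theta * exp(-E / (R*T)); coverage theta is depleted stepwise.
    """
    theta = 1.0
    rate = np.zeros_like(T)
    dT = T[1] - T[0]
    for i, Ti in enumerate(T):
        r = nu * theta * np.exp(-E / (R * Ti))
        rate[i] = r
        theta = max(theta - r * dT / beta, 0.0)  # dt = dT / beta
    return rate

T = np.arange(300.0, 1100.0, 1.0)            # temperature grid, K
beta = 0.5                                   # heating rate, K/s
low = desorption_rate(T, beta, 90e3, 1e6)    # "medium-temperature" component
high = desorption_rate(T, beta, 150e3, 1e8)  # "high-temperature" component
trace = low + high                           # composite TPD trace to decompose
print(T[np.argmax(low)] < T[np.argmax(high)])  # → True
```

    Fitting a measured spectrum then amounts to adjusting (E, nu) for each component, and possibly the kinetic order, until the summed components reproduce the trace.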

  19. Tritium adsorption/release behaviour of advanced EU breeder pebbles

    Science.gov (United States)

    Kolb, Matthias H. H.; Rolli, Rolf; Knitter, Regina

    2017-06-01

    The tritium loading of current grades of advanced ceramic breeder pebbles with three different lithium orthosilicate (LOS)/lithium metatitanate (LMT) compositions (20-30 mol% LMT in LOS) and pebbles of EU reference material, was performed in a consistent way. The temperature dependent release of the introduced tritium was subsequently investigated by temperature programmed desorption (TPD) experiments to gain insight into the desorption characteristics. The obtained TPD data was decomposed into individual release mechanisms according to well-established desorption kinetics. The analysis showed that the pebble composition of the tested samples does not severely change the release behaviour. Yet, an increased content of lithium metatitanate leads to additional desorption peaks at medium temperatures. The majority of tritium is released by high temperature release mechanisms of chemisorbed tritium, while the release of physisorbed tritium is marginal in comparison. The results allow valuable projections for the tritium release behaviour in a fusion blanket.

  20. Financial model calibration using consistency hints.

    Science.gov (United States)

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to the Japanese Yen swaps market and the US dollar yield market.
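    The structure of such an augmented objective — a curve-fitting error plus a weighted Kullback-Leibler "hint" term — can be illustrated schematically. Everything in the toy below (a linear model, a Gaussian stand-in for the hint distribution, the weight lam) is an invented assumption for demonstration; the paper's hint functions and Vasicek calibration are more involved:

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    """KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2)) for univariate Gaussians."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

def augmented_error(params, x, y, hint_sigma, lam=0.1):
    """Curve-fit error plus a KL-based 'consistency hint' penalty (schematic)."""
    a, b = params
    resid = a * x + b - y
    fit_err = np.mean(resid**2)
    # Hint: residuals should look like N(0, hint_sigma^2) -- an illustrative
    # stand-in for the paper's consistency hints.
    hint_err = kl_gauss(resid.mean(), resid.std() + 1e-9, 0.0, hint_sigma)
    return fit_err + lam * hint_err

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 50)   # synthetic "market" data
good = augmented_error((2.0, 1.0), x, y, hint_sigma=1.0)
bad = augmented_error((0.0, 0.0), x, y, hint_sigma=1.0)
print(good < bad)  # → True
```

    The point of the hint term is that parameters minimising the fit error alone may be statistically implausible; the KL penalty steers the optimiser toward parameters whose implied distribution stays consistent with prior knowledge.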

  1. Are PDFs still consistent with Tevatron data?

    Directory of Open Access Journals (Sweden)

    Sullivan Zack

    2018-01-01

    Full Text Available As active data taking has moved to the LHC at CERN, more and more LHC data have been included into fits of parton distribution functions. An anomaly has arisen where formerly excellent agreement between theoretical predictions and experiment in single-top-quark production at the Tevatron is no longer quite as good. Is this indicative of a deeper issue?

  2. On the Consistent Migration of Unsplittable Flows

    DEFF Research Database (Denmark)

    Förster, Klaus-Tycho

    2017-01-01

    in an inherently asynchronous system, with the switches distributed over the network. To this end, a multitude of scheduling systems have been proposed since the initial papers of Reitblatt et al. (Abstractions for Network Update, SIGCOMM ’12) and Hong et al. (SWAN, SIGCOMM ’13). While the complexity

  3. Increased seedling establishment via enemy release at the upper elevational range limit of sugar maple.

    Science.gov (United States)

    Urli, Morgane; Brown, Carissa D; Narváez Perez, Rosela; Chagnon, Pierre-Luc; Vellend, Mark

    2016-11-01

    The enemy release hypothesis is frequently invoked to explain invasion by nonnative species, but studies focusing on the influence of enemies on natural plant range expansion due to climate change remain scarce. We combined multiple approaches to study the influence of plant-enemy interactions on the upper elevational range limit of sugar maple (Acer saccharum) in southeastern Québec, Canada, where a previous study had demonstrated intense seed predation just beyond the range limit. Consistent with the hypothesis of release from natural enemies at the range limit, data from both natural patterns of regeneration and from seed and seedling transplant experiments showed higher seedling densities at the range edge than in the core of the species' distribution. A growth chamber experiment manipulating soil origin and temperature indicated that this so-called "happy edge" was not likely caused by temperature (i.e., the possibility that climate warming has made high elevation temperatures optimal for sugar maple) or by abiotic soil factors that vary along the elevational gradient. Finally, an insect-herbivore-exclusion experiment showed that insect herbivory was a major cause of seedling mortality in the core of sugar maple's distribution, whereas seedlings transplanted at or beyond the range edge experienced minimal herbivory (i.e., enemy release). Insect herbivory did not completely explain the high levels of seedling mortality in the core of the species' distribution, suggesting that seedlings at or beyond the range edge may also experience release from pathogens. In sum, while some effects of enemies are magnified beyond range edges (e.g., seed predation), others are dampened at and beyond the range edge (e.g., insect herbivory), such that understanding the net outcome of different biotic interactions within, at and beyond the edge of distribution is critical to predicting species' responses to global change. © 2016 by the Ecological Society of America.

  4. Screw-released roller brake

    Science.gov (United States)

    Vranish, John M. (Inventor)

    1999-01-01

    A screw-released roller brake including an input drive assembly, an output drive assembly, a plurality of locking sprags, a mechanical tripper nut for unlocking the sprags, and a casing therefor. The sprags consist of three-dimensional (3-D) sprag members having pairs of contact surface regions which engage respective pairs of contact surface regions included in angular grooves or slots formed in the casing and the output drive assembly. The sprags operate to lock the output drive assembly to the casing to prevent rotation thereof in an idle mode of operation. In a drive mode of operation, the tripper is either self-actuated or motor-driven and is translated linearly up and down against a spline, and at the limit of its travel rotates the sprags, which unlock while coupling the input drive assembly to the output drive assembly so as to impart a turning motion thereto in either a clockwise or counterclockwise direction.

  5. Formulation and Pharmacokinetic Evaluation of Controlled-Release ...

    African Journals Online (AJOL)

    The effect of several formulation variables on in ... The in vivo pharmacokinetics of the optimized formulation was compared ... Results: The core tablets exhibited extended release consisting of drug release from the embedded ... important factor in medical treatment with respect ... The solvents for high-performance liquid.

  6. Estimating release of polycyclic aromatic hydrocarbons from coal-tar contaminated soil at manufactured gas plant sites. Final report

    International Nuclear Information System (INIS)

    Lee, L.S.

    1998-04-01

    One of EPRI's goals regarding the environmental behavior of organic substances consists of developing information and predictive tools to estimate the release potential of polycyclic aromatic hydrocarbons (PAHs) from contaminated soils at manufactured gas plant (MGP) sites. A proper assessment of the distribution of contaminants under equilibrium conditions and the potential for mass-transfer constraints is essential in evaluating the environmental risks of contaminants in the subsurface at MGP sites and for selecting remediation options. The results of this research provide insights into estimating maximum release concentrations of PAHs from MGP soils that have been contaminated by direct contact with the tar or through years of contact with contaminated groundwater. Attention is also given to evaluating the use of water-miscible cosolvents for estimating aqueous phase concentrations, and to assessing the role of mass-transfer constraints in the release of PAHs from MGP site soils.
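    A standard way to estimate aqueous-phase concentrations from cosolvent data is the log-linear cosolvency model, in which log-concentration measured in water/cosolvent mixtures is extrapolated back to zero cosolvent fraction. The numbers below are synthetic, purely to illustrate the extrapolation; this is the generic model, not necessarily the report's exact procedure:

```python
import numpy as np

# Log-linear cosolvency model: log10 C(f) = log10 Cw + sigma * f, where f is
# the cosolvent volume fraction and Cw the aqueous-phase concentration.
f = np.array([0.2, 0.3, 0.4, 0.5])       # methanol volume fractions (illustrative)
logC = -5.0 + 4.0 * f                    # synthetic measurements, log10 mol/L
sigma, logCw = np.polyfit(f, logC, 1)    # slope = cosolvency power, intercept = log10 Cw
print(round(logCw, 2), round(sigma, 2))  # → -5.0 4.0
```

    Extrapolating from several cosolvent fractions avoids having to measure near-detection-limit aqueous concentrations of hydrophobic PAHs directly.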

  7. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis is discussed. The conversion of α s1 -casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  8. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...

  9. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

    Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we are evaluating only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.

  10. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised, and single-curve models are extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  11. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among them under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color differences between images are large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First of all, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose the histogram extreme point matching algorithm, which is robust to image geometrical misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics are transferred to the others via the paths of graph analysis. Thus, the final results after global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones sufficiently demonstrate that the proposed approach can achieve results as good as or better than those of the state-of-the-art approaches.
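    The histogram extreme point matching and guided global optimization are the paper's own contributions; as a much simpler baseline for the same goal, per-channel mean/std matching against a reference image conveys the basic idea of transferring color characteristics between overlapping images (the function and data below are illustrative, not the paper's algorithm):

```python
import numpy as np

def match_color_stats(src, ref):
    """Affinely map each channel of `src` so its mean/std match `ref`.

    A basic mean/std color transfer; the paper's histogram-based method is
    designed to be more robust to geometric misalignment than this.
    """
    out = np.empty_like(src, dtype=float)
    for c in range(src.shape[-1]):
        s, r = src[..., c], ref[..., c]
        out[..., c] = (s - s.mean()) / (s.std() + 1e-9) * r.std() + r.mean()
    return np.clip(out, 0.0, 255.0)

rng = np.random.default_rng(1)
ref = rng.uniform(60.0, 200.0, (32, 32, 3))   # "reference" overlap region
src = ref * 0.7 + 20.0                        # same content, shifted exposure
corrected = match_color_stats(src, ref)
print(np.allclose(corrected, ref, atol=1e-6))  # → True
```

    Because the simulated exposure shift is affine per channel, matching the first two moments recovers the reference exactly; real mosaics need the more robust correspondence step the paper proposes.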

  12. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

    The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines

  13. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  14. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  15. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...

  16. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  17. Dynamic phonon exchange requires consistent dressing

    International Nuclear Information System (INIS)

    Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.

    1976-01-01

    It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful in introducing single-particle self-energy insertions in a consistent manner.

  18. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  19. Consistency of the postulates of special relativity

    International Nuclear Information System (INIS)

    Gron, O.; Nicola, M.

    1976-01-01

    In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated

  20. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  1. Consistency analysis of network traffic repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  2. Making the Sustainable Development Goals Consistent with Sustainability

    Directory of Open Access Journals (Sweden)

    Mathis Wackernagel

    2017-07-01

    Full Text Available The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG index to assess countries’ average performance on SDGs. Ranking high on the SDG index strongly correlates with high per person demand on nature (or “Footprints”), and low ranking with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor because lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.

  3. Making the Sustainable Development Goals Consistent with Sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Wackernagel, Mathis, E-mail: mathis.wackernagel@footprintnetwork.org; Hanscom, Laurel; Lin, David [Global Footprint Network, Oakland, CA (United States)

    2017-07-11

    The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG index to assess countries’ average performance on SDGs. Ranking high on the SDG index strongly correlates with high per person demand on nature (or “Footprints”), and low ranking with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor because lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.

  4. Making the Sustainable Development Goals Consistent with Sustainability

    International Nuclear Information System (INIS)

    Wackernagel, Mathis; Hanscom, Laurel; Lin, David

    2017-01-01

    The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG index to assess countries’ average performance on SDGs. Ranking high on the SDG index strongly correlates with high per person demand on nature (or “Footprints”), and low ranking with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor because lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.

  5. Massively parallel self-consistent-field calculations

    International Nuclear Information System (INIS)

    Tilson, J.L.

    1994-01-01

    The advent of supercomputers with many computational nodes, each with its own independent memory, makes possible extremely fast computations. The authors' work, as part of the US High Performance Computing and Communications Program (HPCCP), is focused on the development of electronic structure techniques for the solution of Grand Challenge-size molecules containing hundreds of atoms. Their efforts have resulted in a fully scalable Direct-SCF program that is portable and efficient. This code, named NWCHEM, is built around a distributed-data model. The distributed data is managed by a software package called Global Arrays, developed within the HPCCP. They present performance results for Direct-SCF calculations of interest to the consortium.

  6. Distributed password cracking

    OpenAIRE

    Crumpacker, John R.

    2009-01-01

    Approved for public release, distribution unlimited Password cracking requires significant processing power, which in today's world is located at a workstation or home in the form of a desktop computer. Berkeley Open Infrastructure for Network Computing (BOINC) is the conduit to this significant source of processing power and John the Ripper is the key. BOINC is a distributed data processing system that incorporates client-server relationships to generically process data. The BOINC structu...

  7. Characterization of drug-release kinetics in trabecular bone from titania nanotube implants

    Directory of Open Access Journals (Sweden)

    Aw MS

    2012-09-01

    vivo showed a consistent gradual release of model drug from the TNT–Ti implants, with a characteristic three-dimensional distribution into the surrounding bone, over a period of 5 days. The parameters including the flow rate of bone culture medium, differences in trabecular microarchitecture between bone samples, and mechanical loading were found to have the most significant influence on drug distribution in the bone.Conclusion: These results demonstrate the utility of the Zetos™ system for ex vivo drug-release studies in bone, which can be applied to optimize the delivery of specific therapies and to assist in the design of new drug delivery systems. This method has the potential to provide new knowledge to understand drug distribution in the bone environment and to considerably improve existing technologies for local administration in bone, including solving some critical problems in bone therapy and orthopedic implants.Keywords: local drug delivery, Zetos bone bioreactor, drug-releasing implant, drug diffusion

  8. Taking inventory on VOC releases from Amoco's Yorktown refinery

    International Nuclear Information System (INIS)

    Klee, H.H. Jr.; Schmitt, R.E.; Harrass, M.C.; Podar, M.K.

    1996-01-01

    Amoco's Yorktown, Virginia, refinery is a 35-year-old, 53,000 bbl/day facility that manufactures gasoline, heating oil, liquid petroleum gas, sulfur, and coke. In a cooperative and voluntary effort, Amoco Corporation and the US Environmental Protection Agency conducted a joint project to study pollution prevention opportunities at an operating industrial facility. Source reduction efforts--key to pollution prevention strategies--require knowledge of specific sources of releases. However, data on releases from individual process units are limited in favor of data to monitor existing end-of-pipe pollution control requirements. The study's sampling program sought to portray the distribution of releases within the refinery, their management within the refinery, and the ultimate releases leaving the refinery. Subsequent tests of blowdown stack and fugitive emissions further improved total release estimates. The initial study estimated that the refinery generates about 25,000 metric tons (t)/year of potential pollutants. Of these, about half are released from the refinery as airborne, waterborne, or land-disposed releases. Airborne releases comprise the majority of releases by mass, about 12,000 t/year. Most of the airborne releases are volatile organic compound hydrocarbons. The inventory sampling project and subsequent work identified differences with Toxic Release Inventory (TRI) values and standard emission factors (AP-42). The inventory and other data provided an opportunity to consider options for, and limitations of, specific pollution prevention or source reduction strategies.

  9. A consistent interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Omnes, Roland

    1990-01-01

    Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths' consistent histories. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation, but they can be proved never to give rise to any logical inconsistency or paradox. (author)

  10. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  11. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.

  12. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures at different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]; some of the developed tests were based on the quality tests proposed for VLE data. The relevance of enlarging the experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model.

  13. Consistency relation for cosmic magnetic fields

    DEFF Research Database (Denmark)

    Jain, R. K.; Sloth, M. S.

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  14. Consistent Estimation of Partition Markov Models

    Directory of Open Access Journals (Sweden)

    Jesús E. García

    2017-04-01

    The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain, and how to estimate these parameters. In order to answer these questions, we build a consistent strategy for model selection which consists of: given a size-n realization of the process, finding a model within the Partition Markov class with a minimal number of parts to represent the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian then, eventually, when n goes to infinity, L will be retrieved. We show an application to model internet navigation patterns.
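    The estimation strategy described in this record can be illustrated with a toy sketch: count empirical transition rows from a single realization, then group states whose rows are (nearly) equal. A simple total-variation threshold stands in here for the paper's consistent model-selection criterion, so this is an illustration of the idea, not the authors' algorithm.

```python
from collections import defaultdict

def transition_probs(seq, alphabet):
    """Empirical transition probabilities P(next | state) from one realization."""
    counts = {s: defaultdict(int) for s in alphabet}
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    probs = {}
    for s in alphabet:
        total = sum(counts[s].values())
        probs[s] = {a: (counts[s][a] / total if total else 0.0) for a in alphabet}
    return probs

def partition_states(probs, alphabet, tol=0.1):
    """Greedily merge states whose transition rows differ by < tol in total variation.
    (Heuristic stand-in for the paper's consistent selection criterion.)"""
    parts = []
    for s in alphabet:
        for part in parts:
            rep = part[0]
            tv = 0.5 * sum(abs(probs[s][a] - probs[rep][a]) for a in alphabet)
            if tv < tol:
                part.append(s)
                break
        else:
            parts.append([s])
    return parts
```

    On a sequence where states A and B share the same row (both always move to C), the two states fall into one part of the partition.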

  15. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a nonnormative and constitutive internal branding process; in particular, the paper places emphasis on the role and kinds of communication practices as a central part of this process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented: such communication creates the organizational conditions adequate to sustain them.

  16. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154Sm, of the first excited 2+ states of Sn isotopes, and of the first excited 3- states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy-weighted sum rule values, and in reducing B(Eλ) values. (author)

  17. Evaluating Temporal Consistency in Marine Biodiversity Hotspots

    OpenAIRE

    Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...

  18. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
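    The classifier this record analyzes is easy to state concretely. A minimal sketch, assuming subspaces through the origin obtained from the top right singular vectors of each class's data matrix; the paper's consistency analysis itself is not reproduced here.

```python
import numpy as np

def fit_nss(X_by_class, k):
    """Nearest Subspace classifier fit: for each class, an orthonormal basis
    (rows) of the best rank-k subspace through the origin, via the SVD."""
    return {c: np.linalg.svd(X, full_matrices=False)[2][:k]
            for c, X in X_by_class.items()}

def predict_nss(bases, x):
    """Assign x to the class whose subspace leaves the smallest residual."""
    def residual(B):
        # project x onto the subspace spanned by the rows of B, then measure
        # the distance from x to that projection
        return np.linalg.norm(x - B.T @ (B @ x))
    return min(bases, key=lambda c: residual(bases[c]))
```

    With one class sampled along the first axis and another along the second, a point near either axis is assigned to the corresponding class.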

  19. Presynaptic M1 muscarinic receptor modulates spontaneous release of acetylcholine from rat basal forebrain slices

    International Nuclear Information System (INIS)

    Suzuki, T.; Fujimoto, LK.; Oohata, H.; Kawashima, K.

    1988-01-01

    Spontaneous release of acetylcholine (ACh) from rat basal forebrain slices in the presence of a cholinesterase inhibitor was directly determined using a specific radioimmunoassay for ACh. The release was calcium dependent. A consistent amount of ACh release was observed throughout the experiment. Atropine (10⁻⁸ to 10⁻⁵ M) and pirenzepine (10⁻⁷ to 10⁻⁵ M) enhanced spontaneous ACh release. These findings indicate the presence of an M1 muscarinic autoreceptor that modulates spontaneous release of ACh in the rat forebrain.

  20. Decontamination for free release

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, K A; Elder, G R [Bradtec Ltd., Bristol (United Kingdom)

    1997-02-01

    Many countries are seeking to treat radioactive waste in ways which meet the local regulatory requirements, but yet are cost effective when all contributing factors are assessed. In some countries there are increasing amounts of waste, arising from nuclear plant decommissioning, which are categorized as low level waste; however, with suitable treatment a large part of such wastes might become beyond regulatory control and be able to be released as non-radioactive. The benefits and disadvantages of additional treatment before disposal need to be considered. Several processes falling within the overall description of decontamination for free release have been developed and applied, and these are outlined. In one instance the process seeks to take advantage of techniques and equipment used for decontaminating water reactor circuits intermittently through reactor life. (author). 9 refs, 1 fig., 3 tabs.

  1. Atmospheric Release Advisory Capability

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Gudiksen, P.H.; Sullivan, T.J.

    1983-02-01

    The Atmospheric Release Advisory Capability (ARAC) project is a Department of Energy (DOE) sponsored real-time emergency response service available for use by both federal and state agencies in case of a potential or actual atmospheric release of nuclear material. The project, initiated in 1972, is currently evolving from the research and development phase to full operation. Plans are underway to expand the existing capability to continuous operation by 1984 and to establish a National ARAC Center (NARAC) by 1988. This report describes the ARAC system, its utilization during the past two years, and plans for its expansion during the next five to six years. An integral part of this expansion is due to a very important and crucial effort sponsored by the Defense Nuclear Agency to extend the ARAC service to approximately 45 Department of Defense (DOD) sites throughout the continental US over the next three years

  2. Border cell release

    DEFF Research Database (Denmark)

    Mravec, Jozef

    2017-01-01

    Plant border cells are specialised cells derived from the root cap with roles in the biomechanics of root growth and in forming a barrier against pathogens. The mechanism of highly localised cell separation which is essential for their release to the environment is little understood. Here I present in situ analysis of Brachypodium distachyon, a model organism for grasses which possess type II primary cell walls poor in pectin content. Results suggest similarity in spatial dynamics of pectic homogalacturonan during dicot and monocot border cell release. Integration of observations from different species leads to the hypothesis that this process most likely does not involve degradation of cell wall material but rather employs unique cell wall structural and compositional means enabling both the rigidity of the root cap as well as detachability of given cells on its surface.

  3. Energy released in fission

    International Nuclear Information System (INIS)

    James, M.F.

    1969-05-01

    The effective energy released in and following the fission of U-235, Pu-239 and Pu-241 by thermal neutrons, and of U-238 by fission spectrum neutrons, is discussed. The recommended values are: U-235 ... 192.9 ± 0.5 MeV/fission; U-238 ... 193.9 ± 0.8 MeV/fission; Pu-239 ... 198.5 ± 0.8 MeV/fission; Pu-241 ... 200.3 ± 0.8 MeV/fission. These values include all contributions except from antineutrinos and very long-lived fission products. The detailed contributions are discussed, and inconsistencies in the experimental data are pointed out. In Appendix A, the contribution to the total useful energy release in a reactor from reactions other than fission are discussed briefly, and in Appendix B there is a discussion of the variations in effective energy from fission with incident neutron energy. (author)
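    For scale, the recommended values translate directly into fission rates per unit power. A quick arithmetic sketch (the nuclide values are copied from the record; the MeV-to-joule constant is the standard CODATA conversion):

```python
MEV_TO_J = 1.602176634e-13  # joules per MeV (CODATA)

# Recommended effective energy release per fission, from the record (MeV)
ENERGY_PER_FISSION_MEV = {
    "U-235": 192.9, "U-238": 193.9, "Pu-239": 198.5, "Pu-241": 200.3,
}

def fissions_per_watt(nuclide):
    """Fission rate (fissions/s) needed to sustain 1 W of thermal power."""
    return 1.0 / (ENERGY_PER_FISSION_MEV[nuclide] * MEV_TO_J)
```

    For U-235 this gives roughly 3.2 × 10¹⁰ fissions per second per watt; the plutonium isotopes, releasing slightly more energy per fission, need proportionally fewer.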

  4. Consistency relations in effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression from EFT to SPT results, which scales as the square of the wave number k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.

  5. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  6. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the Graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm in the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.

  7. Modelling transient energy release from molten fuel coolant interaction debris

    International Nuclear Information System (INIS)

    Fletcher, D.F.

    1984-05-01

    A simple model of transient energy release in a Molten Fuel Coolant Interaction is presented. A distributed heat transfer model is used to examine the effect of heat transfer coefficient, time available for rapid heat transfer, and particle size on transient energy release. The debris is assumed to have an Upper Limit Lognormal distribution. Model predictions are compared with results from the SUW series of experiments, which used thermite-generated uranium dioxide-molybdenum melts released below the surface of a pool of water. Uncertainties in the physical principles involved in the calculation of energy transfer rates are discussed. (author)
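    The role of particle size in a distributed heat transfer model can be illustrated with a lumped-capacitance sketch for spherical debris size classes. This is an illustration of the size-dependence only, not the report's model, and every parameter value below is hypothetical.

```python
import math

def energy_release(masses, weights, h, c_p, rho, T0, Tc, t):
    """Lumped-capacitance sketch: heat (J) released by time t from spherical
    debris size classes of mass `masses` (kg) with mass-fraction `weights`,
    heat transfer coefficient h (W/m^2/K), specific heat c_p (J/kg/K),
    density rho (kg/m^3), initial temperature T0 and coolant temperature Tc (K).
    Each class cools exponentially with time constant tau = m*c_p/(h*A)."""
    total = 0.0
    for m, w in zip(masses, weights):
        r = (3.0 * m / (4.0 * math.pi * rho)) ** (1.0 / 3.0)  # sphere radius
        area = 4.0 * math.pi * r * r
        tau = m * c_p / (h * area)
        total += w * m * c_p * (T0 - Tc) * (1.0 - math.exp(-t / tau))
    return total
```

    Small particles (large area-to-mass ratio) dump their heat early, so shifting the size distribution toward finer debris raises the transient energy release at short times, which is the sensitivity the report examines.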

  8. Slow-release fertilizer

    Science.gov (United States)

    Ming, Douglas W.; Golden, D. C.

    1992-10-01

    A synthetic apatite containing agronutrients and a method for making the apatite are disclosed. The apatite comprises crystalline calcium phosphate having agronutrients dispersed in the crystalline structure. The agronutrients can comprise potassium, magnesium, sulfur, iron, manganese, molybdenum, chlorine, boron, copper and zinc in amounts suited for plant growth. The apatite can optionally comprise a carbonate and/or silicon solubility control agent. The agronutrients are released slowly as the apatite dissolves.

  9. EIA new releases

    International Nuclear Information System (INIS)

    1994-12-01

    This report was prepared by the Energy Information Administration. It contains news releases on items of interest to the petroleum, coal, nuclear, electric and alternate fuels industries ranging from economic outlooks to environmental concerns. There is also a listing of reports by industry and an energy education resource listing containing sources for free or low-cost energy-related educational materials for educators and primary and secondary students

  10. Atmospheric release advisory capability

    International Nuclear Information System (INIS)

    Sullivan, T.J.

    1981-01-01

    The ARAC system (Atmospheric Release Advisory Capability) is described. The system is a collection of people, computers, computer models, topographic data and meteorological input data that together permits a calculation of, in a quasi-predictive sense, where effluent from an accident will migrate through the atmosphere, where it will be deposited on the ground, and what instantaneous and integrated dose an exposed individual would receive

  11. Slow-release fertilizer

    Science.gov (United States)

    Ming, Douglas W. (Inventor); Golden, Dadigamuwage C. (Inventor)

    1995-01-01

    A synthetic apatite containing agronutrients and a method for making the apatite are disclosed. The apatite comprises crystalline calcium phosphate having agronutrients dispersed in the crystalline structure. The agronutrients can comprise potassium, magnesium, sulfur, iron, manganese, molybdenum, chlorine, boron, copper and zinc in amounts suited for plant growth. The apatite can optionally comprise a carbonate and/or silicon solubility control agent. The agronutrients are released slowly as the apatite dissolves.

  12. Secondary electron emission and self-consistent charge transport in semi-insulating samples

    Energy Technology Data Exchange (ETDEWEB)

    Fitting, H.-J. [Institute of Physics, University of Rostock, Universitaetsplatz 3, D-18051 Rostock (Germany); Touzin, M. [Unite Materiaux et Transformations, UMR CNRS 8207, Universite de Lille 1, F-59655 Villeneuve d' Ascq (France)

    2011-08-15

    Electron beam induced self-consistent charge transport and secondary electron emission (SEE) in insulators are described by means of an electron-hole flight-drift model (FDM), now extended by a certain intrinsic conductivity (c), and are implemented by an iterative computer simulation. Ballistic secondary electrons (SE) and holes, their attenuation to drifting charge carriers, and their recombination, trapping, and field- and temperature-dependent detrapping are included. As a main result, the time-dependent "true" secondary electron emission rate δ(t) released from the target material and based on ballistic electrons, and the spatial distributions of currents j(x,t), charges ρ(x,t), field F(x,t), and potential V(x,t) are obtained, where V₀ = V(0,t) presents the surface potential. The intrinsic electronic conductivity limits the charging process and leads to a conduction sample current to the support. In that case the steady-state total SE yield will be fixed below unity: i.e., σ = η + δ < 1.

  13. Contact: Releasing the news

    Science.gov (United States)

    Pinotti, Roberto

    The problem of mass behavior after man's future contacts with other intelligences in the universe is not only a challenge for social scientists and political leaders all over the world, but also a cultural time bomb as well. In fact, since the impact of CETI (Contact with Extraterrestrial Intelligence) on human civilization, with its different cultures, might cause a serious socio-anthropological shock, a common and predetermined worldwide strategy is necessary in releasing the news after the contact, in order to keep possible manifestations of fear, panic and hysteria under control. An analysis of past studies in this field and of parallel historical situations as analogs suggests a definite "authority crisis" in the public as a direct consequence of an unexpected release of the news, involving a devastating "chain reaction" process (from both the psychological and sociological viewpoints) of anomie and maybe the collapse of today's society. The only way to prevent all this is to prepare the world's public opinion concerning contact before releasing the news, and to develop a long-term strategy through the combined efforts of scientists, political leaders, intelligence agencies and the mass media, in order to create the cultural conditions in which a confrontation with ETI won't affect mankind in a traumatic way. Definite roles and tasks in this multi-level model are suggested.

  14. Consistency of color representation in smart phones.

    Science.gov (United States)

    Dain, Stephen J; Kwan, Benjamin; Wong, Leslie

    2016-03-01

    One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated, and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers, especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone 5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones have white-LED-backlit LCDs and the Samsungs have OLED displays. The color gamut varies between models, and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4s and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone 4s and ±0.002 for the others, although the spread of white points between models was u'v' ±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirrors the variation in the primaries. The variation in
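    The u'v' coordinates quoted in this record come from the standard CIE 1976 uniform chromaticity transform of tristimulus values; a minimal helper:

```python
def uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity from tristimulus values XYZ
    (standard formulas: u' = 4X/(X+15Y+3Z), v' = 9Y/(X+15Y+3Z))."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d
```

    For example, a D65-like white (XYZ ≈ 95.047, 100.0, 108.883) lands near u' ≈ 0.198, v' ≈ 0.468, so the ±0.002 to ±0.008 spreads reported above are small but not negligible on this scale.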

  15. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.

  16. Relative Release-to-Birth Indicators for Investigating TRISO Fuel Fission Gas Release Models

    International Nuclear Information System (INIS)

    Harp, Jason M.; Hawari, Ayman I.

    2008-01-01

    TRISO microsphere fuel is the fundamental fuel unit for Very High Temperature Reactors (VHTR). A single TRISO particle consists of an inner kernel of uranium dioxide or uranium oxycarbide surrounded by layers of pyrolytic carbon and silicon carbide. If the silicon carbide layer fails, fission products, especially the noble fission gases Kr and Xe, will begin to escape the failed particle. The release of fission gas is usually quantified by measuring the ratio of the released activity (R) to the original birth activity (B), which is designated as the R/B ratio. In this work, relative Release-to-Birth indicators (I) are proposed as a technique for interpreting the results of TRISO irradiation experiments. By implementing a relative metric, it is possible to reduce the sensitivity of the indicators to instrumental uncertainties and variations in experimental conditions. As an example, relative R/B indicators are applied to the interpretation of representative data from the Advanced Gas Reactor-1 TRISO fuel experiment that is currently taking place at the Advanced Test Reactor of Idaho National Laboratory. It is shown that the comparison of measured to predicted relative R/B indicators (I) gives insight into the physics of release and helps validate release models. Different trends displayed by the indicators are related to the mechanisms of fission gas release such as diffusion and recoil. The current analysis shows evidence for separate diffusion coefficients for Kr and Xe and supports the need to account for recoil release. (authors)
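    One plausible reading of the relative indicator described above is a ratio of R/B values, so that multiplicative factors common to both measurements cancel; the paper's exact definition of I may differ, and the numbers below are purely illustrative.

```python
def r_over_b(released_activity, birth_activity):
    """R/B: released activity relative to the activity born in the fuel."""
    return released_activity / birth_activity

def relative_indicator(rb_isotope, rb_reference):
    """Relative R/B indicator: instrumental or environmental factors that
    multiply both R/B values cancel in the ratio, reducing sensitivity to
    measurement uncertainties (a hedged reading of the record)."""
    return rb_isotope / rb_reference
```

    The cancellation is the point: rescaling both released activities by the same calibration factor leaves the indicator unchanged.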

  17. Radionuclide releases from natural analogues of spent nuclear fuel

    International Nuclear Information System (INIS)

    Curtis, D.B.; Fabryka-Martin, J.; Dixon, P.; Aguilar, R.; Rokop, D.; Cramer, J.

    1993-01-01

    Measurements of 99Tc, 129I, 239Pu and U concentrations in rock samples from uranium deposits at Cigar Lake and Koongarra have been used to study processes of radionuclide release from uranium minerals. Rates of release have been immeasurably slow at Cigar Lake. At Koongarra, release rates appear to have been faster, producing small deficiencies of 99Tc, and larger ones of 129I. The inferred differences in radionuclide release rates are consistent with expected differences in uranium mineral degradation rates produced by the differing hydrogeochemical environments at the two sites.

  18. Integrated environmental modeling system for noble gas releases at the Savannah River Plant

    International Nuclear Information System (INIS)

    Cooper, R.E.

    1973-01-01

    The Savannah River Plant (SRP) is a large nuclear complex engaged in varied activities and is the AEC's major site for the production of weapons material. As a result of these activities, there are continuous and intermittent releases of radioactive gases to the atmosphere. Of these releases, the noble gases constitute about 11 percent of the total man-rem exposure to the population out to a distance of 100 km. Although SRP has an extensive radiological monitoring program, an environmental modeling system is necessary for adequately estimating effects on the environment. The integrated environmental modeling system in use at SRP consists of a series of computer programs that generate and use a library of environmental effects data as a function of azimuth and distance. Annual average atmospheric dispersion and azimuthal distribution of material assumed to be released as unit sources is estimated from a 2-year meteorological data base--assuming an arbitrary point of origin. The basic library of data consists of: ground-level concentrations according to isotope, and whole body gamma dose calculations that account for the total spatial distribution at discrete energy levels. These data are normalized to tritium measurements, and are subsequently used to generate similar library data that pertain to specific source locations, but always with respect to the same population grid. Thus, the total additive effects from all source points, both on- and off-site, can be estimated. The final program uses the library data to estimate population exposures for specified releases and source points for the nuclides of interest (including noble gases). Multiple source points are considered within a single pass to obtain the integrated effects from all sources

  19. Protecting privacy in data release

    CERN Document Server

    Livraga, Giovanni

    2015-01-01

    This book presents a comprehensive approach to protecting sensitive information when large data collections are released by their owners. It addresses three key requirements of data privacy: the protection of data explicitly released, the protection of information not explicitly released but potentially vulnerable due to a release of other data, and the enforcement of owner-defined access restrictions to the released data. It is also the first book with a complete examination of how to enforce dynamic read and write access authorizations on released data, applicable to emerging data outsourcing.

  20. Triggered Release from Polymer Capsules

    Energy Technology Data Exchange (ETDEWEB)

    Esser-Kahn, Aaron P. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry; Odom, Susan A. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry; Sottos, Nancy R. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Materials Science and Engineering; White, Scott R. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Aerospace Engineering; Moore, Jeffrey S. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry

    2011-07-06

    Stimuli-responsive capsules are of interest in drug delivery, fragrance release, food preservation, and self-healing materials. Many methods are used to trigger the release of encapsulated contents. Here we highlight mechanisms for the controlled release of encapsulated cargo that utilize chemical reactions occurring in solid polymeric shell walls. Triggering mechanisms responsible for covalent bond cleavage that result in the release of capsule contents include chemical, biological, light, thermal, magnetic, and electrical stimuli. We present methods for encapsulation and release, triggering methods, and mechanisms and conclude with our opinions on interesting obstacles for chemically induced activation with relevance for controlled release.

  1. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.

  2. Offsite doses from SRP radioactive releases - 1985

    International Nuclear Information System (INIS)

    Marter, W.L.

    1986-01-01

    This memorandum summarizes the offsite doses from releases of radioactive materials to the environment from SRP operations in 1985. These doses were calculated for inclusion in the environmental report for 1985 to be issued by the Health Protection Department (DPSPU-86-30-1). The environmental report is prepared annually for distribution to state environmental agencies, the news media, and interested members of the public. More detailed data on offsite exposures by radionuclide and exposure pathway will be included in the environmental report

  3. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in the Business Process Execution Language (BPEL) and the Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts...... are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault...

  4. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned which may approach this degree of precision. (orig.)

  5. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  6. Two consistent calculations of the Weinberg angle

    International Nuclear Information System (INIS)

    Fairlie, D.B.

    1979-01-01

    The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea borrowed from monopole theory that the electromagnetic field is in the direction of the Higgs field. (Author)

  7. Self-consistent Langmuir waves in resonantly driven thermal plasmas

    Science.gov (United States)

    Lindberg, R. R.; Charman, A. E.; Wurtele, J. S.

    2007-12-01

    The longitudinal dynamics of a resonantly driven Langmuir wave are analyzed in the limit that the growth of the electrostatic wave is slow compared to the bounce frequency. Using simple physical arguments, the nonlinear distribution function is shown to be nearly invariant in the canonical particle action, provided both a spatially uniform term and higher-order spatial harmonics are included along with the fundamental in the longitudinal electric field. Requirements of self-consistency with the electrostatic potential yield the basic properties of the nonlinear distribution function, including a frequency shift that agrees closely with driven, electrostatic particle simulations over a range of temperatures. This extends earlier work on nonlinear Langmuir waves by Morales and O'Neil [G. J. Morales and T. M. O'Neil, Phys. Rev. Lett. 28, 417 (1972)] and Dewar [R. L. Dewar, Phys. Fluids 15, 712 (1972)], and could form the basis of a reduced kinetic treatment of plasma dynamics for accelerator applications or Raman backscatter.

  8. Self-consistent Langmuir waves in resonantly driven thermal plasmas

    International Nuclear Information System (INIS)

    Lindberg, R. R.; Charman, A. E.; Wurtele, J. S.

    2007-01-01

    The longitudinal dynamics of a resonantly driven Langmuir wave are analyzed in the limit that the growth of the electrostatic wave is slow compared to the bounce frequency. Using simple physical arguments, the nonlinear distribution function is shown to be nearly invariant in the canonical particle action, provided both a spatially uniform term and higher-order spatial harmonics are included along with the fundamental in the longitudinal electric field. Requirements of self-consistency with the electrostatic potential yield the basic properties of the nonlinear distribution function, including a frequency shift that agrees closely with driven, electrostatic particle simulations over a range of temperatures. This extends earlier work on nonlinear Langmuir waves by Morales and O'Neil [G. J. Morales and T. M. O'Neil, Phys. Rev. Lett. 28, 417 (1972)] and Dewar [R. L. Dewar, Phys. Fluids 15, 712 (1972)], and could form the basis of a reduced kinetic treatment of plasma dynamics for accelerator applications or Raman backscatter.

  9. Coagulation of Agglomerates Consisting of Polydisperse Primary Particles.

    Science.gov (United States)

    Goudeli, E; Eggersdorfer, M L; Pratsinis, S E

    2016-09-13

    The ballistic agglomeration of polydisperse particles is investigated by an event-driven (ED) method and compared to the coagulation of spherical particles and agglomerates consisting of monodisperse primary particles (PPs). It is shown for the first time to our knowledge that increasing the width or polydispersity of the PP size distribution initially accelerates the coagulation rate of their agglomerates but delays the attainment of their asymptotic fractal-like structure and self-preserving size distribution (SPSD) without altering them, provided that sufficiently large numbers of PPs are employed. For example, the standard asymptotic mass fractal dimension, Df, of 1.91 is attained when clusters are formed containing, on average, about 15 monodisperse PPs, consistent with fractal theory and the literature. In contrast, when polydisperse PPs with a geometric standard deviation of 3 are employed, about 500 PPs are needed to attain that Df. Even though the same asymptotic Df and mass-mobility exponent, Dfm, are attained regardless of PP polydispersity, the asymptotic prefactors or lacunarities of Df and Dfm increase with PP polydispersity. For monodisperse PPs, the average agglomerate radius of gyration, rg, becomes larger than the mobility radius, rm, when agglomerates consist of more than 15 PPs. Increasing PP polydispersity increases that number of PPs similarly to the above for the attainment of the asymptotic Df or Dfm. The agglomeration kinetics are quantified by the overall collision frequency function. When the SPSD is attained, the collision frequency is independent of PP polydispersity. Accounting for the SPSD polydispersity in the overall agglomerate collision frequency is in good agreement with that frequency from detailed ED simulations once the SPSD is reached. 
Most importantly, the coagulation of agglomerates is described well by a monodisperse model for agglomerate and PP sizes, whereas the detailed agglomerate size distribution can be obtained by
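The fractal-like scaling underlying the record above, N = kf·(rg/rp)^Df, can be recovered from agglomerate data by a log-log fit. The sketch below uses synthetic, noise-free data; all numbers are illustrative, not the paper's simulation results:

```python
import numpy as np

# Synthetic agglomerates obeying N = kf * (rg/rp)**Df with Df = 1.91
Df_true, kf_true, rp = 1.91, 1.3, 1.0
rg = np.logspace(0.5, 2.0, 30)          # radii of gyration
N = kf_true * (rg / rp) ** Df_true      # number of primary particles

# Fit log N = log kf + Df * log(rg/rp) to recover Df and the prefactor
Df_fit, log_kf = np.polyfit(np.log(rg / rp), np.log(N), 1)
print(Df_fit, np.exp(log_kf))
```

With noisy simulation or measurement data the same regression yields the asymptotic mass fractal dimension discussed in the abstract.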

  10. Bootstrap consistency for general semiparametric M-estimation

    KAUST Repository

    Cheng, Guang

    2010-10-01

    Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
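The core claim, that the bootstrap distribution imitates the sampling distribution of an estimator, can be illustrated with a plain nonparametric bootstrap of a sample mean. This is a toy setting, not the semiparametric M-estimation studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=200)   # observed sample

# Nonparametric bootstrap of the sample mean: resample with replacement
B = 2000
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(B)])

# The bootstrap spread should track the plug-in standard error s/sqrt(n)
se_boot = boot_means.std(ddof=1)
se_plug = x.std(ddof=1) / np.sqrt(x.size)
ci = np.percentile(boot_means, [2.5, 97.5])    # percentile confidence set
print(se_boot, se_plug, ci)
```

The agreement of `se_boot` with `se_plug` is the finite-sample analogue of the consistency result proved in the paper.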

  11. Power release estimation inside of fuel pins neighbouring fuel pin with gadolinium in a WWER-1000 type core

    International Nuclear Information System (INIS)

    Mikus, J.

    2006-01-01

    The purpose of this work is to investigate the influence of gadolinium-bearing fuel pins (FPs) on the space power distribution, especially the values and gradients arising inside neighbouring FPs, which could result in static loads with consequences such as FP bowing. Since detailed power distributions cannot be obtained in NPPs, the needed information is provided by experiments on research reactors. Power release inside FPs is usually measured with special (e.g. track) detectors placed between fuel pellets. Since such work is relatively complicated and time consuming, an evaluation method based on mathematical modelling and numerical approximation was proposed; using the measured (integral) power release in selected FPs, it provides information about the power release inside an investigated FP. For this purpose, an experiment was performed on the light-water, zero-power research reactor LR-0 in a WWER-1000 type core with 7 fuel assemblies at zero boron concentration, containing gadolinium FPs. Application of the evaluation method is demonstrated on an FP neighbouring a gadolinium FP by means of 1) the azimuthal power distribution on the fuel pellet surface of the investigated FP in the horizontal plane, and 2) the gradient of the power distribution inside the investigated FP at two opposite positions on the pellet surface, situated towards and away from the gadolinium FP. Such information can be relevant for investigating the occurrence of FP failures. (Authors)

  12. Helium release from radioisotope heat sources

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, D.E.; Early, J.W.; Starzynski, J.S.; Land, C.C.

    1984-05-01

    Diffusion of helium in 238PuO2 fuel was characterized as a function of the heating rate and the fuel microstructure. The samples were thermally ramped in an induction furnace and the helium release rates measured with an automated mass spectrometer. The diffusion constants and activation energies were obtained from the data using a simple diffusion model. The release rates of helium were correlated with the fuel microstructure by metallographic examination of fuel samples. The release mechanism consists of four regimes, which are dependent upon the temperature. Initially, the release is controlled by movement of point defects combined with trapping along grain boundaries. This regime is followed by a process dominated by formation and growth of helium bubbles along grain boundaries. The third regime involves volume diffusion controlled by movement of oxygen vacancies. Finally, the release at the highest temperatures follows the diffusion rate of intragranular bubbles. The tendency for helium to be trapped within the grain boundaries diminishes with small grain sizes, slow thermal pulses, and older fuel.
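The extraction of diffusion constants and activation energies from thermal-ramp data implies an Arrhenius form D = D0·exp(-Ea/RT), which can be fitted as a straight line of ln D against 1/T. The numbers below are synthetic and purely illustrative, not the paper's measurements:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)
Ea_true, D0_true = 2.0e5, 1.0e-6   # illustrative activation energy, prefactor

T = np.linspace(800.0, 1400.0, 12)            # ramp temperatures, K
D = D0_true * np.exp(-Ea_true / (R * T))      # "measured" diffusivities

# ln D = ln D0 - (Ea/R) * (1/T): the slope gives the activation energy
slope, intercept = np.polyfit(1.0 / T, np.log(D), 1)
Ea_fit, D0_fit = -slope * R, np.exp(intercept)
print(Ea_fit, D0_fit)
```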

  13. Validation of software releases for CMS

    International Nuclear Information System (INIS)

    Gutsche, Oliver

    2010-01-01

    The CMS software stack currently consists of more than 2 million lines of code developed by over 250 authors, with a new version being released every week. CMS has set up a validation process for quality assurance which enables developers to compare the performance of a release to previous releases and references. The validation process provides the developers with reconstructed datasets of real data and MC samples. The samples span the whole range of detector effects and important physics signatures to benchmark the performance of the software. They are used to investigate interdependency effects of all CMS software components and to find and fix bugs. The release validation process described here is an integral part of CMS software development and contributes significantly to ensuring stable production and analysis. It represents a sizable contribution to the overall MC production of CMS. Its success emphasizes the importance of a streamlined release validation process for projects with a large code base and a significant number of developers, and it can serve as a model for future projects.

  14. Helium release from radioisotope heat sources

    International Nuclear Information System (INIS)

    Peterson, D.E.; Early, J.W.; Starzynski, J.S.; Land, C.C.

    1984-05-01

    Diffusion of helium in 238PuO2 fuel was characterized as a function of the heating rate and the fuel microstructure. The samples were thermally ramped in an induction furnace and the helium release rates measured with an automated mass spectrometer. The diffusion constants and activation energies were obtained from the data using a simple diffusion model. The release rates of helium were correlated with the fuel microstructure by metallographic examination of fuel samples. The release mechanism consists of four regimes, which are dependent upon the temperature. Initially, the release is controlled by movement of point defects combined with trapping along grain boundaries. This regime is followed by a process dominated by formation and growth of helium bubbles along grain boundaries. The third regime involves volume diffusion controlled by movement of oxygen vacancies. Finally, the release at the highest temperatures follows the diffusion rate of intragranular bubbles. The tendency for helium to be trapped within the grain boundaries diminishes with small grain sizes, slow thermal pulses, and older fuel.

  15. A model for cytoplasmic rheology consistent with magnetic twisting cytometry.

    Science.gov (United States)

    Butler, J P; Kelly, S M

    1998-01-01

    Magnetic twisting cytometry is gaining wide applicability as a tool for the investigation of the rheological properties of cells and the mechanical properties of receptor-cytoskeletal interactions. Current technology involves the application and release of magnetically induced torques on small magnetic particles bound to or inside cells, with measurements of the resulting angular rotation of the particles. The properties of purely elastic or purely viscous materials can be determined by the angular strain and strain rate, respectively. However, the cytoskeleton and its linkage to cell surface receptors display elastic, viscous, and even plastic deformation, and the simultaneous characterization of these properties using only elastic or viscous models is internally inconsistent. Data interpretation is complicated by the fact that in current technology, the applied torques are not constant in time, but decrease as the particles rotate. This paper describes an internally consistent model consisting of a parallel viscoelastic element in series with a parallel viscoelastic element, and one approach to quantitative parameter evaluation. The unified model reproduces all essential features seen in data obtained from a wide variety of cell populations, and contains the pure elastic, viscoelastic, and viscous cases as subsets.
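A series arrangement of two parallel spring-dashpot (Kelvin-Voigt) viscoelastic elements, as described above, has a closed-form creep response under constant stress. The sketch below uses illustrative parameters, not values fitted to cytometry data:

```python
import numpy as np

# Creep of two Kelvin-Voigt elements (spring k parallel to dashpot c)
# connected in series, under constant applied stress sigma0.
sigma0 = 1.0
k1, c1 = 2.0, 0.5      # stiff, fast-relaxing element
k2, c2 = 0.5, 5.0      # soft, slow-relaxing element

t = np.linspace(0.0, 30.0, 300)
# Strains add in series; each element relaxes with time constant c_i/k_i
strain = sigma0 / k1 * (1.0 - np.exp(-k1 * t / c1)) \
       + sigma0 / k2 * (1.0 - np.exp(-k2 * t / c2))

# Long-time strain approaches sigma0 * (1/k1 + 1/k2), the relaxed compliance
print(strain[-1])
```

The two time constants produce the fast initial rotation followed by slow creep that such cytometry data typically show.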

  16. Consistent resolution of some relativistic quantum paradoxes

    International Nuclear Information System (INIS)

    Griffiths, Robert B.

    2002-01-01

    A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separated regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics.

  17. Self-consistent model of confinement

    International Nuclear Information System (INIS)

    Swift, A.R.

    1988-01-01

    A model of the large-spatial-distance, zero-three-momentum limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency

  18. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior can lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustained if there is no guarantee that the optimality principle agreed upon at the outset is maintained throughout the duration of the cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last to the end, or even to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this “classic” problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation, covering up-to-date, state-of-the-art analyses of this important topic. It sets out to provide the theory, solution tec...

  19. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    Physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge prescribe different procedures depending on whether a sample is liquid or solid. Three physical behaviours (liquid, paste-like and solid) can be observed in sludges, so there is growing interest in developing analytical procedures that define the boundary between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity). Several devices can be used to evaluate flowability and solidity, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate a simple extrusion procedure for flowability measurements and a Vicat needle for solidity measurements. (author)

  20. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Background: Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results: Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we found in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability, and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions: The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.

  1. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimates of low-probability flood events are frequently used in infrastructure planning and for dimensioning flood protection measures. There are several well-established methodical procedures for estimating low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used to assess potential strengths and weaknesses of each method, as well as to evaluate the consistency of the methods.
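The purely statistical approach can be illustrated with a Gumbel (EV1) fit to annual maxima by the method of moments; the SCHADEX and PMF approaches are not reproduced here, and the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=300.0, scale=80.0, size=60)   # synthetic m^3/s

# Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi, loc = mean - gamma*scale
gamma = 0.5772156649  # Euler-Mascheroni constant
scale = annual_max.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = annual_max.mean() - gamma * scale

def return_level(T):
    """Discharge exceeded on average once every T years."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

print(return_level(100.0))   # 100-year flood estimate
```

Comparing such return levels against stochastic-simulation and PMF estimates is the kind of consistency check the study performs.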

  2. Self-consistent equilibria in the pulsar magnetosphere

    International Nuclear Information System (INIS)

    Endean, V.G.

    1976-01-01

    For a 'collisionless' pulsar magnetosphere the self-consistent equilibrium particle distribution functions are functions of the constants of the motion only. Reasons are given for concluding that to a good approximation they will be functions of the rotating frame Hamiltonian only. This is shown to result in a rigid rotation of the plasma, which therefore becomes trapped inside the velocity of light cylinder. The self-consistent field equations are derived, and a method of solving them is illustrated. The axial component of the magnetic field decays to zero at the plasma boundary. In practice, some streaming of particles into the wind zone may occur as a second-order effect. Acceleration of such particles to very high energies is expected when they approach the velocity of light cylinder, but they cannot be accelerated to very high energies near the star. (author)

  3. Self-consistent potential variations in magnetic wells

    International Nuclear Information System (INIS)

    Kesner, J.; Knorr, G.; Nicholson, D.R.

    1981-01-01

    Self-consistent electrostatic potential variations are considered in a spatial region of weak magnetic field, as in the proposed tandem mirror thermal barriers (with no trapped ions). For some conditions, equivalent to ion distributions with a sufficiently high net drift speed along the magnetic field, the desired potential depressions are found. When the net drift speed is not high enough, potential depressions are found only in combination with strong electric fields on the boundaries of the system. These potential depressions are not directly related to the magnetic field depression. (author)

  4. Poisson solvers for self-consistent multi-particle simulations

    International Nuclear Information System (INIS)

    Qiang, J; Paret, S

    2014-01-01

    Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space-charge effects in high-intensity beams. The Poisson equation has to be solved at each time step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of those numerical methods will be O(N log(N)) or O(N) instead of O(N^2), where N is the total number of grid points used to solve the Poisson equation.
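
    As a minimal illustration of an O(N log(N)) solver of the kind reviewed here, the sketch below solves the 1D periodic Poisson equation u'' = f with the FFT and checks it against an exact solution. This is a generic textbook spectral method, not the specific solvers of the paper.

```python
import numpy as np

def poisson_periodic_1d(f, L=2 * np.pi):
    """Solve u'' = f on a periodic domain in O(N log N) using the FFT.

    Assumes f has zero mean (the solvability condition for periodic BCs);
    the returned solution is normalized to zero mean.
    """
    n = f.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
    fhat = np.fft.fft(f)
    uhat = np.zeros_like(fhat)
    nonzero = k != 0
    uhat[nonzero] = -fhat[nonzero] / k[nonzero] ** 2
    return np.fft.ifft(uhat).real

# Check against an exact solution: u = sin(x) satisfies u'' = -sin(x)
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = poisson_periodic_1d(-np.sin(x))
```

    In a beam-dynamics code the same transform-based idea is applied in 2D or 3D to the deposited charge density at every time step, which is why the N log(N) scaling matters.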

  5. Synthesis, characterization and drug release properties of 3D chitosan/clinoptilolite biocomposite cryogels.

    Science.gov (United States)

    Dinu, Maria Valentina; Cocarta, Ana Irina; Dragan, Ecaterina Stela

    2016-11-20

    Three-dimensional (3D) biocomposites based on chitosan (CS) and clinoptilolite (CPL) were prepared by cryogelation and their potential application as drug carriers was investigated. Variation of CPL content from 0 to 33 wt.% allowed the formation of biocomposites with heterogeneous morphologies consisting of randomly distributed pores. The further increase of CPL content led to ordered porous architectures where parallel pore channels were observed. The CPL content had a strong influence on water uptake, as well as on the cumulative release of diclofenac sodium (DS) and indomethacin (IDM). It was demonstrated that the drug delivery preferentially takes place in phosphate buffer saline (pH 7.4) in comparison to simulated gastric fluid (pH 1.2), where only a reduced drug release was observed. The drug release mechanism dominating these systems is described as a pseudo-Fickian diffusion, but it changes to non-Fickian release when 33 wt.% of CPL was entrapped into the CS matrix or when IDM was loaded into biocomposites. Copyright © 2016 Elsevier Ltd. All rights reserved.
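
    The Fickian/non-Fickian classification used above is commonly made by fitting the semi-empirical power law Mt/M∞ = k·t^n, where n ≈ 0.5 indicates Fickian diffusion from a slab and larger n indicates anomalous (non-Fickian) transport. The sketch below recovers (k, n) by log-log least squares from synthetic data; the numbers are illustrative, not the paper's measurements.

```python
import math

def fit_power_law(times, fractions):
    """Least-squares fit of log(Mt/Minf) = log(k) + n*log(t).

    Returns (k, n); n near 0.5 suggests Fickian diffusion,
    larger n suggests anomalous (non-Fickian) transport.
    """
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    n = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    k = math.exp(ybar - n * xbar)
    return k, n

# Synthetic release data generated with n = 0.5 (pseudo-Fickian behaviour)
times = [1, 2, 4, 8, 16]                      # hours
fractions = [0.10 * t ** 0.5 for t in times]  # cumulative Mt/Minf
k, n = fit_power_law(times, fractions)
```

    The fit is only valid for the early part of the release curve (commonly Mt/M∞ < 0.6), which is the regime the power law is meant to describe.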

  6. A fission gas release model for MOX fuel and its verification

    International Nuclear Information System (INIS)

    Koo, Y.H.; Sohn, D.S.; Strijov, P.

    2000-01-01

    A fission gas release model for MOX fuel has been developed based on a model for UO2 fuel. Using the concept of equivalent cell, the model considers the uneven distribution of Pu within the fuel matrix and a number of Pu-rich particles that could lead to a non-uniform fission rate and fission gas distribution across the fuel pellet. The model has been incorporated into a code, COSMOS, and some parametric studies were made to analyze the effect of the size and Pu content of Pu-rich agglomerates. The model was then applied to the experimental data obtained from the FIGARO program, which consisted of the base irradiation of MOX fuels in the BEZNAU-1 PWR and the subsequent irradiation of four refabricated fuel segments in the Halden reactor. The calculated gas releases show good agreement with the measured ones. In addition, the present analysis indicates that the microstructure of the MOX fuel used in the FIGARO program is such that it has produced little difference in terms of gas release compared with UO2 fuel. (author)

  7. Request for approval, vented container annual release fraction

    International Nuclear Information System (INIS)

    HILL, J.S.

    1999-01-01

    In accordance with the approval conditions for Modification to the Central Waste Complex (CWC) Radioactive Air Emissions Notice of Construction (NOC), dated August 24, 1998, a new release fraction has been developed for submittal to the Washington State Department of Health (WDOH). An annual release fraction of 2.50E-14 is proposed for use in future NOCs involving the storage and handling operations associated with vented containers on the Hanford Site. The proposed annual release fraction was the largest release fraction calculated from alpha measurements of the NucFil filters from 10 vented containers consisting of nine 55-gallon drums and one burial box with dimensions of 9.3 x 5.7 x 6.4 feet. An annual release fraction of 2.0E-09 was used in the modification to the CWC radioactive air emissions NOC. This study confirmed that the release fraction used in the CWC radioactive air emissions NOC was conservative
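
    The conservatism check described here reduces to simple arithmetic: a release fraction is released activity divided by container inventory, and a permit value is conservative when it bounds the largest measured fraction. The container figures below are hypothetical illustrations, not the Hanford measurements.

```python
# Hypothetical per-container data: (released activity, inventory), same units.
# These figures are illustrative only, not the NucFil filter measurements.
containers = [
    (1.2e-12, 80.0),
    (4.0e-13, 25.0),
    (2.0e-12, 95.0),
]

# Annual release fraction per container, and the bounding (largest) value
fractions = [released / inventory for released, inventory in containers]
bounding = max(fractions)

# A permit value is conservative if it is at least the bounding measurement
noc_fraction = 2.0e-9
conservative = noc_fraction >= bounding
```

    Taking the maximum over all measured containers, rather than the mean, is what makes the resulting fraction a bounding value for the whole storage operation.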

  8. Request for approval, vented container annual release fraction; FINAL

    International Nuclear Information System (INIS)

    HILL, J.S.

    1999-01-01

    In accordance with the approval conditions for Modification to the Central Waste Complex (CWC) Radioactive Air Emissions Notice of Construction (NOC), dated August 24, 1998, a new release fraction has been developed for submittal to the Washington State Department of Health (WDOH). An annual release fraction of 2.50E-14 is proposed for use in future NOCs involving the storage and handling operations associated with vented containers on the Hanford Site. The proposed annual release fraction was the largest release fraction calculated from alpha measurements of the NucFil filters from 10 vented containers consisting of nine 55-gallon drums and one burial box with dimensions of 9.3 x 5.7 x 6.4 feet. An annual release fraction of 2.0E-09 was used in the modification to the CWC radioactive air emissions NOC. This study confirmed that the release fraction used in the CWC radioactive air emissions NOC was conservative

  9. Allegheny County Toxics Release Inventory

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — The Toxics Release Inventory (TRI) data provides information about toxic substances released into the environment or managed through recycling, energy recovery, and...

  10. Post-release monitoring of Antillean manatees: an assessment of the Brazilian rehabilitation and release programme

    Science.gov (United States)

    Normande, Iran C.; Malhado, Ana C. M.; Reid, James P.; Viana Junior, P. C.; Savaget, P. V. S.; Correia, R. A.; Luna, F. O.; Ladle, R. J.

    2016-01-01

    Mammalian reintroduction programmes frequently aim to reconnect isolated sub-populations and restore population viability. However, these long-term objectives are rarely evaluated due to the inadequacy of post-release monitoring. Here, we report the results of a unique long-term telemetry-based monitoring programme for rehabilitated Antillean manatees (Trichechus manatus manatus) reintroduced into selected sites in northeast Brazil with the aim of reconnecting isolated relict populations. Twenty-one satellite-tagged rehabilitated manatees, 13 males and 8 females, were released into the wild from two sites between November 2008 and June 2013. Individual accumulation curves were plotted and home ranges were calculated through the fixed kernel method using 95% of the utilization distribution. The number and size of the Centres of Activity (COAs) were calculated using 50% of the utilization distribution. Manatees displayed a dichotomous pattern of movement, with individuals either characterized by sedentary habits or by much more extensive movements. Moreover, home range size was not significantly influenced by gender, age at release or release site. COAs were strongly associated with sheltered conditions within reefs and estuaries, and also by the presence of freshwater and feeding sites. Our data confirm that manatee reintroductions in Brazil have the potential to reconnect distant sub-populations. However, pre-release identification of potential long-distance migrants is currently unfeasible, and further analysis would be required to confirm genetic mixing of distant sub-populations.

  11. Consistency of canonical formulation of Horava gravity

    International Nuclear Information System (INIS)

    Soo, Chopin

    2011-01-01

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  12. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  13. A consistent thermodynamic database for cement minerals

    International Nuclear Information System (INIS)

    Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.

    2010-01-01

    work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - finally, the Log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested namely by drawing activity diagrams, allowing phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H are no longer stable and are replaced by zeolite, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as silica concentration depends essentially on pH. Rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field towards the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, namely with respect to a hydro-garnet/grossular solid solution. With respect to other databases, this work was made in consistency with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context

  14. Non linear self consistency of microtearing modes

    International Nuclear Information System (INIS)

    Garbet, X.; Mourgues, F.; Samain, A.

    1987-01-01

    The self-consistency of a microtearing turbulence is studied in non-linear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via the Ampère law results from the combined action of the radial electric field, in the frame where the island chains are static, and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th p_i^2, where p_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible

  15. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states from 11 different initial conditions, which may largely be attributed to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors capable of rational, independent thinking, are two necessary factors for a short evacuation time.
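
    The convergence to consistent states from different initial conditions can be illustrated, in a much simpler setting than the CA model, by replicator dynamics for the snowdrift game in a well-mixed population: different initial cooperator ratios reach the same mixed equilibrium. The payoff values and integration parameters below are illustrative assumptions, not the paper's model.

```python
def snowdrift_replicator(x0, b=1.0, c=0.6, dt=0.01, steps=20000):
    """Euler-integrated replicator dynamics for the snowdrift game.

    Payoffs: C/C -> b - c/2, C/D -> b - c, D/C -> b, D/D -> 0,
    with benefit b > cost c > 0.
    """
    x = x0  # fraction of cooperators
    for _ in range(steps):
        fc = x * (b - c / 2) + (1 - x) * (b - c)   # mean cooperator payoff
        fd = x * b                                  # mean defector payoff
        x += dt * x * (1 - x) * (fc - fd)
    return x

# Very different initial cooperator ratios evolve to the same stationary state
x_low = snowdrift_replicator(0.1)
x_high = snowdrift_replicator(0.9)
x_star = 1 - 0.6 / (2 * 1.0 - 0.6)   # analytic mixed equilibrium, 4/7
```

    In the snowdrift game the interior equilibrium is stable, so any interior initial condition converges to it; the CA simulations in the paper exhibit an analogous insensitivity to initial conditions.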

  16. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
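
    The thermodynamic constraints mentioned here include, for any closed cycle of reversible reactions, the Wegscheider condition: the product of forward rate constants around the cycle must equal the product of reverse ones. The sketch below checks and enforces that condition for a toy three-reaction cycle; it is a minimal illustration of one constraint, not the TCMC optimization itself, and all names and rate values are assumptions.

```python
import math

def cycle_inconsistency(kf, kr):
    """Log-scale violation of the Wegscheider cycle condition.

    For a closed cycle of reversible reactions, thermodynamics requires
    prod(kf) == prod(kr); the return value is 0 for a consistent cycle.
    """
    return math.log(math.prod(kf) / math.prod(kr))

def repair_last_rate(kf, kr):
    """Rescale the last reverse rate so the cycle condition holds exactly."""
    kr = list(kr)  # leave the input untouched
    kr[-1] *= math.exp(cycle_inconsistency(kf, kr))
    return kr

# A <-> B <-> C <-> A cycle with thermodynamically infeasible fitted rates
kf = [2.0, 5.0, 1.0]
kr = [1.0, 2.0, 3.0]
kr_fixed = repair_last_rate(kf, kr)
```

    Rescaling a single rate constant is the crudest possible repair; a calibration method like TCMC instead adjusts all parameters jointly, subject to such constraints, to stay close to the experimental fit.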

  17. Merging By Decentralized Eventual Consistency Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed-Nacer Mehdi

    2015-12-01

    Full Text Available Merging is an essential operation for version control systems. When each member of a collaborative development works on an individual copy of the project, software merging allows modifications made concurrently to be reconciled, as well as managing software change through branching. The collaborative system is in charge of proposing a merge result that includes the users' modifications. The users then have to check and adapt this result. The adaptation should be as effortless as possible; otherwise, the users may get frustrated and quit the collaboration. This paper aims to reduce conflicts during collaboration and improve productivity. It has three objectives: study the users' behavior during collaboration, evaluate the quality of textual merge results produced by specific algorithms, and propose a solution to improve the result quality produced by the default merge tool of distributed version control systems. Through a study of eight open-source repositories totaling more than 3 million lines of code, we observe the behavior of concurrent modifications during the merge procedure. We identified when the existing merge techniques under-perform, and we propose solutions to improve the quality of the merge. We finally compare with the traditional merge tool through a large corpus of collaborative editing.

  18. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Full Text Available Computer services are normally assumed to work well all the time. This usually happens for crucial services like bank electronic services, but not necessarily for others, for which there is no commercial interest in their operation. In this work we examined the operation and the errors of information services and tried to find clues that will help predict the consistency of their behavior and the quality of the harvesting, which is made harder by transient conditions, the many services involved and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many others occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others fail always or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes, to study their behavior in more detail.

  19. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  20. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Full Text Available Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  1. Toward a consistent model for glass dissolution

    International Nuclear Information System (INIS)

    Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.

    1994-01-01

    Understanding the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least 5 countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs

  2. Cobalt release from inexpensive jewellery

    DEFF Research Database (Denmark)

    Thyssen, Jacob Pontoppidan; Jellesen, Morten Stendahl; Menné, Torkil

    2010-01-01

    Objectives: The aim was to study 354 consumer items using the cobalt spot test. Cobalt release was assessed to obtain a risk estimate of cobalt allergy and dermatitis in consumers who would wear the jewellery. Methods: The cobalt spot test was used to assess cobalt release from all items. Conclusions: This study showed that only a minority of inexpensive jewellery purchased in Denmark released cobalt when analysed with the cobalt spot test. As fashion trends fluctuate and we found cobalt release from dark-appearing jewellery, cobalt release from consumer items should be monitored in the future.

  3. Self-consistent approach for neutral community models with speciation

    Science.gov (United States)

    Haegeman, Bart; Etienne, Rampal S.

    2010-03-01

    Hubbell’s neutral model provides a rich theoretical framework to study ecological communities. By incorporating both ecological and evolutionary time scales, it allows us to investigate how communities are shaped by speciation processes. The speciation model in the basic neutral model is particularly simple, describing speciation as a point-mutation event in a birth of a single individual. The stationary species abundance distribution of the basic model, which can be solved exactly, fits empirical data of distributions of species’ abundances surprisingly well. More realistic speciation models have been proposed such as the random-fission model in which new species appear by splitting up existing species. However, no analytical solution is available for these models, impeding quantitative comparison with data. Here, we present a self-consistent approximation method for neutral community models with various speciation modes, including random fission. We derive explicit formulas for the stationary species abundance distribution, which agree very well with simulations. We expect that our approximation method will be useful to study other speciation processes in neutral community models as well.

  4. View from Europe: stability, consistency or pragmatism

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1988-01-01

    The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion

  5. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics from which the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists; 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients' referrals were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. In the relation between referral diagnosis and electrophysiological diagnosis across the clinics from which the requests were made, there was no statistically significant difference (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG
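
    The consistency percentages reported in the abstract follow directly from its counts, as a quick arithmetic check shows:

```python
# Outcome counts from the abstract: referrals related to the referring
# diagnosis, normal ENMG results, and diagnoses differing from the referral.
counts = {"related": 513, "normal": 397, "different": 47}

total = sum(counts.values())                                  # 957 patients
rates = {k: round(100 * v / total, 1) for k, v in counts.items()}
```

    Rounding each share to one decimal reproduces the 53.6%, 41.5% and 4.9% figures quoted in the record.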

  6. Self-consistent meson mass spectrum

    International Nuclear Information System (INIS)

    Balazs, L.A.P.

    1982-01-01

    A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula α(t) = max(S_1+S_2, S_3+S_4) - 1/2 + 2α̂'[s_a + (1/2)(t - Σ m_i^2)] for any given hadronic process 1+2→3+4, where S_i and m_i are the spins and masses of i = 1,2,3,4, and √s_a is the effective mass of the lowest nonvanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector qq̄ states with q = u,d,s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u,d,c and q = u,d,b. Our only arbitrary parameters are m_ρ, m_K*, m_ψ, and m_Υ, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small m_π^2/m_ρ^2 ratio arises quite naturally in the present scheme

  7. Speed Consistency in the Smart Tachograph.

    Science.gov (United States)

    Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco

    2018-05-16

    In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging possible misconduct that entails fatigue and can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which can potentially be manipulated to avoid the monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles in light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, to latencies between GNSS and odometry data, and to simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way, which makes frauds more difficult to implement than in the current version of the DT.
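
    The kind of consistency check described above can be illustrated with a small sketch. This is a hypothetical, gap-tolerant comparison of the two speed series; the tolerance and sample thresholds are invented for illustration and are not the values prescribed by the ST regulation.

    ```python
    # Hypothetical sketch of a gap-tolerant GNSS-vs-odometry speed comparison.
    # The tolerance and sample thresholds below are invented for illustration;
    # they are not the values prescribed by the ST regulation.

    def speed_conflict(gnss_kmh, odo_kmh, tol_kmh=10.0, min_samples=5):
        """Flag a motion conflict if the two speed series disagree beyond tolerance."""
        # Skip samples where either source has a gap (None), so short data
        # gaps do not by themselves trigger a conflict.
        diffs = sorted(abs(g - o) for g, o in zip(gnss_kmh, odo_kmh)
                       if g is not None and o is not None)
        if len(diffs) < min_samples:          # too little overlapping data to judge
            return False
        median = diffs[len(diffs) // 2]       # median is robust to isolated outliers
        return median > tol_kmh

    gnss = [80.0, 82.0, None, 79.0, 81.0, 80.5, 80.0]   # km/h, with one data gap
    odo_ok = [81.0, 81.5, 80.0, 78.5, 80.0, 81.0, 79.5]
    odo_scaled = [v * 0.5 for v in odo_ok]              # crude odometer manipulation
    print(speed_conflict(gnss, odo_ok), speed_conflict(gnss, odo_scaled))
    # prints: False True
    ```

    A median-based comparison of this kind is insensitive to isolated latency glitches but immediately exposed to coherent scaling of one source only, matching the finding that an attacker must now falsify both sensors consistently.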

  8. Determination of accident related release data

    International Nuclear Information System (INIS)

    Koch, W.; Nolte, O.; Lange, F.; Martens, R.

    2004-01-01

    For accident safety analyses, for the assessment of potential radiological consequences, for the review of current requirements of the Transport Regulations and for their possible further development, as well as for the demonstration that radioactive materials such as LDM candidate materials fulfil the regulatory requirements, reliable release data following mechanical impact are required. This is definitely one of the demanding issues in the field of transport safety of radioactive materials. In this context, special attention has to be paid to radioactive wastes immobilised in brittle materials, e.g. cement/concrete, glass, ceramics, or other brittle materials such as fresh and spent fuel. In this presentation we report on a long-term experimental program aimed at improving the general physical understanding of the release process as well as the quantity and the quality of release data. By combining laboratory experiments using small-scale test specimens with a few key scaling experiments on large-scale test objects, significant progress was achieved in meeting this objective. The laboratory equipment enables the in-situ determination of the amount and aerodynamic size distribution of the airborne particles generated upon impact of the test specimen on a hard target. Impact energies cover the range experienced in transport accidents, including aircraft accidents. The well-defined experimental boundary conditions and the good reproducibility of the experimental procedure allowed for systematic studies to exactly measure the amount and aerodynamic size distribution of the airborne release and to quantify its dependence on relevant parameters such as energy input, material properties, and specimen geometry. The experimental program was performed within the scope of various national and international (e.g. EU-funded) projects.
The small scale experiments with brittle materials revealed a pronounced universality of the airborne release in view of the material properties and

  9. Gamma-amino butyric acid (GABA) release in the ciliated protozoon Paramecium occurs by neuronal-like exocytosis.

    Science.gov (United States)

    Ramoino, P; Milanese, M; Candiani, S; Diaspro, A; Fato, M; Usai, C; Bonanno, G

    2010-04-01

    Paramecium primaurelia expresses a significant amount of gamma-amino butyric acid (GABA). Paramecia possess both glutamate decarboxylase (GAD)-like and vesicular GABA transporter (vGAT)-like proteins, indicating the ability to synthesize GABA from glutamate and to transport GABA into vesicles. Using antibodies raised against mammalian GAD and vGAT, bands with apparent molecular weights of about 67 kDa and 57 kDa were detected. The presence of these bands indicated a similarity between the proteins in Paramecium and in mammals. VAMP, syntaxin and SNAP, putative proteins of the release machinery that form the so-called SNARE complex, are present in Paramecium. Most VAMP, syntaxin and SNAP fluorescence is localized in spots that vary in size and density and are primarily distributed near the plasma membrane. Antibodies raised against mammalian VAMP-3, syntaxin-1 or SNAP-25 revealed immunoblot bands having molecular weights consistent with those observed in mammals. Moreover, P. primaurelia spontaneously releases GABA into the environment, and this neurotransmitter release significantly increases after membrane depolarization. The depolarization-induced GABA release was strongly reduced not only in the absence of extracellular Ca(2+) but also by pre-incubation with bafilomycin A1 or with botulinum toxin C1 serotype. It can be concluded that GABA occurs in Paramecium, where it is probably stored in vesicles capable of fusion with the cell membrane; accordingly, GABA can be released from Paramecium by stimulus-induced, neuronal-like exocytotic mechanisms.

  10. 5-Fluorouracil Encapsulated Chitosan Nanoparticles for pH-Stimulated Drug Delivery: Evaluation of Controlled Release Kinetics

    Directory of Open Access Journals (Sweden)

    R. Seda Tığlı Aydın

    2012-01-01

    Full Text Available Nanoparticles consisting of human therapeutic drugs are suggested as a promising strategy for targeted and localized drug delivery to tumor cells. In this study, 5-fluorouracil (5-FU) encapsulated chitosan nanoparticles were prepared in order to investigate the potential for localized drug delivery in the tumor environment due to the pH sensitivity of chitosan nanoparticles. Optimization of chitosan and 5-FU encapsulated nanoparticle production yielded particle size diameters of 148.8±1.1 nm and 243.1±17.9 nm with narrow size distributions, which are confirmed by scanning electron microscope (SEM) images. The challenge was to investigate drug delivery of 5-FU encapsulated chitosan nanoparticles under varied pH conditions. To achieve this objective, the pH sensitivity of the prepared chitosan nanoparticles was evaluated, and results showed a significant swelling response at pH 5, with a particle diameter of ∼450 nm. In vitro release studies indicated a controlled and sustained release of 5-FU from chitosan nanoparticles, with release amounts of 29.1–60.8% in the varied pH environments after 408 h of incubation. pH sensitivity is confirmed by mathematical modeling of release kinetics, since the chitosan nanoparticles showed stimuli-induced release. Results suggested that 5-FU encapsulated chitosan nanoparticles can be launched as pH-responsive smart drug delivery agents for possible applications in cancer treatment.
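
    Mathematical modeling of release kinetics of the kind mentioned above is commonly done with the Korsmeyer-Peppas power law. The sketch below fits M_t/M∞ = k·t^n by log-log least squares; the (time, fraction-released) data points are invented for illustration and are not the paper's measurements.

    ```python
    import math

    # Korsmeyer-Peppas power-law fit, M_t / M_inf = k * t**n, by log-log
    # least squares. The data points below are invented for illustration;
    # the paper's actual measurements differ.
    t = [4, 8, 24, 48, 120, 240, 408]                   # hours
    frac = [0.05, 0.08, 0.14, 0.20, 0.32, 0.45, 0.58]   # cumulative fraction released

    # Linearize: log(frac) = log(k) + n*log(t), then ordinary least squares.
    x = [math.log(ti) for ti in t]
    y = [math.log(fi) for fi in frac]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    n = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))             # release exponent
    k = math.exp(my - n * mx)                           # kinetic constant
    print(f"n = {n:.2f}, k = {k:.3f}")
    ```

    For spherical carriers, n ≈ 0.43 indicates Fickian diffusion and larger exponents indicate anomalous, swelling-controlled transport, which is how such fits support a stimuli-responsive interpretation of the release data.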

  11. The study of consistent properties of gelatinous shampoo with minoxidil

    Directory of Open Access Journals (Sweden)

    I. V. Gnitko

    2016-04-01

    Full Text Available The aim of the work is the study of the consistent properties of a gelatinous shampoo with minoxidil 1% for the complex therapy and prevention of alopecia. This shampoo with minoxidil was selected on the basis of complex physical-chemical, biopharmaceutical and microbiological investigations. Methods and results. It has been established that the consistent properties of the gelatinous minoxidil 1% shampoo and its «mechanical stability» (1.70) characterize the formulation as an exceptionally thixotropic composition capable of restoration after mechanical loads. This fact also allows the stability of the consistent properties during long storage to be predicted. Conclusion. The factors of dynamic flowing for the foam detergent gel with minoxidil (Kd1 = 38.9%; Kd2 = 78.06%) quantitatively confirm a sufficient degree of distribution at the time of spreading the composition on the skin surface of the hairy part of the head or during technological operations of manufacturing. The insignificant difference in «mechanical stability» between the gelatinous minoxidil 1% shampoo and its base indicates the absence of interactions between the active substance and the base.

  12. Marginal Consistency: Upper-Bounding Partition Functions over Commutative Semirings.

    Science.gov (United States)

    Werner, Tomás

    2015-07-01

    Many inference tasks in pattern recognition and artificial intelligence lead to partition functions in which addition and multiplication are abstract binary operations forming a commutative semiring. By generalizing max-sum diffusion (one of the convergent message passing algorithms for approximate MAP inference in graphical models), we propose an iterative algorithm to upper bound such partition functions over commutative semirings. The iteration of the algorithm is remarkably simple: change any two factors of the partition function such that their product remains the same and their overlapping marginals become equal. In many commutative semirings, repeating this iteration for different pairs of factors converges to a fixed point when the overlapping marginals of every pair of factors coincide. We call this state marginal consistency. In the process, an upper bound on the partition function monotonically decreases. This abstract algorithm unifies several existing algorithms, including max-sum diffusion and basic constraint propagation (or local consistency) algorithms in constraint programming. We further construct a hierarchy of marginal consistencies of increasingly higher levels and show that any such level can be enforced by adding identity factors of higher arity (order). Finally, we discuss instances of the framework for several semirings, including the distributive lattice and the max-sum and sum-product semirings.
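
    The iteration described above can be sketched for the max-sum semiring (addition = max, multiplication = +) with two factors sharing one variable. This toy version is an illustration under those assumptions, not the paper's general algorithm.

    ```python
    import random

    # Max-sum diffusion sketch: two factors f(x, y) and g(y, z) over the
    # max-sum semiring. Each iteration shifts value between f and g so their
    # sum is unchanged while their max-marginals over the shared y equalize.
    random.seed(0)
    D = range(3)
    f = {(x, y): random.uniform(-1, 1) for x in D for y in D}
    g = {(y, z): random.uniform(-1, 1) for y in D for z in D}

    def bound():
        # Factor-wise maxima always upper-bound the true maximum of f + g.
        return max(f.values()) + max(g.values())

    exact = max(f[x, y] + g[y, z] for x in D for y in D for z in D)

    for _ in range(10):                          # repeat the diffusion iteration
        for y in D:
            mf = max(f[x, y] for x in D)         # max-marginal of f at y
            mg = max(g[y, z] for z in D)         # max-marginal of g at y
            shift = (mf - mg) / 2.0
            for x in D:                          # move half the gap from f to g:
                f[x, y] -= shift                 # f + g is unchanged on every (x, y, z)
            for z in D:                          # while the max-marginals at y equalize
                g[y, z] += shift

    # With only two factors the bound becomes tight at marginal consistency
    # (special to trees; in general one only obtains an upper bound).
    print(round(bound(), 6), round(exact, 6))
    ```

    The two-factor case is a tree, so the bound meets the true maximum at the fixed point; on cyclic factor graphs the monotonically decreasing bound generally stays strictly above it.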

  13. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling , or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  14. Impact of Industrial Releases on Inshas Area

    International Nuclear Information System (INIS)

    El-Messiry, A.M.; Aly, M.M.

    1999-01-01

    Two Egyptian research reactors are located within the nuclear research center at Inshas, 30 km north-east of Cairo. The area is crowded with different industrial plants, whose releases have hazardous and economic effects on the research center workers and the surrounding inhabitants. In the present work we study these effects. Region-specific meteorological data covering the whole year, including a wind rose characterization, are considered. The study covers both normal operating conditions and accidental releases. The results show that there is considerable risk due to normal releases in some areas downwind of the major release points, and high risk in areas subjected to major exposure. Regional maps of emission distribution, economic damage and pollutant concentration are obtained. The study helps to identify solutions to problems of atmospheric protection. It can be used as decision support for environmental, economic, and innovation planning at the national level, taking into consideration the national pollution standards and the variety of existing emission sources.

  15. Nuclear refugees after large radioactive releases

    International Nuclear Information System (INIS)

    Pascucci-Cahen, Ludivine; Groell, Jérôme

    2016-01-01

    However improbable, large radioactive releases from a nuclear power plant would entail major consequences for the surrounding population. In Fukushima, 80,000 people had to evacuate the most contaminated areas around the NPP for a prolonged period of time. These people have been called “nuclear refugees”. The paper first argues that the number of nuclear refugees is a better measure of the severity of radiological consequences than the number of fatalities, although the latter is widely used to assess other catastrophic events such as earthquakes or tsunami. It is a valuable partial indicator in the context of comprehensive studies of overall consequences. Section 2 makes a clear distinction between long-term relocation and emergency evacuation and proposes a method to estimate the number of refugees. Section 3 examines the distribution of nuclear refugees with respect to weather and release site. The distribution is asymmetric and fat-tailed: unfavorable weather can lead to the contamination of large areas of land; large cities have in turn a higher probability of being contaminated. - Highlights: • Number of refugees is a good indicator of the severity of radiological consequences. • It is a better measure of the long-term consequences than the number of fatalities. • A representative meteorological sample should be sufficiently large. • The number of refugees highly depends on the release site in a country like France.

  16. THE ACCUMULATION AND RELEASE OF ARSENIC FROM DISTRIBUTION SYSTEM SOLIDS

    Science.gov (United States)

    The recently promulgated Arsenic Rule will require that many new drinking water systems treat their water to remove arsenic. Iron based treatment technologies including iron removal and iron coagulation are effective at reducing arsenic in water because iron surfaces have a stron...

  17. THE ACCUMULATION AND RELEASE OF CONTAMINANTS FROM DISTRIBUTION SYSTEM SOLIDS

    Science.gov (United States)

    The recently promulgated Arsenic Rule will require that many new drinking water systems treat their water to remove arsenic. Iron based treatment technologies including iron removal and iron coagulation are effective at reducing arsenic in water because iron surfaces have a stron...

  18. Release of CFC-11 from disposal of polyurethane foam waste

    DEFF Research Database (Denmark)

    Kjeldsen, Peter; Jensen, M.H.

    2001-01-01

    The halocarbon CFC-11 has extensively been used as a blowing agent for polyurethane (PUR) insulation foams in home appliances and for residential and industrial construction. Release of CFCs is an important factor in the depletion of the ozone layer. For CFC-11 the future atmospheric concentrations...... will mainly depend on the continued release from PUR foams. Little is known about rates and time frames of the CFC release from foams especially after treatment and disposal of foam containing waste products. The CFC release is mainly controlled by slow diffusion out through the PUR. From the literature...... and by reevaluation of an old reported experiment, diffusion coefficients in the range of 0.05-1.7.10(-14) m(2) s(-1) were found reflecting differences in foam properties and experimental designs. Laboratory experiments studying the distribution of CFC in the foam and the short-term releases after shredding showed...

  19. Release rates of soluble species at Yucca Mountain

    International Nuclear Information System (INIS)

    Lee, W.W.-L.; Pigford, T.H.

    1989-02-01

    Experimental leaching of spent fuel shows that some fission product species are preferentially released upon contact with water. We analyze the conservative case of bare spent fuel in contact with saturated tuff using diffusional mass transfer analysis. For the parameter values used, the USNRC release rate limit is not exceeded, except for 99Tc. The presence of a container and the distribution of water contact over time will assist in meeting this criterion. 6 figs., 2 tabs

  20. Future emissions pathways consistent with limiting warming to 1.5°C

    Science.gov (United States)

    Millar, R.; Fuglestvedt, J. S.; Grubb, M.; Rogelj, J.; Skeie, R. B.; Friedlingstein, P.; Forster, P.; Frame, D. J.; Pierrehumbert, R.; Allen, M. R.

    2016-12-01

    The stated aim of the 2015 UNFCCC Paris Agreement is 'holding the increase in global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit temperature increases to 1.5°C'. We show that emissions reductions proportional to those achieved in an ambitious mitigation scenario, RCP2.6, but beginning in 2017, give a median estimated peak warming of 1.5°C, with a likely (66% probability) range of uncertainty of 1.2-2.0°C. Such a scenario would be approximately consistent with the most ambitious interpretation of the 2030 emissions pledges, but requires reduction rates exceeding 0.3GtC/yr/yr after 2030. A steady reduction at less than half this rate would achieve the same temperature outcome if initiated in 2020. Limiting total CO2 emissions after 2015 to 200GtC would limit future warming to likely less than 0.6°C above present, consistent with 1.5°C above pre-industrial, based on the distribution of responses of CMIP5 Earth System models, but the CMIP5 simulations do not correspond to scenarios that aim to limit warming to such low levels. If future CO2 emissions are successfully adapted to the emerging climate response so as to limit warming in 2100 to 0.6°C above present, and non-CO2 emissions follow the ambitious RCP2.6 scenario, then we estimate that resulting CO2 emissions will unlikely be restricted to less than 250GtC given current uncertainties in climate system response, although still-poorly-modelled carbon cycle feedbacks, such as release from permafrost, may encroach on this budget. Even under a perfectly successful adaptive mitigation regime, emissions consistent with limiting warming to 0.6°C above present are unlikely to be greater than 500GtC. These estimates suggest the 1.5°C goal may not yet be geophysically insurmountable but will nevertheless require, at minimum, the full implementation of the most ambitious interpretation of the Paris pledges followed by accelerated and more fundamental changes in our
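
    As a rough reader's check on the quoted figures (not a calculation from the abstract itself), a linear ramp-down from an assumed current rate of about 10 GtC/yr at the stated 0.3 GtC/yr/yr reduction rate yields a cumulative total under the 200 GtC budget:

    ```python
    # Back-of-envelope cumulative emissions for a linear ramp-down. The starting
    # rate of 10 GtC/yr is an assumed, illustrative figure (not from the abstract);
    # the 0.3 GtC/yr/yr reduction rate is the one quoted above.
    E0 = 10.0   # GtC/yr, assumed current net CO2 emission rate
    r = 0.3     # GtC/yr per year, linear reduction rate

    years_to_zero = E0 / r                  # emissions reach zero after ~33 yr
    cumulative = 0.5 * E0 * years_to_zero   # triangle area under the ramp, E0**2/(2*r)
    print(f"zero after {years_to_zero:.1f} yr, cumulative {cumulative:.0f} GtC")
    # prints: zero after 33.3 yr, cumulative 167 GtC
    ```

    The total scales as E0²/(2r), so it is quite sensitive to the assumed starting rate; the point of the sketch is only that the quoted reduction rate and budget are of mutually consistent magnitude.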

  1. Wave-induced release of methane : littoral zones as a source of methane in lakes

    OpenAIRE

    Hofmann, Hilmar; Federwisch, Luisa; Peeters, Frank

    2010-01-01

    This study investigates the role of surface waves and the associated disturbance of littoral sediments for the release and later distribution of dissolved methane in lakes. Surface wave field, wave-induced currents, acoustic backscatter strength, and the concentration and distribution of dissolved methane were measured simultaneously in Lake Constance, Germany. The data indicate that surface waves enhance the release of dissolved methane in the shallow littoral zone via burst-like releases of...

  2. A fission gas release model

    Energy Technology Data Exchange (ETDEWEB)

    Denis, A; Piotrkowski, R [Argentine Atomic Energy Commission, Buenos Aires (Argentina)

    1997-08-01

    The hypotheses contained in the model developed in this work are as follows. The UO2 is considered as a collection of spherical grains. Nuclear reactions produce fission gases, mainly Xe and Kr, within the grains. Due to the very low solubility of these gases in UO2, intragranular bubbles are formed, a few nanometers in size. The bubbles are assumed to be immobile and to act as traps which capture gas atoms. Free atoms diffuse towards the grain boundaries, where they give rise to intergranular, lenticular bubbles, of the order of microns. The gas atoms in bubbles, either inter- or intragranular, can re-enter the matrix through the mechanism of resolution induced by fission fragment impact. The amount of gas stored in intergranular bubbles grows up to a saturation value. Once saturation is reached, intergranular bubbles interconnect and the gas in excess is released through different channels to the external surface of the fuel. The resolution of intergranular bubbles particularly affects the region of the grain adjacent to the grain boundary. During grain growth, the grain boundary traps the gas atoms, either free or in intragranular bubbles, contained in the swept volume. The grain boundary is considered as a perfect sink, i.e. the gas concentration is zero at that surface of the grain. Due to the spherical symmetry of the problem, the concentration gradient is null at the centre of the grain. The diffusion equation was solved using the implicit finite difference method. The initial solution was obtained analytically by the Laplace transform. The calculations were performed at different constant temperatures and were compared with experimental results.
They show the asymptotic growth of the grain radius as a function of burnup, the gas distribution within the grain at every instant, the growth of the gas content at the grain boundary up to the saturation value and the fraction of gas released by the fuel element referred to the total gas generated
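
    The intragranular part of such a model reduces, in its simplest form, to diffusion in a sphere with a perfect-sink boundary. The sketch below uses an explicit finite-difference scheme (the paper uses an implicit one) with illustrative parameter values, and compares the computed fractional release with the classical Booth early-time approximation:

    ```python
    import math

    # Simplified diffusion-to-grain-boundary sketch: spherical grain of radius a,
    # c = 0 at r = a (perfect sink), symmetry at r = 0. Parameter values are
    # illustrative orders of magnitude, not the paper's fitted constants.
    D, a = 1e-18, 5e-6        # diffusivity (m^2/s) and grain radius (m)
    N = 50                    # radial grid intervals
    dr = a / N
    dt = 0.4 * dr**2 / D      # within the explicit stability limit dt <= dr^2/(2D)

    # Substitution u = r*c turns the spherical equation into du/dt = D d2u/dr2.
    u = [i * dr for i in range(N + 1)]   # uniform initial concentration c = 1, so u = r
    u[N] = 0.0                           # perfect sink at the grain boundary

    t = 0.0
    for _ in range(200):
        new = u[:]
        for i in range(1, N):
            new[i] = u[i] + D * dt / dr**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
        new[0], new[N] = 0.0, 0.0        # u = r*c vanishes at the centre and the sink
        u = new
        t += dt

    # Remaining inventory: integral of c r^2 dr = integral of u r dr (trapezoid rule).
    inventory = sum(0.5 * (u[i] * i + u[i + 1] * (i + 1)) * dr**2 for i in range(N))
    frac = 1.0 - inventory / (a**3 / 3.0)                               # fractional release
    booth = 6 * math.sqrt(D * t / (math.pi * a**2)) - 3 * D * t / a**2  # Booth approximation
    print(f"fractional release: numerical {frac:.3f}, Booth {booth:.3f}")
    ```

    The full model adds trapping in intragranular bubbles, resolution and grain growth on top of this diffusion core, which is why a numerical rather than closed-form treatment is needed.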

  3. Released radioactivity reducing facility

    International Nuclear Information System (INIS)

    Tanaka, Takeaki.

    1992-01-01

    Upon occurrence of a reactor accident, the penetration portions of the reactor container, the main leakage paths from the container, are surrounded by a plurality of gas-tight chambers, the outside of which is surrounded by highly gas-tight buildings. Branched pipelines of an emergency gas processing system are introduced into each of the gas-tight chambers, and they are joined and placed in communication with an emergency gas processing device. With such a constitution, radioactive materials are prevented from leaking directly from the buildings. Further, pipeline openings of the emergency gas processing facility are disposed in the plurality of highly gas-tight penetration chambers. If radioactive materials leak from the reactor and elevate the pressure in the penetration chambers, the radioactive materials are introduced to a filter device in the emergency gas processing facility by way of the branched pipelines, filtered, and then released to the atmosphere. Accordingly, the reliability and safety of the system can be improved. (T.M.)

  4. Containment and release management

    International Nuclear Information System (INIS)

    Lehner, J.R.; Pratt, W.T.

    1988-01-01

    Reducing the risk from potentially severe accidents by appropriate accident management strategies is receiving increased attention from the international reactor safety community. Considerable uncertainty still surrounds some of the physical phenomena likely to occur during a severe accident. The USNRC, in developing its research plan for accident management, wants to ensure that both the developers and implementers of accident management strategies are aware of the uncertainty associated with the plant operators' ability to correctly diagnose an accident, as well as the uncertainties associated with various preventive and mitigative strategies. The use of a particular accident management strategy can have both positive and negative effects on the status of a plant and these effects must be carefully weighed before a particular course of action is chosen and implemented. By using examples of severe accident scenarios, initial insights are presented here regarding the indications plant operators may have to alert them to particular accident states. Insights are also offered on the various management actions operators and plant technical staff might pursue for particular accident situations and the pros and cons associated with such actions. The examples given are taken for the most part from the containment and release phase of accident management, since this is the current focus of the effort in the accident management area at Brookhaven National Laboratory. 2 refs

  5. Released radioactivity reducing device

    International Nuclear Information System (INIS)

    Miyamoto, Yumi.

    1995-01-01

    A water scrubber is disposed in a scrubber tank and a stainless steel fiber filter is disposed above the water scrubber. The upper end of the scrubber tank is connected by way of a second bent tube to a capturing vessel incorporating a moisture removing layer and an activated carbon filter. The exit of the capturing vessel is connected to a stack. Upon occurrence of an accident in a BWR-type power plant, gases containing radioactive materials released from the reactor container are discharged into the water scrubber from a first bent tube through a venturi tube nozzle, and water-soluble and aerosol-like radioactive materials are captured in the water. Aerosols and splashes of water droplets that cannot be completely captured by the water scrubber are captured by the stainless steel fiber filter. Gases passing through the scrubber tank are introduced to the capturing vessel through the second bent tube, and organic iodine is captured by the activated carbon filter. (I.N.)

  6. COMMERCIAL SNF ACCIDENT RELEASE FRACTIONS

    Energy Technology Data Exchange (ETDEWEB)

    S.O. Bader

    1999-10-18

    The purpose of this design analysis is to specify and document the total and respirable fractions for radioactive materials that are released from an accident event at the Monitored Geologic Repository (MGR) involving commercial spent nuclear fuel (CSNF) in a dry environment. The total and respirable release fractions will be used to support the preclosure licensing basis for the MGR. The total release fraction is defined as the fraction of total CSNF assembly inventory, typically expressed as an activity inventory (e.g., curies), of a given radionuclide that is released to the environment from a waste form. The radionuclides are released from the inside of breached fuel rods (or pins) and from the detachment of radioactive material (crud) from the outside surfaces of fuel rods and other components of fuel assemblies. The total release fraction accounts for several mechanisms that tend to retain, retard, or diminish the amount of radionuclides that are available for transport to dose receptors or otherwise can be shown to reduce exposure of receptors to radiological releases. The total release fraction includes a fraction of airborne material that is respirable and could result in inhalation doses. This subset of the total release fraction is referred to as the respirable release fraction. Potential accidents may involve waste forms that are characterized as either bare (unconfined) fuel assemblies or confined fuel assemblies. The confined CSNF assemblies at the MGR are contained in shipping casks, canisters, or disposal containers (waste packages). In contrast to the bare fuel assemblies, the container that confines the fuel assemblies has the potential of providing an additional barrier for diminishing the total release fraction should the fuel rod cladding breach during an accident. However, this analysis will not take credit for this additional barrier and will establish only the total release fractions for bare unconfined CSNF assemblies, which may however be

  7. COMMERCIAL SNF ACCIDENT RELEASE FRACTIONS

    International Nuclear Information System (INIS)

    S.O. Bader

    1999-01-01

    The purpose of this design analysis is to specify and document the total and respirable fractions for radioactive materials that are released from an accident event at the Monitored Geologic Repository (MGR) involving commercial spent nuclear fuel (CSNF) in a dry environment. The total and respirable release fractions will be used to support the preclosure licensing basis for the MGR. The total release fraction is defined as the fraction of total CSNF assembly inventory, typically expressed as an activity inventory (e.g., curies), of a given radionuclide that is released to the environment from a waste form. The radionuclides are released from the inside of breached fuel rods (or pins) and from the detachment of radioactive material (crud) from the outside surfaces of fuel rods and other components of fuel assemblies. The total release fraction accounts for several mechanisms that tend to retain, retard, or diminish the amount of radionuclides that are available for transport to dose receptors or otherwise can be shown to reduce exposure of receptors to radiological releases. The total release fraction includes a fraction of airborne material that is respirable and could result in inhalation doses. This subset of the total release fraction is referred to as the respirable release fraction. Potential accidents may involve waste forms that are characterized as either bare (unconfined) fuel assemblies or confined fuel assemblies. The confined CSNF assemblies at the MGR are contained in shipping casks, canisters, or disposal containers (waste packages). In contrast to the bare fuel assemblies, the container that confines the fuel assemblies has the potential of providing an additional barrier for diminishing the total release fraction should the fuel rod cladding breach during an accident. However, this analysis will not take credit for this additional barrier and will establish only the total release fractions for bare unconfined CSNF assemblies, which may however be

  8. Release of fission and activation products during LWR core meltdown

    International Nuclear Information System (INIS)

    Albrecht, H.; Matschoss, V.; Wild, H.

    1978-01-01

    Experiments are described by which activity release fractions and aerosol characteristics were investigated for various core melting conditions. Samples of corium and fissium were heated by induction to temperatures of 2800 °C under air, argon and steam. Release values are presented for Cr, Mn, Fe, Co, Se, Zr, Mo, Cd, Sn, Sb, Te, I, Cs and U. The deposition behaviour of the released products was found to depend strongly on the volatility and on the gas flow rate. Preliminary results of additional measurements indicate that the size distribution of the aerosol particles is trimodal. (author)

  9. Generation and release of radioactive gases in LLW disposal facilities

    Energy Technology Data Exchange (ETDEWEB)

    Yim, M.S. [Harvard School Public Health, Boston, MA (United States); Simonson, S.A. [Massachusetts Institute of Technology, Cambridge, MA (United States)

    1995-02-01

    The atmospheric release of radioactive gases from a generic engineered LLW disposal facility and its radiological impacts were examined. To quantify the generation of radioactive gases, detailed characterization of source inventory for carbon-14, tritium, iodine-129, krypton-85, and radon-222, was performed in terms of their activity concentrations; their distribution within different waste classes, waste forms and containers; and their subsequent availability for release in volatile or gaseous form. The generation of gases was investigated for the processes of microbial activity, radiolysis, and corrosion of waste containers and metallic components in wastes. The release of radionuclides within these gases to the atmosphere was analyzed under the influence of atmospheric pressure changes.

  10. PAVAN, Atmospheric Dispersion of Radioactive Releases from Nuclear Power Plants

    International Nuclear Information System (INIS)

    2001-01-01

    1 - Description of program or function: PAVAN estimates downwind ground-level air concentrations for potential accidental releases of radioactive material from nuclear facilities. Options can account for variation in the location of release points, additional plume dispersion due to building wakes, plume meander under low wind speed conditions, and adjustments to consider non-straight trajectories. It computes an effective plume height from the physical release height, which can be reduced by terrain features supplied as input. 2 - Method of solution: Using joint frequency distributions of wind direction and wind speed by atmospheric stability class, the program provides relative air concentration (X/Q) values as functions of direction for various time periods at the exclusion area boundary (EAB) and the outer boundary of the low population zone (LPZ). Calculations of X/Q values can be made for assumed ground-level releases or for elevated releases from free-standing stacks. The X/Q calculations are based on the theory that material released to the atmosphere will be normally distributed (Gaussian) about the plume centerline. A straight-line trajectory is assumed between the point of release and all distances for which X/Q values are calculated. 3 - Restrictions on the complexity of the problem: - The code cannot handle multiple emission sources
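    At the plume centerline, the straight-line Gaussian model underlying codes of this type reduces to a simple closed form for X/Q. The sketch below is not PAVAN's implementation; it is a minimal illustration in which the wind speed and the dispersion coefficients (normally read from Pasquill-Gifford curves for the relevant stability class) are assumed inputs:

```python
import math

def chi_over_q(u, sigma_y, sigma_z, h=0.0):
    """Centerline relative air concentration X/Q (s/m^3) for a
    straight-line Gaussian plume.

    u       -- wind speed at release height (m/s)
    sigma_y -- lateral dispersion coefficient (m)
    sigma_z -- vertical dispersion coefficient (m)
    h       -- effective release height (m); 0 for a ground-level release
    """
    return math.exp(-h**2 / (2.0 * sigma_z**2)) / (math.pi * u * sigma_y * sigma_z)
```

    For a ground-level release the exponential term is 1 and X/Q falls back to 1/(pi u sigma_y sigma_z); an elevated stack release is attenuated by exp(-h^2 / 2 sigma_z^2).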

  11. Underground water stress release models

    Science.gov (United States)

    Li, Yong; Dang, Shenjun; Lü, Shaochuan

    2011-08-01

    The accumulation of tectonic stress may trigger earthquakes at certain epochs; in most cases, however, it leads to crustal deformations. The underground water level is a sensitive indicator of these crustal deformations. We incorporate information on the underground water level into the stress release model (SRM) to obtain the underground water stress release model (USRM). We apply the USRM to earthquakes that occurred in the Tangshan region. The analysis shows that the underground water stress release model outperforms both the Poisson model and the stress release model. Monte Carlo simulation shows that the seismicity simulated by the USRM is very close to the real seismicity.
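    A stress release model of the kind extended here expresses the conditional earthquake intensity as an exponential of linearly accumulating tectonic stress minus the stress released by past events. A minimal sketch under that standard form; the coefficients a, b, c and the magnitude-to-stress conversion below are illustrative assumptions, not the values fitted for Tangshan:

```python
import math

def srm_intensity(t, past_events, a=-2.0, b=0.05, c=0.001):
    """Conditional intensity lambda(t) = exp(a + b*t - c*S(t)) of a
    stress release model, where S(t) is the stress released by events
    before time t, each magnitude M contributing ~10**(0.75*M).

    past_events -- list of (time, magnitude) pairs
    """
    s = sum(10.0 ** (0.75 * m) for (ti, m) in past_events if ti < t)
    return math.exp(a + b * t - c * s)
```

    A large event sharply lowers the intensity, which then rebuilds linearly in the exponent until the next release.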

  12. Self-consistent modeling of amorphous silicon devices

    International Nuclear Information System (INIS)

    Hack, M.

    1987-01-01

    The authors developed a computer model to describe the steady-state behaviour of a range of amorphous silicon devices. It is based on the complete set of transport equations and takes into account the important role played by the continuous distribution of localized states in the mobility gap of amorphous silicon. Using one set of parameters they have been able to self-consistently simulate the current-voltage characteristics of p-i-n (or n-i-p) solar cells under illumination, the dark behaviour of field-effect transistors, p-i-n diodes and n-i-n diodes in both the ohmic and space charge limited regimes. This model also describes the steady-state photoconductivity of amorphous silicon, in particular, its dependence on temperature, doping and illumination intensity

  13. Planck 2013 results. XXXI. Consistency of the Planck data

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Arnaud, M.; Ashdown, M.

    2014-01-01

    The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved … in the HFI channels would result in shifts in the posterior distributions of parameters of less than 0.3σ except for As, the amplitude of the primordial curvature perturbations at 0.05 Mpc⁻¹, which changes by about 1σ. We extend these comparisons to include the sky maps from the complete nine-year mission …

  14. Multirobot FastSLAM Algorithm Based on Landmark Consistency Correction

    Directory of Open Access Journals (Sweden)

    Shi-Ming Chen

    2014-01-01

    Full Text Available Considering the influence of uncertain map information on the multirobot SLAM problem, a multirobot FastSLAM algorithm based on landmark consistency correction is proposed. Firstly, an electromagnetism-like mechanism is introduced into the resampling procedure of single-robot FastSLAM, where each sampling particle is treated as a charged electron and the attraction-repulsion mechanism of an electromagnetic field is used to simulate the interaction force between the particles, improving the distribution of particles. Secondly, when multiple robots observe the same landmarks, every robot is regarded as one node and a Kalman-Consensus Filter is proposed to update landmark information, which further improves the accuracy of localization and mapping. Finally, the simulation results show that the algorithm is suitable and effective.

  15. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  16. Hierarchical fault diagnosis for discrete-event systems under local consistency

    NARCIS (Netherlands)

    Su, Rong; Wonham, W.M.

    2006-01-01

    In previous work the authors proposed a distributed diagnosis approach consisting of two phases—preliminary diagnosis in each local diagnoser and inter-diagnoser communication. The objective of communication is to achieve either global or local consistency among local diagnoses, where global

  17. Flash release an alternative for releasing complex MEMS devices

    NARCIS (Netherlands)

    Deladi, S.; Krijnen, Gijsbertus J.M.; Elwenspoek, Michael Curt

    2004-01-01

    A novel time-saving and cost-effective release technique has been developed and is described. The physical nature of the process is explained in combination with experimental observations. The results of the flash release process are compared with those of freeze-drying and supercritical CO2

  18. Release Early, Release Often: Predicting Change in Versioned Knowledge Organization Systems on the Web

    OpenAIRE

    Meroño-Peñuela, Albert; Guéret, Christophe; Schlobach, Stefan

    2015-01-01

    The Semantic Web is built on top of Knowledge Organization Systems (KOS) (vocabularies, ontologies, concept schemes) that provide a structured, interoperable and distributed access to Linked Data on the Web. The maintenance of these KOS over time has produced a number of KOS version chains: subsequent unique version identifiers to unique states of a KOS. However, the release of new KOS versions pose challenges to both KOS publishers and users. For publishers, updating a KOS is a knowledge int...

  19. Efficient self-consistency for magnetic tight binding

    Science.gov (United States)

    Soin, Preetma; Horsfield, A. P.; Nguyen-Manh, D.

    2011-06-01

    Tight binding can be extended to magnetic systems by including an exchange interaction on an atomic site that favours net spin polarisation. We have used a published model, extended to include long-ranged Coulomb interactions, to study defects in iron. We have found that achieving self-consistency using conventional techniques was either unstable or very slow. By formulating the problem of achieving charge and spin self-consistency as a search for stationary points of a Harris-Foulkes functional, extended to include spin, we have derived a much more efficient scheme based on a Newton-Raphson procedure. We demonstrate the capabilities of our method by looking at vacancies and self-interstitials in iron. Self-consistency can indeed be achieved in a more efficient and stable manner, but care needs to be taken to manage this. The algorithm is implemented in the code PLATO. Program summary: Program title: PLATO. Catalogue identifier: AEFC_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 228 747. No. of bytes in distributed program, including test data, etc.: 1 880 369. Distribution format: tar.gz. Programming language: C and PERL. Computer: Apple Macintosh, PC, Unix machines. Operating system: Unix, Linux, Mac OS X, Windows XP. Has the code been vectorised or parallelised?: Yes, up to 256 processors tested. RAM: Up to 2 Gbytes per processor. Classification: 7.3. External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW. Catalogue identifier of previous version: AEFC_v1_0. Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2616. Does the new version supersede the previous version?: Yes. Nature of problem: Achieving charge and spin self-consistency in magnetic tight binding can be very
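    The core idea, replacing simple mixing with a Newton-Raphson search for the self-consistent point, can be illustrated on a scalar fixed-point problem q = f(q). This is only a toy stand-in for the charge/spin self-consistency loop; the central finite-difference derivative is an assumption made for illustration:

```python
import math

def newton_self_consistent(f, q0, tol=1e-10, max_iter=50):
    """Solve the fixed-point (self-consistency) condition q = f(q) by
    Newton-Raphson on g(q) = f(q) - q, using a central finite-difference
    estimate of g'(q); returns the self-consistent q."""
    q = q0
    for _ in range(max_iter):
        g = f(q) - q
        if abs(g) < tol:
            break
        h = 1e-6
        dg = (f(q + h) - f(q - h)) / (2.0 * h) - 1.0
        q -= g / dg
    return q

# Toy self-consistency map: plain iteration of cos converges slowly,
# while Newton reaches the fixed point in a handful of steps.
q_star = newton_self_consistent(math.cos, 1.0)
```

    The same stationarity-based reasoning, applied to a (spin-extended) Harris-Foulkes functional instead of a scalar map, is what makes the scheme described above both faster and more stable than conventional mixing.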

  20. Percutaneous carpal tunnel release compared with mini-open release using ultrasonographic guidance for both techniques.

    Science.gov (United States)

    Nakamichi, Ken-ichi; Tachibana, Shintaro; Yamamoto, Seizo; Ida, Masayoshi

    2010-03-01

    To compare the outcomes of percutaneous carpal tunnel release (PCTR) and mini-open carpal tunnel release (mini-OCTR) using ultrasonographic guidance for both techniques. We included 74 hands of 65 women with idiopathic carpal tunnel syndrome (age, 52-71 y; mean, 58 y). Thirty-five hands of 29 women had the PCTR (release with a device consisting of an angled blade, guide, and holder, along a line midway between the median nerve and ulnar artery (the safe line) under ultrasonography; incision, 4 mm), and 39 hands of 36 women had the mini-OCTR (release along the safe line, distally under direct vision (incision, 1-1.5 cm) and proximally under ultrasonography, using a device consisting of a basket punch and outer tube). Assessments at 3, 6, 13, 26, 52, and 104 weeks showed no significant differences in neurologic recovery between the groups (p > .05). The PCTR group had significantly less pain, greater grip and key-pinch strengths, and better satisfaction scores at 3 and 6 weeks (p < .05), and less scar sensitivity at 3, 6, and 13 weeks (p < .05). There were no complications. The PCTR provides the same neurologic recovery as does the mini-OCTR. The former leads to less postoperative morbidity, earlier functional return, and earlier achievement of satisfaction. Therapeutic III. Copyright 2010. Published by Elsevier Inc.

  1. Consistency check of photon beam physical data after recommissioning process

    International Nuclear Information System (INIS)

    Kadman, B; Chawapun, N; Ua-apisitwong, S; Asakit, T; Chumpu, N; Rueansri, J

    2016-01-01

    In radiotherapy, the medical linear accelerator (linac) is the key system used for radiation treatment delivery. Although recommissioning is recommended after major modification of the machine by AAPM TG-53, it might not be practical in radiotherapy centers with heavy workloads. The main purpose of this study was to compare photon beam physical data between the initial commissioning and the recommissioning of a 6 MV Elekta Precise linac. The parameters compared were the percentage depth dose (PDD) and beam profiles. The clinical commissioning test cases, following IAEA-TECDOC-1583, were planned on a REF 91230 IMRT Dose Verification Phantom using Philips' Pinnacle treatment planning system. The Delta 4PT was used for dose distribution verification with a 90% passing criterion for the gamma index (3%/3 mm). Our results revealed that the PDDs and beam profiles agreed within the tolerance limit recommended by TRS-430. Most of the point doses and dose distribution verifications passed the acceptance criteria. This study showed the consistency of photon beam physical data after the recommissioning process. The good agreement between initial commissioning and recommissioning within the tolerance limit demonstrates that the full recommissioning process might not be required. However, for complex treatment planning geometries, the initial data should be applied with great caution. (paper)

  2. Externally controlled triggered-release of drug from PLGA micro and nanoparticles.

    Directory of Open Access Journals (Sweden)

    Xin Hua

    Full Text Available Biofilm infections are extremely hard to eradicate, and controlled or externally triggered drug release may prolong drug release time. In this study, the ability to externally control drug release from micro- and nanoparticles was investigated. We prepared micro/nanoparticles containing ciprofloxacin (CIP) and magnetic nanoparticles encapsulated in poly(lactic-co-glycolic acid) (PLGA). Both micro- and nanoparticles were observed to have narrow size distributions. We investigated and compared their passive and externally triggered drug release properties based on the different encapsulation structures of the nano and micro systems. In passive release studies, CIP demonstrated a fast rate of release in the first 2 days, which then slowed to a sustained release over approximately 4 weeks. Significantly, all magnetic nanoparticle-containing systems showed the ability to have triggered drug release when exposed to an external oscillating magnetic field (OMF). An experiment where the OMF was turned on and off also confirmed the ability to control the drug release in a pulsatile manner. The magnetically triggered release resulted in a 2-fold drug release increase compared with normal passive release. To confirm drug integrity following release, the antibacterial activity of released drug was evaluated in Pseudomonas aeruginosa biofilms in vitro. CIP maintained its antimicrobial activity after encapsulation and triggered release.

  3. Externally controlled triggered-release of drug from PLGA micro and nanoparticles.

    Science.gov (United States)

    Hua, Xin; Tan, Shengnan; Bandara, H M H N; Fu, Yujie; Liu, Siguo; Smyth, Hugh D C

    2014-01-01

    Biofilm infections are extremely hard to eradicate, and controlled or externally triggered drug release may prolong drug release time. In this study, the ability to externally control drug release from micro- and nanoparticles was investigated. We prepared micro/nanoparticles containing ciprofloxacin (CIP) and magnetic nanoparticles encapsulated in poly(lactic-co-glycolic acid) (PLGA). Both micro- and nanoparticles were observed to have narrow size distributions. We investigated and compared their passive and externally triggered drug release properties based on the different encapsulation structures of the nano and micro systems. In passive release studies, CIP demonstrated a fast rate of release in the first 2 days, which then slowed to a sustained release over approximately 4 weeks. Significantly, all magnetic nanoparticle-containing systems showed the ability to have triggered drug release when exposed to an external oscillating magnetic field (OMF). An experiment where the OMF was turned on and off also confirmed the ability to control the drug release in a pulsatile manner. The magnetically triggered release resulted in a 2-fold drug release increase compared with normal passive release. To confirm drug integrity following release, the antibacterial activity of released drug was evaluated in Pseudomonas aeruginosa biofilms in vitro. CIP maintained its antimicrobial activity after encapsulation and triggered release.

  4. The self-consistent calculation of the edge states in bilayer quantum Hall bar

    International Nuclear Information System (INIS)

    Kavruk, A E; Orzturk, T; Orzturk, A; Atav, U; Yuksel, H

    2011-01-01

    In this study, we present the spatial distributions of the edge channels for each layer in bilayer quantum Hall bar geometry for a wide range of applied magnetic fields. For this purpose, we employ a self-consistent Thomas-Fermi-Poisson approach to obtain the electron density distributions and related screened potential distributions. In order to have a more realistic description of the system we solve three dimensional Poisson equation numerically in each iteration step to obtain self consistency in the Thomas-Fermi-Poisson approach instead of employing a 'frozen gate' approximation.

  5. Self-consistent modeling of electron cyclotron resonance ion sources

    International Nuclear Information System (INIS)

    Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lecot, C.

    2004-01-01

    In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to model the different parts of these sources accurately: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, in either two or three dimensions (the latter to take into account the shape of the plasma at the extraction, influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally

  6. Self-consistent modeling of electron cyclotron resonance ion sources

    Science.gov (United States)

    Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lécot, C.

    2004-05-01

    In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to model the different parts of these sources accurately: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, in either two or three dimensions (the latter to take into account the shape of the plasma at the extraction, influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.

  7. Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.

    Science.gov (United States)

    Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P

    2015-10-01

    Human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with spatially distributed elastic property. Simulation is performed on a 3D lung geometry reconstructed from four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data which collectively provide consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
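    Tikhonov regularization of the kind used to couple the CFD deformation with the DIR displacement solves a penalized least-squares problem; in its generic form one minimizes ||Ax − b||² + λ||x||². A minimal dense-matrix sketch of that generic form (the lung-specific operators and weighting of the paper are not reproduced here):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Return argmin_x ||A x - b||^2 + lam * ||x||^2 via the normal
    equations (A^T A + lam I) x = A^T b; lam > 0 trades data fidelity
    against the size (smoothness) of the solution."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

    As λ → 0 the solution approaches the ordinary least-squares fit; increasing λ shrinks the solution toward zero, which is the mechanism by which the fused displacement field is kept consistent with both inputs.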

  8. Neutronic data consistency analysis for lithium blanket and shield design

    International Nuclear Information System (INIS)

    Reupke, W.A.; Muir, D.W.

    1976-01-01

    Using a compact least-squares treatment we analyze the consistency of evaluated cross sections with calculated and measured tritium production in natural lithium (ⁿLi) and ⁷Li detectors embedded in a 14-MeV neutron driven ⁿLiD sphere. The tritium production experimental error matrix is evaluated and an initial reduced χ² of 3.0 is calculated. A perturbation calculation of the tritium production cross section sensitivities is performed with secondary neutron energy and angular distributions held constant. The cross section error matrix is evaluated from the external consistency of available cross section measurements. A statistical adjustment of the combined data yields a reduced χ² of 2.3 and represents a tenfold improvement in statistical likelihood. The improvement is achieved by a decrease in the ⁷Li(n,xt) 14-MeV group cross section from 328 mb to 284 mb and an adjustment of the ⁿLi data closer to calculated values. The uncertainty in the tritium breeding ratio in pure ⁷LiD is reduced by one-fifth
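    The consistency metric in such least-squares adjustments is the reduced χ² of the residuals against their covariance, χ² = rᵀC⁻¹r divided by the degrees of freedom. A generic sketch of that statistic (illustrative only, not the authors' compact treatment):

```python
import numpy as np

def reduced_chi2(residuals, covariance, n_fitted=0):
    """chi^2/dof with chi^2 = r^T C^{-1} r, where C is the residual
    covariance matrix; values well above 1 signal inconsistency between
    the measured data and the evaluation."""
    r = np.asarray(residuals, dtype=float)
    C = np.asarray(covariance, dtype=float)
    dof = r.size - n_fitted
    return float(r @ np.linalg.solve(C, r)) / dof
```

    A drop in this statistic after adjustment, such as the 3.0 to 2.3 reduction reported above, quantifies how much more consistent the adjusted data set is with the measurements.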

  9. Efficient reconstruction of contaminant release history

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Francis [Los Alamos National Laboratory; Anghel, Marian [Los Alamos National Laboratory; Gulbahce, Natali [NON LANL; Tartakovsky, Daniel [NON LANL

    2009-01-01

    We present a generalized hybrid Monte Carlo (GHMC) method for fast, statistically optimal reconstruction of release histories of reactive contaminants. The approach is applicable to large-scale, strongly nonlinear systems with parametric uncertainties and data corrupted by measurement errors. The use of discrete adjoint equations facilitates numerical implementation of GHMC without putting any restrictions on the degree of nonlinearity of the advection-dispersion-reaction equations that are used to describe contaminant transport in the subsurface. To demonstrate the salient features of the proposed algorithm, we identify the spatial extent of a distributed source of contamination from concentration measurements of a reactive solute.

  10. Validation of kinetic modeling of progesterone release from polymeric membranes

    Directory of Open Access Journals (Sweden)

    Analia Irma Romero

    2018-01-01

    Full Text Available Mathematical modeling of drug release systems is fundamental in their development and optimization, since it allows prediction of drug release rates and elucidation of the physical transport mechanisms involved. In this paper we validate a novel mathematical model that describes progesterone (Prg) controlled release from poly-3-hydroxybutyric acid (PHB) membranes. A statistical analysis was conducted to compare the fitting of our model with six different models, and the Akaike information criterion (AIC) was used to find the equation with the best fit. A simple relation between mass and drug release rate was found, which allows predicting the effect of Prg loads on the release behavior. Our proposed model was the one with the minimum AIC value, and therefore it was the one that statistically best fitted the experimental data obtained for all the Prg loads tested. Furthermore, the initial release rate was calculated, and from it the interface mass transfer coefficient was estimated; the equilibrium distribution constant of Prg between the PHB and the release medium was also determined. The results lead us to conclude that our proposed model best fits the experimental data and can be successfully used to describe Prg release from PHB membranes.
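    For least-squares fits of release curves, the AIC reduces to n·ln(RSS/n) + 2k, and the best-fitting model is the one that minimizes it. A sketch of that selection step (the model names and fit numbers in the example are hypothetical, not the paper's):

```python
import math

def aic_ls(rss, n_points, n_params):
    """AIC for a least-squares fit: n*ln(RSS/n) + 2k, where RSS is the
    residual sum of squares and k the number of fitted parameters."""
    return n_points * math.log(rss / n_points) + 2 * n_params

def best_model(fits):
    """fits: {name: (rss, n_points, n_params)}; return the name of the
    model with the minimum AIC."""
    return min(fits, key=lambda name: aic_ls(*fits[name]))
```

    The 2k penalty means a model with a slightly lower RSS but many extra parameters can still lose, which is why AIC rather than raw residuals is used to rank the candidate release equations.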

  11. A self-referential HOWTO on release engineering

    Energy Technology Data Exchange (ETDEWEB)

    Galassi, Mark C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-31

    Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.

  12. Investigation of tritium release and retention in lithium aluminate

    International Nuclear Information System (INIS)

    Kopasz, J.P.; Tistchenko, S.; Botter, F.

    1991-01-01

    Tritium release from lithium aluminate, although previously investigated by both in-reactor and ex-reactor experiments, remains poorly understood. Agreement between experiments is lacking, and the mechanisms responsible for tritium release from lithium aluminate are under debate. In an effort to improve our understanding of the mechanisms of tritium release from lithium ceramics, we have investigated tritium release from pure lithium aluminate and lithium aluminate doped with impurities. The results of these experiments on large-grain material indicate that after anneals at low temperature, a large fraction of the tritium present before the anneal remains in the sample. We have modeled this behavior as first-order release from three types of sites. At the lowest temperature, the release is dominated by one site, while the tritium in the other sites is retained in the solid. Adding magnesium dopant to the ceramic appears to alter the distribution of tritium between the sites: it decreases the fraction of tritium released at 777 °C, while increasing the fractions released at 538 and 950 °C. 11 refs., 8 figs., 1 tab
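    First-order release from three site types implies that the retained fraction is a sum of exponentials, R(t) = Σ fᵢ·exp(−kᵢt), with one term dominating at each temperature. A sketch of that functional form; the site fractions and rate constants below are placeholders, not the fitted values for doped aluminate:

```python
import math

def fraction_retained(t, sites):
    """Tritium fraction still in the solid at time t for first-order
    release from independent trap sites.

    sites -- list of (population fraction f_i, rate constant k_i [1/s]);
             the fractions are assumed to sum to 1
    """
    return sum(f * math.exp(-k * t) for (f, k) in sites)

# Illustrative three-site distribution: one fast site and two slow ones.
sites = [(0.5, 1e-3), (0.3, 1e-4), (0.2, 1e-5)]
```

    At short times only the fast-site term decays appreciably, reproducing the observation that low-temperature anneals leave the tritium in the slower sites behind in the sample.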

  13. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation and, in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora. PMID:28270790

  14. Nonlinear and self-consistent treatment of ECRH

    Energy Technology Data Exchange (ETDEWEB)

    Tsironis, C.; Vlahos, L.

    2005-07-01

    A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely in cases where nonlinear effects, such as particle trapping in the wave field, are dominant in the particle phase-space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)

  15. Nonlinear and self-consistent treatment of ECRH

    International Nuclear Information System (INIS)

    Tsironis, C.; Vlahos, L.

    2005-01-01

    A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely in cases where nonlinear effects, such as particle trapping in the wave field, are dominant in the particle phase-space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)

  16. Press Oil Final Release Survey

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, Jeffrey Jay [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-11

    There are forty-eight 55-gallon barrels filled with hydraulic oil that are candidates for release and recycling. This oil needs to be characterized prior to release. Principles of sampling provided in the MARSAME/MARSSIM approaches were used as guidance for sampling.

  17. Workload Control with Continuous Release

    NARCIS (Netherlands)

    Phan, B. S. Nguyen; Land, M. J.; Gaalman, G. J. C.

    2009-01-01

    Workload Control (WLC) is a production planning and control concept which is suitable for the needs of make-to-order job shops. Release decisions based on the workload norms form the core of the concept. This paper develops continuous time WLC release variants and investigates their due date

  18. Implications of the Khrgian-Mazin Distribution Function for Water Clouds and Distribution Consistencies With Aerosols and Rain

    Science.gov (United States)

    1991-12-06

    (Abstract not recoverable; the record text consists of reference-list fragments: Dokl. Akad. Nauk (Sov. Phys. Dokl.), on particle concentration of aerosols; de Saussure, H.B. (1789) Mem. Acad. Turin 4:409-424 (historic reference, atmospheric optics); de Saussure, H.B. (1789) Mem. Acad. Turin 4:425-440 (historic reference, atmospheric optics); de Saussure, H.B. (1791) Mem. de

  19. Toxic releases from power plants

    International Nuclear Information System (INIS)

    Rubin, E.S.

    1999-01-01

    Beginning in 1998, electric power plants burning coal or oil must estimate and report their annual releases of toxic chemicals listed in the Toxics Release Inventory (TRI) published by the US Environmental Protection Agency (EPA). This paper identifies the toxic chemicals of greatest significance for the electric utility sector and develops quantitative estimates of the toxic releases reportable to the TRI for a representative coal-fired power plant. Key factors affecting the magnitude and types of toxic releases for individual power plants also are discussed. A national projection suggests that the magnitude of electric utility industry releases will surpass those of the manufacturing industries which currently report to the TRI. Risk communication activities at the community level will be essential to interpret and provide context for the new TRI results.

  20. Radioecological studies of activation products released from a nuclear power plant into the marine environment

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, M.; Mattsson, S.; Holm, E.

    1984-01-01

    The Barseback nuclear power plant, located on the Oresund sound between Denmark and Sweden, consists of two boiling water reactors. The release of radionuclides, mainly activation products, is quite low during normal operation. During the summer, when annual overhaul and partial refuelling take place, the discharge is much higher. Samples of seaweeds and crustaceans collected along the coast were analyzed for radionuclides. Seaweeds (Fucus vesiculosus, F. serratus, Ascophyllum nodosum and Cladophora glomerata) and crustaceans (Idothea and Gammarus) proved to be excellent bioindicators for radioactive corrosion products released from the nuclear power plant into the marine environment. These bioindicators have been used to map the spatial and temporal distribution of the released radioactivity. The activity has been followed up to 150 km from the power plant, and the decrease in activity concentration in the bioindicators with distance can be expressed by a power function. The variation with time of activity concentration reflects the amount of activity discharged from the power plant, with good resolution in time. The bioindicators exhibit different uptake patterns of the radionuclides detected. The crustacean Idothea showed variations in the Co-60 activity concentration between winter and summer. 9 references, 12 figures, 2 tables.
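
    The power-function decrease of activity concentration with distance noted above can be recovered from bioindicator data by a linear fit in log-log space. The sketch below uses synthetic values; the coefficient and exponent are illustrative, not the Barseback measurements.

```python
import numpy as np

def fit_power_law(distance_km, activity):
    """Fit activity = a * distance**(-b) by least squares on log-transformed data."""
    logd = np.log(distance_km)
    loga = np.log(activity)
    # log(activity) = log(a) - b * log(d): slope gives -b, intercept gives log(a)
    slope, intercept = np.polyfit(logd, loga, 1)
    return np.exp(intercept), -slope

# Synthetic bioindicator series: 100 Bq/kg at 1 km, exponent 1.5 (illustrative)
d = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 150.0])
c = 100.0 * d**-1.5
a, b = fit_power_law(d, c)
```

On real data the fit would of course carry scatter; here the synthetic points lie exactly on the curve, so the parameters are recovered to numerical precision.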

  1. Multivesicular release underlies short term synaptic potentiation independent of release probability change in the supraoptic nucleus.

    Directory of Open Access Journals (Sweden)

    Michelle E Quinlan

    Full Text Available Magnocellular neurons of the supraoptic nucleus receive glutamatergic excitatory inputs that regulate the firing activity and hormone release from these neurons. A strong, brief activation of these excitatory inputs induces a lingering barrage of tetrodotoxin-resistant miniature EPSCs (mEPSCs) that lasts for tens of minutes. This is known to accompany an immediate increase in large-amplitude mEPSCs. However, it remains unknown how long this amplitude increase can last and whether it is simply a byproduct of greater release probability. Using in vitro patch clamp recordings on acute rat brain slices, we found that a brief, high-frequency stimulation (HFS) of afferents induced a potentiation of mEPSC amplitude lasting up to 20 min. This amplitude potentiation did not correlate with changes in mEPSC frequency, suggesting that it does not reflect changes in presynaptic release probability. Nonetheless, neither a postsynaptic calcium chelator nor an NMDA receptor antagonist blocked the potentiation. Together with the known calcium dependency of HFS-induced potentiation of mEPSCs, our results imply that the mEPSC amplitude increase requires presynaptic calcium. Further analysis showed a multimodal distribution of mEPSC amplitudes, suggesting that large mEPSCs were due to multivesicular glutamate release, even late after HFS when the frequency is no longer elevated. In conclusion, high-frequency activation of excitatory synapses induces lasting multivesicular release in the SON, which is independent of changes in release probability. This represents a novel form of synaptic plasticity that may contribute to the prolonged excitatory tone necessary for generation of burst firing of magnocellular neurons.

  2. Experience with radioactivity releases

    International Nuclear Information System (INIS)

    Anderson, T.V.; Johnson, A.G.; Ringle, J.C.

    1972-01-01

    On December 11, 1970, the reactor top continuous air monitor (CAM) showed an increase in particulate air activity of an unusual nature. A check of the CAM filter with a multi-channel analyzer indicated that the majority of the activity was due to Cs-138, Cs-139, Rb-89, and Rb-90, which indicated a probable fuel element leak. The CAM filter was changed and rechecked several times, but the rubidium and cesium radionuclides were consistently identified. The procedure was followed by removing three fuel elements at a time. Since the CAM was the only instrument picking up radioactivity, it was used as the primary radiation monitor. During the search for the leaky fuel element, it was found that the element in position E-18 (triangle cut-out) was leaning against the top of the element in E-17. Particulate air activity originating from the rotating rack loading port on the reactor top was reported by OSU during the previous TRIGA Owner's Seminar. Short term relief can be obtained by inserting a standard CAM filter paper over the rotating rack loading tube opening, but this has not proved satisfactory for runs of one hour or longer. A simple filter system for the rotating rack was built, and is operated as part of the argon ventilation system. This appears to have solved the problem.

  3. Multi-component nuclear energy system to meet requirement of self-consistency

    International Nuclear Information System (INIS)

    Saito, Masaki; Artisyuk, Vladimir; Shmelev, Anotolii; Korovin, Yorii

    2000-01-01

    Environmental harmonization of nuclear energy technology is considered an absolutely necessary condition for its future successful development for peaceful use. Establishment of a Self-Consistent Nuclear Energy System that simultaneously meets four requirements (energy production, fuel production, burning of radionuclides, and safety) relies strongly on neutron excess generation. Implementation of external, non-fission-based neutron sources into a fission energy system would open the possibility of approaching a Multicomponent Self-Consistent Nuclear Energy System with unlimited fuel resources, zero radioactivity release and high protection against uncontrolled proliferation of nuclear materials. (author)

  4. Meltable magnetic biocomposites for controlled release

    Energy Technology Data Exchange (ETDEWEB)

    Müller, R., E-mail: robert.mueller@ipht-jena.de [Leibniz Institute of Photonic Technology (IPHT), P.O.B. 100239, Jena, D-07702 Germany (Germany); Zhou, M. [Institute of Organic Chemistry and Macromolecular Chemistry, Friedrich Schiller University of Jena, Humboldtstrasse 10, Jena, D-07743 Germany (Germany); Dellith, A. [Leibniz Institute of Photonic Technology (IPHT), P.O.B. 100239, Jena, D-07702 Germany (Germany); Liebert, T.; Heinze, T. [Institute of Organic Chemistry and Macromolecular Chemistry, Friedrich Schiller University of Jena, Humboldtstrasse 10, Jena, D-07743 Germany (Germany)

    2017-06-01

    New biocompatible composites with adjustable melting point in the range of 30–140 °C, consisting of magnetite nanoparticles embedded in a matrix of meltable dextran fatty acid ester, are presented; they can be softened by an applied alternating magnetic field (AMF). The chosen thermoplastic magnetic composites have a melting range close to human body temperature and can be easily shaped into disks or coating films when molten. The composite disks were loaded with green fluorescent protein (GFP) as a model protein. Controlled release of the protein was realized with a high-frequency alternating magnetic field of 20 kA/m at 400 kHz. These results showed that under an AMF the release of GFP from the magnetic composite was accelerated compared to the control sample without exposure to the AMF. Furthermore, texturing of the particles in the polymer matrix by a static magnetic field was investigated. - Highlights: • Thermoplastic biocomposites are prepared from dextran ester and magnetite particles. • The composite can be heated by an AC magnetic field above the melting temperature. • In the molten state, texturing of particles is possible and improves the heating ability. • The biopolymer could be used as a remote-controlled matrix for protein release.

  5. The Role of Acoustic Cavitation in Ultrasound-triggered Drug Release from Echogenic Liposomes

    Science.gov (United States)

    Kopechek, Jonathan A.

    Cardiovascular disease (CVD) is the leading cause of death in the United States and globally. CVD-related mortality, including coronary heart disease, heart failure, or stroke, generally occurs due to atherosclerosis, a condition in which plaques build up within arterial walls, potentially causing blockage or rupture. Targeted therapies are needed to achieve more effective treatments. Echogenic liposomes (ELIP), which consist of a lipid membrane surrounding an aqueous core, have been developed to encapsulate a therapeutic agent and/or gas bubbles for targeted delivery and ultrasound image enhancement. Under certain conditions ultrasound can cause nonlinear bubble growth and collapse, known as "cavitation." Cavitation activity has been associated with enhanced drug delivery across cellular membranes. However, the mechanisms of ultrasound-mediated drug release from ELIP have not been previously investigated. Thus, the objective of this dissertation is to elucidate the role of acoustic cavitation in ultrasound-mediated drug release from ELIP. To determine the acoustic and physical properties of ELIP, the frequency-dependent attenuation and backscatter coefficients were measured between 3 and 30 MHz. The results were compared to a theoretical model by measuring the ELIP size distribution in order to determine properties of the lipid membrane. It was found that ELIP have a broad size distribution and can provide enhanced ultrasound image contrast across a broad range of clinically-relevant frequencies. Calcein, a hydrophilic fluorescent dye, and papaverine, a lipophilic vasodilator, were separately encapsulated in ELIP and exposed to color Doppler ultrasound pulses from a clinical diagnostic ultrasound scanner in a flow system. Spectrophotometric techniques (fluorescence and absorbance measurements) were used to detect calcein or papaverine release. 
As a positive control, Triton X-100 (a non-ionic detergent) was added to ELIP samples not exposed to ultrasound in order

  6. Data acquisition backbone core DABC release v1.0

    Energy Technology Data Exchange (ETDEWEB)

    Adamczewski-Musch, Joern; Essel, Hans G.; Kurz, Nikolaus; Linev, S. [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany)

    2010-07-01

    The new experiments at FAIR require new concepts of data acquisition systems for the distribution of self-triggered, time stamped data streams over high performance networks for event building. The Data Acquisition Backbone Core (DABC) is a general purpose software framework developed for the implementation of such data acquisition systems. A DABC application consists of functional components like data input, combiner, scheduler, event builder, filter, analysis and storage which can be configured at runtime. Application specific code including the support of all kinds of data channels (front-end systems) is implemented by C++ program plug-ins. DABC is also well suited as environment for various detector and readout components test beds. A set of DABC plug-ins has been developed for the FAIR experiment CBM (Compressed Baryonic Matter) at GSI. This DABC application is used as DAQ system for test beamtimes. Front-end boards equipped with n-XYTER ASICs and ADCs are connected to read-out controller boards (ROC). From there the data is sent over Ethernet (UDP), or over optics and PCIe interface cards into Linux PCs. DABC does the controlling, event building, archiving and data serving. The first release of DABC was published in 2009 and is available under GPL license.

  7. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  8. Commercial SNF Accident Release Fractions

    Energy Technology Data Exchange (ETDEWEB)

    J. Schulz

    2004-11-05

    The purpose of this analysis is to specify and document the total and respirable fractions for radioactive materials that could be potentially released from an accident at the repository involving commercial spent nuclear fuel (SNF) in a dry environment. The total and respirable release fractions are used to support the preclosure licensing basis for the repository. The total release fraction is defined as the fraction of total commercial SNF assembly inventory, typically expressed as an activity inventory (e.g., curies), of a given radionuclide that is released to the environment from a waste form. Radionuclides are released from the inside of breached fuel rods (or pins) and from the detachment of radioactive material (crud) from the outside surfaces of fuel rods and other components of fuel assemblies. The total release fraction accounts for several mechanisms that tend to retain, retard, or diminish the amount of radionuclides that are available for transport to dose receptors or otherwise can be shown to reduce exposure of receptors to radiological releases. The total release fraction includes a fraction of airborne material that is respirable and could result in inhalation doses; this subset of the total release fraction is referred to as the respirable release fraction. Accidents may involve waste forms characterized as: (1) bare unconfined intact fuel assemblies, (2) confined intact fuel assemblies, or (3) canistered failed commercial SNF. Confined intact commercial SNF assemblies at the repository are contained in shipping casks, canisters, or waste packages. Four categories of failed commercial SNF are identified: (1) mechanically and cladding-penetration damaged commercial SNF, (2) consolidated/reconstituted assemblies, (3) fuel rods, pieces, and debris, and (4) nonfuel components. It is assumed that failed commercial SNF is placed into waste packages with a mesh screen at each end (CRWMS M&O 1999). In contrast to bare unconfined fuel assemblies, the

  9. Commercial SNF Accident Release Fractions

    International Nuclear Information System (INIS)

    Schulz, J.

    2004-01-01

    The purpose of this analysis is to specify and document the total and respirable fractions for radioactive materials that could be potentially released from an accident at the repository involving commercial spent nuclear fuel (SNF) in a dry environment. The total and respirable release fractions are used to support the preclosure licensing basis for the repository. The total release fraction is defined as the fraction of total commercial SNF assembly inventory, typically expressed as an activity inventory (e.g., curies), of a given radionuclide that is released to the environment from a waste form. Radionuclides are released from the inside of breached fuel rods (or pins) and from the detachment of radioactive material (crud) from the outside surfaces of fuel rods and other components of fuel assemblies. The total release fraction accounts for several mechanisms that tend to retain, retard, or diminish the amount of radionuclides that are available for transport to dose receptors or otherwise can be shown to reduce exposure of receptors to radiological releases. The total release fraction includes a fraction of airborne material that is respirable and could result in inhalation doses; this subset of the total release fraction is referred to as the respirable release fraction. Accidents may involve waste forms characterized as: (1) bare unconfined intact fuel assemblies, (2) confined intact fuel assemblies, or (3) canistered failed commercial SNF. Confined intact commercial SNF assemblies at the repository are contained in shipping casks, canisters, or waste packages. Four categories of failed commercial SNF are identified: (1) mechanically and cladding-penetration damaged commercial SNF, (2) consolidated/reconstituted assemblies, (3) fuel rods, pieces, and debris, and (4) nonfuel components. It is assumed that failed commercial SNF is placed into waste packages with a mesh screen at each end (CRWMS M&O 1999).
In contrast to bare unconfined fuel assemblies, the
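
    For a single radionuclide, the definitions above reduce to simple multiplications against the assembly activity inventory: the total release is the inventory times the total release fraction, and the respirable release is the subset of that total given by the respirable fraction. A minimal sketch with purely hypothetical numbers, not values from the analysis:

```python
def released_activity(inventory_ci, total_release_fraction, respirable_fraction):
    """Return (total_release_ci, respirable_release_ci) for one radionuclide.

    respirable_fraction is interpreted here as the fraction of the *total*
    release that is respirable, so the respirable release is a subset of
    the total release, as in the definition above.
    """
    total = inventory_ci * total_release_fraction
    respirable = total * respirable_fraction
    return total, respirable

# Hypothetical values for illustration only: 1e4 Ci inventory,
# 3e-5 total release fraction, 5% of the release respirable.
total, resp = released_activity(inventory_ci=1.0e4,
                                total_release_fraction=3.0e-5,
                                respirable_fraction=0.05)
```

In a full analysis these fractions would be specified per radionuclide and per waste-form category (bare, confined, or canistered failed SNF), and the respirable release feeds the inhalation dose pathway.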

  10. Iron oxide/aluminum/graphene energetic nanocomposites synthesized by atomic layer deposition: Enhanced energy release and reduced electrostatic ignition hazard

    Science.gov (United States)

    Yan, Ning; Qin, Lijun; Hao, Haixia; Hui, Longfei; Zhao, Fengqi; Feng, Hao

    2017-06-01

    Nanocomposites consisting of iron oxide (Fe2O3) and nano-sized aluminum (Al), possessing outstanding exothermic redox reaction characteristics, are highly promising nanothermite materials. However, inhibited reactant diffusion in the solid-state system makes fast and complete energy release very challenging. In this work, Al nanoparticles anchored on graphene oxide (GO/Al) were initially prepared by a solution assembly approach. Fe2O3 was deposited on the GO/Al substrates by atomic layer deposition (ALD). Simultaneously, thermal reduction of GO occurs, resulting in rGO/Al@Fe2O3 energetic composites. Differential scanning calorimetry (DSC) analysis reveals that the rGO/Al@Fe2O3 composite containing 4.8 wt% of rGO exhibits a 50% increase in energy release compared to the Al@Fe2O3 nanothermite synthesized by ALD, and an increase of about 130% compared to a random mixture of rGO/Al/Fe2O3 nanoparticles. The enhanced energy release of rGO/Al@Fe2O3 is attributed to the improved spatial distribution as well as the increased interfacial intimacy between the oxidizer and the fuel. Moreover, the rGO/Al@Fe2O3 composite with an rGO content of 9.6 wt% exhibits significantly reduced electrostatic discharge sensitivity. These findings may inspire potential pathways for engineering energetic nanocomposites with enhanced energy release and improved safety characteristics.

  11. Preliminary analysis of public dose from CFETR gaseous tritium release

    Energy Technology Data Exchange (ETDEWEB)

    Nie, Baojie [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); University of Science and Technology of China, Hefei, Anhui 230027 (China); Ni, Muyi, E-mail: muyi.ni@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); Lian, Chao; Jiang, Jieqiong [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China)

    2015-02-15

    Highlights: • Presents the amounts and dose limits for tritium release to the environment for CFETR. • Performs a preliminary simulation of the radiation dose from gaseous tritium release. • Key parameters (soil type, wind speed, stability class, effective release height, and age) were subjected to sensitivity analysis. • The tritium release amount is recalculated to be consistent with the dose limit in Chinese regulations for CFETR. - Abstract: To demonstrate tritium self-sufficiency and other engineering issues, the scientific conception of the Chinese Fusion Engineering Test Reactor (CFETR) has been proposed in China, in parallel with ITER and before a DEMO reactor. Tritium environmental safety for CFETR is an important issue and must be evaluated because of the huge amount of tritium cycling in the reactor. In this work, different tritium release scenarios for CFETR and dose limit regulations in China are introduced, and the public dose is preliminarily analyzed under normal and accidental events. Furthermore, after a sensitivity analysis of key input parameters, the public dose is reevaluated based on extreme parameters. Finally, the tritium release amount is recalculated to be consistent with the dose limit in Chinese regulations for CFETR, which provides a reference for the tritium system design of CFETR.
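
    Screening estimates of public exposure from a gaseous release are often based on a Gaussian plume dilution factor (chi/Q). The sketch below is the generic ground-level centerline formula for an elevated release, not the CFETR assessment methodology; the wind speed, dispersion parameters, and release height are illustrative constants (in practice sigma_y and sigma_z depend on downwind distance and stability class).

```python
import math

def chi_over_q(u_ms, sigma_y, sigma_z, h_m):
    """Ground-level centerline dilution factor [s/m^3] for an elevated release:

        chi/Q = exp(-h^2 / (2 sigma_z^2)) / (pi * u * sigma_y * sigma_z)

    u_ms: wind speed [m/s]; sigma_y, sigma_z: dispersion parameters [m];
    h_m: effective release height [m].
    """
    return math.exp(-h_m**2 / (2.0 * sigma_z**2)) / (math.pi * u_ms * sigma_y * sigma_z)

# Illustrative numbers: 2 m/s wind, sigmas for ~1 km downwind, 30 m stack
elevated = chi_over_q(u_ms=2.0, sigma_y=100.0, sigma_z=50.0, h_m=30.0)
ground = chi_over_q(u_ms=2.0, sigma_y=100.0, sigma_z=50.0, h_m=0.0)
```

Multiplying chi/Q by the release rate [Bq/s] gives an air concentration [Bq/m^3], which is then combined with a pathway-specific dose coefficient; the sensitivity of the result to release height is visible already in this toy comparison.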

  12. Bayesian detection of causal rare variants under posterior consistency.

    KAUST Repository

    Liang, Faming

    2013-07-26

    Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.
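
    One of the comparison methods named in the abstract, the weighted sum statistic, collapses a region's rare variants into a per-subject burden score that up-weights rarer variants. The sketch below follows the common Madsen-Browning-style weighting with allele frequencies estimated from unaffected subjects (plus a pseudo-count); it is a generic illustration on toy data, not the BRVD method itself.

```python
import numpy as np

def weighted_sum_scores(genotypes, phenotype):
    """Per-subject weighted burden score over a region's rare variants.

    genotypes: (n_subjects, n_variants) minor-allele counts in {0, 1, 2}.
    phenotype: 0/1 array (0 = unaffected). Variant j gets weight
    w_j = 1 / sqrt(n * q_j * (1 - q_j)), where q_j is the allele frequency
    estimated from unaffected subjects with a +1 pseudo-count.
    """
    controls = genotypes[phenotype == 0]
    n_u = controls.shape[0]
    q = (controls.sum(axis=0) + 1.0) / (2.0 * n_u + 2.0)
    w = 1.0 / np.sqrt(genotypes.shape[0] * q * (1.0 - q))
    return genotypes @ w

# Toy data: variant 0 is carried only by cases (phenotype 1)
g = np.array([[1, 0],
              [2, 0],
              [0, 1],
              [0, 0]])
y = np.array([1, 1, 0, 0])
scores = weighted_sum_scores(g, y)
```

A test of global association would then compare the score distribution between cases and controls (e.g., by a rank-sum statistic with permutation); the Bayesian machinery of the BRVD goes further by assigning posterior probabilities to individual variants.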

  13. Bayesian detection of causal rare variants under posterior consistency.

    Directory of Open Access Journals (Sweden)

    Faming Liang

    Full Text Available Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  14. Bayesian detection of causal rare variants under posterior consistency.

    KAUST Repository

    Liang, Faming; Xiong, Momiao

    2013-01-01

    Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  15. Modeling self-consistent multi-class dynamic traffic flow

    Science.gov (United States)

    Cho, Hsun-Jung; Lo, Shih-Ching

    2002-09-01

    In this study, we present a systematic, self-consistent multiclass multilane traffic model derived from the vehicular Boltzmann equation and the traffic dispersion model. The multilane domain is considered as a two-dimensional space, and the interaction among vehicles in the domain is described by a dispersion model. The reason we consider a multilane domain as a two-dimensional space is that the driving behavior of road users may not be restricted by lanes, especially for motorcyclists. The dispersion model, which is a nonlinear Poisson equation, is derived from car-following theory and the equilibrium assumption. Under the concept that all kinds of users share the finite road section, the density is distributed over the road by the dispersion model. In addition, the dynamic evolution of the traffic flow is determined by the systematic gas-kinetic model derived from the Boltzmann equation. Multiplying the Boltzmann equation by the zeroth-, first- and second-order moment functions, integrating both sides of the equation, and using chain rules, we can derive the continuity, motion, and variance equations, respectively. However, the second-order moment function employed in previous research, the square of the individual velocity, does not have a physical meaning in traffic flow. Although the second-order expansion results in the velocity variance equation, additional terms may be generated. The velocity variance equation we propose is derived by multiplying the Boltzmann equation by the individual velocity variance. It modifies the previous model and yields a new gas-kinetic traffic flow model. By coupling the gas-kinetic model and the dispersion model, a self-consistent system is obtained.
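
    The zeroth-order moment of the Boltzmann equation yields the continuity equation for traffic density, d(rho)/dt + d(rho*v)/dx = 0. A minimal sketch of a first-order upwind discretization with a Greenshields-type equilibrium speed is given below; this is a single-class, single-lane toy with illustrative parameters, not the authors' multilane dispersion-coupled model.

```python
import numpy as np

def step_density(rho, dx, dt, v_free=30.0, rho_max=0.2):
    """One first-order upwind step of d(rho)/dt + d(rho*v)/dx = 0,
    with Greenshields equilibrium speed v = v_free * (1 - rho/rho_max).

    rho: density [veh/m]; dx: cell size [m]; dt: step [s] (must satisfy CFL).
    """
    v = v_free * (1.0 - rho / rho_max)
    q = rho * v                           # flux [veh/s]
    # flow moves in +x (characteristics positive at these densities); periodic road
    drho = -(q - np.roll(q, 1)) * dt / dx
    return rho + drho

# 2 km periodic road, uniform background density with a localized platoon
rho = np.full(200, 0.05)
rho[90:110] = 0.09
total0 = rho.sum()
for _ in range(500):                      # CFL = v_free*dt/dx = 0.3 < 1
    rho = step_density(rho, dx=10.0, dt=0.1)
total1 = rho.sum()
```

Because the flux differences telescope on a periodic domain, the scheme conserves the total number of vehicles exactly; the full gas-kinetic model adds momentum and variance equations on top of this conservation law.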

  16. Development of controlled release spheroids using Buchanania cochinchinesis gum

    Directory of Open Access Journals (Sweden)

    Narayan Babulal Gaikwad

    2013-03-01

    Full Text Available Chirauli nut gum was isolated from the bark of Buchanania cochinchinesis (fam. Anacardiaceae) and was used as a release modifier for the preparation of Diclofenac sodium spheroids using the extrusion-spheronization technique. The effects of process variables on producing spheroids with satisfactory particle shape, size and size distribution were studied. The prepared spheroids were characterized for surface morphology, qualitative surface porosity, friability, bulk density and flow properties. In vitro studies demonstrated that the release exhibited Fickian diffusion kinetics, which was confirmed by the Higuchi and Korsmeyer-Peppas models. The physico-chemical parameters of the gum could be correlated to the in vitro dissolution profile of the spheroids. The spheroids were not able to sustain the drug release over 12 hours. A greater concentration of Chirauli nut gum, and a process that can accommodate such greater concentrations, may produce a formulation capable of significant sustained release.
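
    The Korsmeyer-Peppas model referenced above fits the early release data to Mt/Minf = k * t**n, where the exponent n diagnoses the transport mechanism (values near 0.43-0.5, depending on matrix geometry, are conventionally read as Fickian diffusion). A sketch of the log-log fit on synthetic data; k and n are illustrative, not the study's fitted values.

```python
import numpy as np

def fit_korsmeyer_peppas(t_h, frac_released):
    """Fit Mt/Minf = k * t**n by least squares in log-log space.

    Only the early portion (fraction released <= 0.6) is used, where the
    power law is conventionally considered valid.
    """
    mask = frac_released <= 0.6
    logt = np.log(t_h[mask])
    logf = np.log(frac_released[mask])
    n, logk = np.polyfit(logt, logf, 1)   # slope = n, intercept = log(k)
    return np.exp(logk), n

# Synthetic Fickian-like profile: k = 0.2, n = 0.45 (illustrative)
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])     # time [h]
f = 0.2 * t**0.45                                # fraction released
k, n = fit_korsmeyer_peppas(t, f)
```

With measured dissolution data, the same fit (plus an analogous fit of fraction released versus sqrt(t) for the Higuchi model) is how the Fickian interpretation in the abstract would be checked.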

  17. Birth control - slow release methods

    Science.gov (United States)

    Contraception - slow-release hormonal methods; Progestin implants; Progestin injections; Skin patch; Vaginal ring ... might want to consider a different birth control method. SKIN PATCH The skin patch is placed on ...

  18. DEVELOPMENT OF SUSTAINED RELEASE TABLETS ...

    African Journals Online (AJOL)

    2013-12-31

    Dec 31, 2013 ... The SR dosage forms that release drugs pH independently in .... were determined; Post compression parameters such as weight variation test, hardness, ... Based on the ICH guidelines 12, the stability studies were carried out ...

  19. Standardised classification of pre-release development in male-brooding pipefish, seahorses, and seadragons (Family Syngnathidae)

    Science.gov (United States)

    2012-01-01

    Background Members of the family Syngnathidae share a unique reproductive mode termed male pregnancy. Males carry eggs in specialised brooding structures for several weeks and release free-swimming offspring. Here we describe a systematic investigation of pre-release development in syngnathid fishes, reviewing available data for 17 species distributed across the family. This work is complemented by in-depth examinations of the straight-nosed pipefish Nerophis ophidion, the black-striped pipefish Syngnathus abaster, and the potbellied seahorse Hippocampus abdominalis. Results We propose a standardised classification of early syngnathid development that extends from the activation of the egg to the release of newborn. The classification consists of four developmental periods – early embryogenesis, eye development, snout formation, and juvenile – which are further divided into 11 stages. Stages are characterised by morphological traits that are easily visible in live and preserved specimens using incident-light microscopy. Conclusions Our classification is derived from examinations of species representing the full range of brooding-structure complexity found in the Syngnathidae, including tail-brooding as well as trunk-brooding species, which represent independent evolutionary lineages. We chose conspicuous common traits as diagnostic features of stages to allow for rapid and consistent staging of embryos and larvae across the entire family. In view of the growing interest in the biology of the Syngnathidae, we believe that the classification proposed here will prove useful for a wide range of studies on the unique reproductive biology of these male-brooding fish. PMID:23273265

  20. PCDD/PCDF release inventories

    Energy Technology Data Exchange (ETDEWEB)

    Fiedler, H. [UNEP Chemicals, Chatelaine (Switzerland)

    2004-09-15

    The Stockholm Convention on Persistent Organic Pollutants (POPs) entered into force on 17 May 2004 with 50 Parties. In May 2004, 59 countries had ratified or acceded to the Convention. The objective of the Convention is "to protect human health and the environment from persistent organic pollutants". For intentionally produced POPs, e.g., pesticides and industrial chemicals such as hexachlorobenzene and polychlorinated biphenyls, this will be achieved by stopping production and use. For unintentionally generated POPs, such as polychlorinated dibenzo-p-dioxins (PCDD) and polychlorinated dibenzofurans (PCDF), measures have to be taken to "reduce the total releases derived from anthropogenic sources"; the final goal is ultimate elimination, where feasible. Under the Convention, Parties have to establish and maintain release inventories to prove the continuous release reduction. Since many countries do not have the technical and financial capacity to measure all releases from all potential PCDD/PCDF sources, UNEP Chemicals has developed the "Standardized Toolkit for the Identification and Quantification of Dioxin and Furan Releases" ("Toolkit" for short), a methodology to estimate annual releases from a number of sources. With this methodology, annual releases can be estimated by multiplying process-specific default emission factors provided in the Toolkit with national activity data. At the seventh session of the Intergovernmental Negotiating Committee, the Toolkit was recommended to be used by countries when reporting national release data to the Conference of the Parties. The Toolkit is especially used by developing countries and countries with economies in transition where no measured data are available. Results from Uruguay, Thailand, Jordan, Philippines, and Brunei Darussalam have been published.
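
    The Toolkit estimation method described above reduces to multiplying a default emission factor by national activity data for each source class and summing. A minimal sketch; the source names and factor values below are hypothetical placeholders, not the Toolkit's actual defaults:

```python
# Hypothetical source classes with default emission factors
# (ug TEQ per tonne of activity) and national activity data (tonnes per year).
emission_factors = {"open_burning": 40.0, "medical_waste_incineration": 3000.0}
activity_data = {"open_burning": 1200.0, "medical_waste_incineration": 15.0}

def annual_release_ug_teq(factors, activity):
    """Annual release estimate: sum of emission factor x activity over all sources."""
    return sum(factors[source] * activity[source] for source in activity)

total = annual_release_ug_teq(emission_factors, activity_data)  # ug TEQ per year
```

    In the real inventory, each source class is further split by technology level (e.g. with or without flue-gas treatment), each with its own default factor.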

  1. Tradeoffs in distributed databases

    OpenAIRE

    Juntunen, R. (Risto)

    2016-01-01

    Abstract In a distributed database, data is spread throughout the network into separate nodes with different DBMS systems (Date, 2000). According to the CAP theorem, three database properties (consistency, availability and partition tolerance) cannot all be achieved simultaneously in distributed database systems; two of these properties can be achieved, but not all three at the same time (Brewer, 2000). Since this theorem there has b...

  2. Dose apportionment using statistical modeling of the effluent release

    International Nuclear Information System (INIS)

    Datta, D.

    2011-01-01

    Nuclear power plants are always operated under the guidelines stipulated by the regulatory body. These guidelines contain the technical specifications of the specific power plant and set the discharge limits for radioactive effluent released into the environment through atmospheric and aquatic routes. However, operational constraints may sometimes violate the technical specifications, so that the dose apportioned to that plant is not satisfied. At a site with multiple facilities, the sum of the doses apportioned to all the facilities should be constrained to 1 mSv/year to members of the public. The dose apportionment scheme stipulates the limits on gaseous and liquid effluent released into the environment. The existing methodology of dose apportionment is subjective in nature, which may set the discharge limits for the atmospheric and aquatic routes in an ad hoc manner. A scientific basis for dose apportionment is preferable to a judgemental one from the point of view of harmonizing the apportionment. This paper presents an attempt to establish the discharge limits of gaseous and liquid effluent on the basis of existing release values. Release data for a few years (for example, 10 years) for a nuclear power station are taken into consideration. Bootstrap, a resampling technique, is applied to these data sets to generate a population, which subsequently provides the corresponding distribution of effluent release. The cumulative distribution of this population is constructed, and from it the 95th percentile (upper bound) of the discharge limit of the radioactive effluents is computed. The dose apportioned to a facility is then evaluated using this estimated upper bound of the release limit. The paper describes the detail of the bootstrap method in evaluating the
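
    The bootstrap step described in this abstract can be illustrated as follows; this is a minimal stdlib-only sketch with hypothetical release figures, not the authors' implementation:

```python
import random

def bootstrap_upper_bound(data, q=0.95, n_boot=2000, seed=1):
    """Pool n_boot bootstrap resamples (with replacement) of the data
    and return the q-th percentile of the pooled bootstrap population."""
    rng = random.Random(seed)
    pooled = []
    for _ in range(n_boot):
        pooled.extend(rng.choices(data, k=len(data)))
    pooled.sort()
    return pooled[min(int(q * len(pooled)), len(pooled) - 1)]

# Ten years of hypothetical annual effluent release figures (arbitrary units)
releases = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2, 9.1, 12.7, 11.9, 10.2]
upper_bound = bootstrap_upper_bound(releases)
```

    The resulting 95th-percentile value would then serve as the discharge limit from which the facility's apportioned dose is evaluated.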

  3. Hydraulic running and release tool with mechanical emergency release

    International Nuclear Information System (INIS)

    Baker, S.F.

    1991-01-01

    This patent describes a setting tool for connection in a well string to position a tubular member in a well bore. It comprises: a mandrel adapted to be connected to the well string; an outer sleeve surrounding the mandrel and releasably secured thereto; a latch nut releasably connected to the outer sleeve; piston means sealingly engaging the mandrel; shear means releasably securing the piston to the latch nut to maintain the latch nut releasably connected to the tubular member; the mandrel having port means for conducting fluid pressure from the well string to release the piston means from the latch nut; cooperating engageable surfaces on the piston and latch nut to reengage them together after the piston moves a predetermined longitudinal distance relative to the latch nut; and additional cooperating engageable surfaces on the latch nut and the outer sleeve which are engageable when the piston and engaged latch nut are moved a predetermined additional longitudinal distance by fluid pressure to secure the engaged piston and latch nut with the outer sleeve for retrieval along with the mandrel from the well bore.

  4. Discounting future health benefits: the poverty of consistency arguments.

    Science.gov (United States)

    Nord, Erik

    2011-01-01

    In economic evaluation of health care, mainstream practice is to discount benefits at the same rate as costs. But the main papers in which this practice is advocated have missed a distinction between two quite different evaluation problems: (1) how much does the time of program occurrence matter for value, and (2) how much do delays in health benefits from programs implemented at a given time matter? The papers have furthermore focused on logical and arithmetic arguments rather than on real value considerations. These 'consistency arguments' are at best trivial, at worst logically flawed. At the end of the day, there is a sensible argument for equal discounting of costs and benefits rooted in the microeconomic theory of rational, utility-maximising consumers' saving behaviour. But even this argument is problematic, first because the model is not clearly supported by empirical observations of individuals' time preferences for health, and second because it relates only to evaluation in terms of overall individual utility. It does not provide grounds for claiming that decision makers with a wider societal perspective, which may include concerns for fair distribution, need to discount. Copyright © 2010 John Wiley & Sons, Ltd.
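
    The discounting practice at issue is the standard present-value formula PV = B / (1 + r)^t. A minimal sketch with illustrative numbers (3% rate, 20-year delay):

```python
def present_value(benefit, annual_rate, delay_years):
    """Value today of a benefit realised delay_years from now,
    discounted at a constant annual rate."""
    return benefit / (1.0 + annual_rate) ** delay_years

# A benefit of 100 QALYs realised 20 years from now, discounted at 3% per year
pv = present_value(100.0, 0.03, 20)
```

    Equal discounting applies the same rate r to costs and benefits; the debate in the paper is whether that equality is justified when the "benefit" is future health rather than money.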

  5. Consistent Steering System using SCTP for Bluetooth Scatternet Sensor Network

    Science.gov (United States)

    Dhaya, R.; Sadasivam, V.; Kanthavel, R.

    2012-12-01

    Wireless communication is the best way to convey information from source to destination with flexibility and mobility, and Bluetooth is a wireless technology suitable for short distances. A wireless sensor network (WSN), on the other hand, consists of spatially distributed autonomous sensors that cooperatively monitor physical or environmental conditions, such as temperature, sound, vibration, pressure, motion or pollutants. Using the Bluetooth piconet technique in sensor nodes limits network depth and placement. The introduction of the Scatternet removes these network restrictions, but with a lack of reliability in data transmission, and as the depth of the network increases, routing becomes more difficult. No authors have so far focused on the reliability of routing in Scatternet sensor networks. This paper illustrates the proposed system architecture and routing mechanism to increase reliability. Another objective is to use a reliable transport protocol that uses the multi-homing concept and supports multiple streams to prevent head-of-line blocking. The results show that the Scatternet sensor network has lower packet loss than the existing system, even in a congested environment, making it suitable for surveillance applications.

  6. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    KAUST Repository

    Dashti, M.

    2013-09-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.

  7. A new mixed self-consistent field procedure

    Science.gov (United States)

    Alvarez-Ibarra, A.; Köster, A. M.

    2015-10-01

    A new approach for the calculation of three-centre electronic repulsion integrals (ERIs) is developed, implemented and benchmarked in the framework of auxiliary density functional theory (ADFT). The so-called mixed self-consistent field (mixed SCF) divides the computationally costly ERIs in two sets: far-field and near-field. Far-field ERIs are calculated using the newly developed double asymptotic expansion as in the direct SCF scheme. Near-field ERIs are calculated only once prior to the SCF procedure and stored in memory, as in the conventional SCF scheme. Hence the name, mixed SCF. The implementation is particularly powerful when used in parallel architectures, since all RAM available are used for near-field ERI storage. In addition, the efficient distribution algorithm performs minimal intercommunication operations between processors, avoiding a potential bottleneck. One-, two- and three-dimensional systems are used for benchmarking, showing substantial time reduction in the ERI calculation for all of them. A Born-Oppenheimer molecular dynamics calculation for the Na+55 cluster is also shown in order to demonstrate the speed-up for small systems achievable with the mixed SCF. Dedicated to Sourav Pal on the occasion of his 60th birthday.

  8. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    International Nuclear Information System (INIS)

    Dashti, M; Law, K J H; Stuart, A M; Voss, J

    2013-01-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ 0 . We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μ y . Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager–Machlup functional defined on the Cameron–Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier–Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. (paper)

  9. Consistent microscopic and phenomenological analysis of composite particle optical potential

    International Nuclear Information System (INIS)

    Mukhopadhyay, Sheela; Srivastava, D.K.; Ganguly, N.K.

    1976-01-01

    A microscopic calculation of the composite particle optical potential has been done using a realistic nucleon-helion interaction and folding it with the density distribution of the targets. The second-order effects were simulated by introducing a scaling factor which was searched on to reproduce the experimental scattering results. The composite particle optical potential was also derived from the nucleon-nucleus optical potential, with the second-order term explicitly treated as a parameter. Elastic scattering of 20 MeV 3H on targets ranging from 40Ca to 208Pb has also been analysed using a phenomenological optical model. Agreement of these results with the above calculations verified the consistency of the microscopic theory. However, the equivalent sharp radius calculated with the n-helion interaction was observed to be smaller than the phenomenological value. This was attributed to the absence of saturation effects in the density-independent interaction used. Saturation has been introduced by a density-dependent term of the form (1 − cζ^(2/3)), where ζ is the compound density of the target-helion system. (author)

  10. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang

    2017-03-06

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  11. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptionally complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  12. Competitive release and outbreaks of non-target pests associated with transgenic Bt cotton.

    Science.gov (United States)

    Zeilinger, Adam R; Olson, Dawn M; Andow, David A

    2016-06-01

    The adoption of transgenic Bt cotton has, in some cases, led to environmental and economic benefits through reduced insecticide use. However, the distribution of these benefits and associated risks among cotton growers and cotton-growing regions has been uneven due in part to outbreaks of non-target or secondary pests, thereby requiring the continued use of synthetic insecticides. In the southeastern USA, Bt cotton adoption has resulted in increased abundance of and damage from stink bug pests, Euschistus servus and Nezara viridula (Heteroptera: Pentatomidae). While the impact of increased stink bug abundance has been well-documented, the causes have remained unclear. We hypothesize that release from competition with Bt-susceptible target pests may drive stink bug outbreaks in Bt cotton. We first examined the evidence for competitive release of stink bugs through meta-analysis of previous studies. We then experimentally tested if herbivory by Bt-susceptible Helicoverpa zea increases stink bug leaving rates and deters oviposition on non-Bt cotton. Consistent with previous studies, we found differences in leaving rates only for E. servus, but we found that both species strongly avoided ovipositing on H. zea-damaged plants. Considering all available evidence, competitive release of stink bug populations in Bt cotton likely contributes to outbreaks, though the relative importance of competitive release remains an open question. Ecological risk assessments of Bt crops and other transgenic insecticidal crops would benefit from greater understanding of the ecological mechanisms underlying non-target pest outbreaks and greater attention to indirect ecological effects more broadly.

  13. Molecular scaffold reorganization at the transmitter release site with vesicle exocytosis or botulinum toxin C1.

    Science.gov (United States)

    Stanley, Elise F; Reese, Tom S; Wang, Gary Z

    2003-10-01

    Neurotransmitter release sites at the freeze-fractured frog neuromuscular junction are composed of inner and outer paired rows of large membrane particles, the putative calcium channels, anchored by the ribs of an underlying protein scaffold. We analysed the locations of the release site particles as a reflection of the scaffold structure, comparing particle distributions in secreting terminals with those where secretion was blocked with botulinum toxin A, which cleaves a small segment off SNAP-25, or botulinum toxin C1, which cleaves the cytoplasmic domain of syntaxin. In the idle terminal the inner and outer paired rows were located approximately 25 and approximately 44 nm, respectively, from the release site midline. However, adjacent to vesicular fusion sites both particle rows were displaced towards the midline by approximately 25%. The intervals between the particles along each row were examined by a nearest-neighbour approach. In control terminals the peak interval along the inner row was approximately 17 nm, consistent with previous reports and the spacing of the scaffold ribs. While the average distance between particles in the outer row was also approximately 17 nm, a detailed analysis revealed short 'linear clusters' with a approximately 14 nm interval. These clusters were enriched at vesicle fusion sites, suggesting an association with the docking sites, and were eliminated by botulinum C1, but not A. Our findings suggest, first, that the release site scaffold ribs undergo a predictable, and possibly active, shortening during exocytosis and, second, that at the vesicle docking site syntaxin plays a role in the cross-linking of the rib tips to form the vesicle docking sites.
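
    The nearest-neighbour interval analysis described above can be sketched in one dimension as follows; the particle positions are hypothetical, chosen only to illustrate the ~17 nm and ~14 nm spacings discussed:

```python
def nearest_neighbour_intervals(positions):
    """Distance from each particle to its closest neighbour along a row (nm)."""
    return [
        min(abs(p - q) for j, q in enumerate(positions) if j != i)
        for i, p in enumerate(positions)
    ]

# Hypothetical particle positions along one row of a release site, in nm
row = [0.0, 17.0, 33.5, 51.0, 65.0]
intervals = nearest_neighbour_intervals(row)
```

    A histogram of such intervals pooled over many rows is what reveals the dominant ~17 nm spacing and the shorter ~14 nm 'linear cluster' intervals in the outer row.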

  14. Bioactive peptides released during of digestion of processed milk

    Science.gov (United States)

    Most of the proteins contained in milk consist of alpha-s1-, alpha-s2-, beta- and kappa-casein, and some of the peptides contained in these caseins may impart health benefits. To determine if processing affected release of peptides, samples of raw (R), homogenized (H), homogenized and pasteurized (...

  15. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
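
    The correlated-sampling idea can be illustrated in the bivariate case: correlate two standard normal draws through the Cholesky factor of the correlation matrix, then exponentiate to obtain lognormal (inherently positive) samples. This is a minimal sketch of the general technique, not the specific algorithms of the paper; function names and parameter values are illustrative:

```python
import math
import random

def sample_bivariate_lognormal(mu, sigma, rho, n, seed=0):
    """Draw n pairs from a bivariate lognormal distribution.

    Two standard normals are correlated via the 2x2 Cholesky factor
    [[1, 0], [rho, sqrt(1 - rho^2)]], scaled and shifted to the target
    log-space moments, then exponentiated coordinate-wise."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x1 = mu[0] + sigma[0] * z1
        x2 = mu[1] + sigma[1] * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
        pairs.append((math.exp(x1), math.exp(x2)))
    return pairs

pairs = sample_bivariate_lognormal(mu=(0.0, 0.0), sigma=(0.2, 0.2), rho=0.8, n=5000)
```

    Note that rho here is the correlation of the underlying normals (the log-space correlation); the correlation of the lognormal samples themselves is somewhat smaller, which is the kind of transformation-of-correlation-coefficients issue the paper addresses.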

  16. THE INTRINSIC EDDINGTON RATIO DISTRIBUTION OF ACTIVE GALACTIC NUCLEI IN STAR-FORMING GALAXIES FROM THE SLOAN DIGITAL SKY SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Mackenzie L.; Hickox, Ryan C.; Black, Christine S.; Hainline, Kevin N.; DiPompeo, Michael A. [Department of Physics and Astronomy, Dartmouth College, Hanover, NH 03755 (United States); Goulding, Andy D. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2016-07-20

    An important question in extragalactic astronomy concerns the distribution of black hole accretion rates of active galactic nuclei (AGNs). Based on observations at X-ray wavelengths, the observed Eddington ratio distribution appears as a power law, while optical studies have often yielded a lognormal distribution. There is increasing evidence that these observed discrepancies may be due to contamination by star formation and other selection effects. Using a sample of galaxies from the Sloan Digital Sky Survey Data Release 7, we test whether or not an intrinsic Eddington ratio distribution that takes the form of a Schechter function is consistent with previous work suggesting that young galaxies in optical surveys have an observed lognormal Eddington ratio distribution. We simulate the optical emission line properties of a population of galaxies and AGNs using a broad, instantaneous luminosity distribution described by a Schechter function near the Eddington limit. This simulated AGN population is then compared to observed galaxies via their positions on an emission line excitation diagram and Eddington ratio distributions. We present an improved method for extracting the AGN distribution using BPT diagnostics that allows us to probe over one order of magnitude lower in Eddington ratio, counteracting the effects of dilution by star formation. We conclude that for optically selected AGNs in young galaxies, the intrinsic Eddington ratio distribution is consistent with a possibly universal, broad power law with an exponential cutoff, as this distribution is observed in old, optically selected galaxies and X-rays.
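
    A Schechter-type Eddington ratio distribution, p(λ) ∝ λ^(−α) exp(−λ/λ*), can be sampled with a power-law proposal (by inverse CDF) plus rejection on the exponential cutoff. A minimal sketch with illustrative parameter values, not the values fitted in the paper:

```python
import math
import random

def sample_schechter(alpha, lam_star, lam_min, n, seed=0):
    """Draw Eddington ratios from p(lam) ~ lam**(-alpha) * exp(-lam / lam_star).

    Proposal: pure power law above lam_min via inverse CDF (requires alpha > 1);
    the exponential cutoff is then applied by rejection."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        # inverse CDF of the truncated power law: lam = lam_min * (1-u)^(-1/(alpha-1))
        lam = lam_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
        # accept with probability equal to the exponential cutoff factor (always <= 1)
        if rng.random() < math.exp(-lam / lam_star):
            out.append(lam)
    return out

# Illustrative parameters, not the fitted values from the survey analysis
lams = sample_schechter(alpha=1.8, lam_star=1.0, lam_min=1e-4, n=1000)
```

    Samples drawn this way cluster near lam_min with a heavy power-law tail truncated near λ*, which is the broad "power law with exponential cutoff" shape the abstract contrasts with a lognormal.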

  17. Centralized versus distributed propulsion

    Science.gov (United States)

    Clark, J. P.

    1982-01-01

    The functions and requirements of auxiliary propulsion systems are reviewed. None of the three major tasks (attitude control, stationkeeping, and shape control) can be performed by a collection of thrusters at a single central location. If a centralized system is defined as a collection of separated clusters, made up of the minimum number of propulsion units, then such a system can provide attitude control and stationkeeping for most vehicles. A distributed propulsion system is characterized by more numerous propulsion units in a regularly distributed arrangement. Various proposed large space systems are reviewed and it is concluded that centralized auxiliary propulsion is best suited to vehicles with a relatively rigid core. These vehicles may carry a number of flexible or movable appendages. A second group, consisting of one or more large flexible flat plates, may need distributed propulsion for shape control. There is a third group, consisting of vehicles built up from multiple shuttle launches, which may be forced into a distributed system because of the need to add additional propulsion units as the vehicles grow. The effects of distributed propulsion on a beam-like structure were examined. The deflection of the structure under both translational and rotational thrusts is shown as a function of the number of equally spaced thrusters. When two thrusters only are used it is shown that location is an important parameter. The possibility of using distributed propulsion to achieve minimum overall system weight is also examined. Finally, an examination of the active damping by distributed propulsion is described.

  18. Multiplicative Consistency for Interval Valued Reciprocal Preference Relations

    OpenAIRE

    Wu, Jian; Chiclana, Francisco

    2014-01-01

    The multiplicative consistency (MC) property of interval additive reciprocal preference relations (IARPRs) is explored, and then the consistency index is quantified by the multiplicative consistency estimated IARPR. The MC property is used to measure the level of consistency of the information provided by the experts and also to propose the consistency index induced ordered weighted averaging (CI-IOWA) operator. The novelty of this operator is that it aggregates individual IARPRs in such ...

  19. Predicting hydrocarbon release from soil

    International Nuclear Information System (INIS)

    Poppendieck, D.; Loehr, R.C.

    2002-01-01

    The remediation of hazardous chemicals from soils can be a lengthy and costly process. As a result, recent regulatory initiatives have focused on risk-based corrective action (RBCA) approaches. Such approaches attempt to identify the amount of chemical that can be left at a site with contaminated soil and still be protective of human health and the environment. For hydrocarbons in soils to pose risk to human health and the environment, the hydrocarbons must be released from the soil and accessible to microorganisms, earthworms, or other higher-level organisms. The sorption of hydrocarbons to soil can reduce the availability of the hydrocarbon to receptors. Typically in soils and sediments, there is an initial fast release of a hydrocarbon from the soil to the aqueous phase followed by a slower release of the remaining hydrocarbon. The rate and extent of slow release can influence aqueous hydrocarbon concentrations and the fate and transport of hydrocarbons in the subsurface. Once the fast fraction of the chemical has been removed from the soil, the remaining fraction may desorb at a rate that natural mechanisms can attenuate. Hence, active remediation may be needed only until the fast fraction has been removed. However, the fast fraction is a soil- and chemical-specific parameter. This presentation will present a Tier I-type protocol that has been developed to quickly estimate the fraction of hydrocarbons that are readily released from the soil matrix to the aqueous phase. Previous research in our laboratory and elsewhere has used long-term desorption (four months) studies to determine the readily released fraction. This research shows that a single short-term (less than two weeks) batch extraction procedure provides a good estimate of the fast released fraction derived from long-term experiments. This procedure can be used as a tool to rapidly evaluate the release and bioavailability of

  20. Self-consistent ECCD calculations with bootstrap current

    International Nuclear Information System (INIS)

    Decker, J.; Bers, A.; Ram, A. K; Peysson, Y.

    2003-01-01

    To achieve high performance, steady-state operation in tokamaks, it is increasingly important to find the appropriate means for modifying and sustaining the pressure and magnetic shear profiles in the plasma. In such advanced scenarios, especially in the vicinity of internal transport barrier, RF induced currents have to be calculated self-consistently with the bootstrap current, thus taking into account possible synergistic effects resulting from the momentum space distortion of the electron distribution function f e . Since RF waves can cause the distribution of electrons to become non-Maxwellian, the associated changes in parallel diffusion of momentum between trapped and passing particles can be expected to modify the bootstrap current fraction; conversely, the bootstrap current distribution function can enhance the current driven by RF waves. For this purpose, a new, fast and fully implicit solver has been recently developed to carry out computations including new and detailed evaluations of the interactions between bootstrap current (BC) and Electron Cyclotron current drive (ECCD). Moreover, Ohkawa current drive (OKCD) appears to be an efficient method for driving current when the fraction of trapped particles is large. OKCD in the presence of BC is also investigated. Here, results are illustrated around projected tokamak parameters in high performance scenarios of AlcatorC-MOD. It is shown that by increasing n // , the EC wave penetration into the bulk of the electron distribution is greater, and since the resonance extends up to high p // values, this situation is the usual ECCD based on the Fisch-Boozer mechanism concerning passing particles. However, because of the close vicinity of the trapped boundary at r/a=0.7, this process is counterbalanced by the Ohkawa effect, possibly leading to a negative net current. 
Therefore, by injecting the EC wave in the opposite toroidal direction (reversed n∥), the current driven by OKCD may be 70% larger than that of ECCD, with a choice of EC

  1. GEWEX SRB Shortwave Release 4

    Science.gov (United States)

    Cox, S. J.; Stackhouse, P. W., Jr.; Mikovitz, J. C.; Zhang, T.

    2017-12-01

The NASA/GEWEX Surface Radiation Budget (SRB) project produces shortwave and longwave surface and top-of-atmosphere radiative fluxes for the 1983-near-present time period. Spatial resolution is 1 degree. The new Release 4 uses the newly processed ISCCP HXS product as its primary input for cloud and radiance data. The ninefold increase in pixel number compared to the previous ISCCP DX allows finer gradations in cloud fraction in each grid box. It will also allow higher spatial resolutions (0.5 degree) in future releases. In addition to the input data improvements, several important algorithm improvements have been made since Release 3. These include recalculated atmospheric transmissivities and reflectivities yielding a less transmissive atmosphere. The calculations also include variable aerosol composition, allowing for the use of a detailed aerosol history from the Max Planck Institut Aerosol Climatology (MAC). Ocean albedo and snow/ice albedo are also improved from Release 3. Total solar irradiance is now variable, averaging 1361 W m-2. Water vapor is taken from ISCCP's nnHIRS product. Results from GSW Release 4 are presented and analyzed. Early comparisons to surface measurements show improved agreement.

  2. Aluminum corrosion product release kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Matt, E-mail: Matthew.Edwards@cnl.ca; Semmler, Jaleh; Guzonas, Dave; Chen, Hui Qun; Toor, Arshad; Hoendermis, Seanna

    2015-07-15

Highlights: • Release of Al corrosion product was measured in simulated post-LOCA sump solutions. • Increased boron was found to enhance Al release kinetics at similar pH. • Models of Al release as functions of time, temperature, and pH were developed. - Abstract: The kinetics of aluminum corrosion product release was examined in solutions representative of post-LOCA sump water for both pressurized water and pressurized heavy-water reactors. Coupons of AA 6061 T6 were exposed to solutions in the pH 7–11 range at 40, 60, 90 and 130 °C. Solution samples were analyzed by inductively coupled plasma atomic emission spectroscopy, and coupon samples were analyzed by secondary ion mass spectrometry. The results show a distinct “boron effect” on the release kinetics, expected to be caused by an increase in the solubility of the aluminum corrosion products. New models were developed to describe both sets of data as functions of temperature, time, and pH (where applicable).

  3. Preparation of Cotton-Wool-Like Poly(lactic acid)-Based Composites Consisting of Core-Shell-Type Fibers

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2015-11-01

In previous works, we reported the fabrication of cotton-wool-like composites consisting of siloxane-doped vaterite and poly(l-lactic acid) (SiVPCs). Various irregularly shaped bone voids can be filled with the composite, which effectively supplies calcium and silicate ions, enhancing the bone formation by stimulating the cells. The composites, however, were brittle and showed an initial burst release of ions. In the present work, to improve the mechanical flexibility and ion release, the composite fiber was coated with a soft, thin layer consisting of poly(d,l-lactic-co-glycolic acid) (PLGA). A coaxial electrospinning technique was used to prepare a cotton-wool-like material comprising “core-shell”-type fibers with a diameter of ~12 µm. The fibers, which consisted of SiVPC coated with a ~2-µm-thick PLGA layer, were mechanically flexible; even under a uniaxial compressive load of 1.5 kPa, the cotton-wool-like material did not exhibit fracture of the fibers and, after removing the load, showed a ~60% recovery. In Tris buffer solution, the initial burst release of calcium and silicate ions from the “core-shell”-type fibers was effectively controlled, and the ions were slowly released after one day. Thus, the mechanical flexibility and ion-release behavior of the composites were drastically improved by the thin PLGA coating.

  4. Stochastic Modeling of Radioactive Material Releases

    Energy Technology Data Exchange (ETDEWEB)

    Andrus, Jason [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pope, Chad [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    Nonreactor nuclear facilities operated under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple to use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA was developed using the MATLAB coding framework. The software application has a graphical user input. SODA can be installed on both Windows and Mac computers and does not require MATLAB to function. SODA provides improved risk understanding leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC, rather it is viewed as an easy to use supplemental tool to help improve risk understanding and support better informed decisions. The work was
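The sampling approach described in this record can be sketched in a few lines: draw each uncertain input from a user-chosen distribution (or use a traditional point value when the distribution is unknown), compute the dose for that trial, and repeat to build up a dose distribution rather than a point estimate. The five-factor product model and every parameter value below are illustrative placeholders, not SODA's actual calculation.

```python
import random
import statistics

random.seed(7)  # fixed seed so the sketch is reproducible

def sample(dist):
    """Draw one value from a distribution spec, or return a point value."""
    if isinstance(dist, tuple):
        kind, a, b = dist
        if kind == "uniform":
            return random.uniform(a, b)
        if kind == "normal":
            return random.gauss(a, b)
    return dist  # traditional single point value

def dose_distribution(inputs, n_trials=10_000):
    """Monte Carlo dose: each trial multiplies one sample of every factor."""
    doses = []
    for _ in range(n_trials):
        d = 1.0
        for dist in inputs.values():
            d *= sample(dist)
        doses.append(d)
    return doses

# Hypothetical five-factor release/dose model (all values invented):
inputs = {
    "material_at_risk": ("uniform", 80.0, 120.0),      # grams
    "damage_ratio": ("uniform", 0.5, 1.0),
    "airborne_release_fraction": ("normal", 1e-3, 1e-4),
    "chi_over_q": 1e-4,                                 # point value, s/m^3
    "dose_per_gram_released": 2.0,                      # rem/g, hypothetical
}
doses = dose_distribution(inputs)
mean_dose = statistics.mean(doses)
p95_dose = statistics.quantiles(doses, n=20)[18]  # 95th percentile
```

The distribution output (here summarized by its mean and 95th percentile) is what supports risk-informed decisions: a point estimate equal to `mean_dose` would hide how often the 95th-percentile dose is exceeded.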

  5. Stochastic Modeling of Radioactive Material Releases

    International Nuclear Information System (INIS)

    Andrus, Jason; Pope, Chad

    2015-01-01

    Nonreactor nuclear facilities operated under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple to use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA was developed using the MATLAB coding framework. The software application has a graphical user input. SODA can be installed on both Windows and Mac computers and does not require MATLAB to function. SODA provides improved risk understanding leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC, rather it is viewed as an easy to use supplemental tool to help improve risk understanding and support better informed decisions. The work was

  6. Training Materials for Release 3

    DEFF Research Database (Denmark)

    Wake, Jo Dugstad; Hansen, Cecilie; Debus, Kolja

This document, D7.4 – Training Materials for Release 3, provides an overview of the training material for version 3 of the NEXT-TELL tools and methods. Previous documents submitted as part of work package 7, which is about teacher training, are D7.1 – Training Concept, D7.2 – Training Materials for Release 1 and D7.3 – Training Materials for Release 2. D7.4 builds on D7.1, D7.2, and D7.3. D7.4 contains further development of previous work within WP7, essentially a revised theoretical approach to the teacher training, and an expansion of the notion of tool training. The media in use have been expanded...

  7. Controlled Release from Recombinant Polymers

    Science.gov (United States)

    Price, Robert; Poursaid, Azadeh; Ghandehari, Hamidreza

    2014-01-01

Recombinant polymers provide a high degree of molecular definition for correlating structure with function in controlled release. The wide array of amino acids available as building blocks for these materials lends many advantages, including biorecognition, biodegradability, potential biocompatibility, and control over mechanical properties, among other attributes. Genetic engineering and DNA manipulation techniques enable the optimization of structure for precise control over spatial and temporal release. Unlike the majority of chemical synthetic strategies used, recombinant DNA technology has allowed for the production of monodisperse polymers with specifically defined sequences. Several classes of recombinant polymers have been used for controlled drug delivery. These include, but are not limited to, elastin-like, silk-like, and silk-elastinlike proteins, as well as emerging cationic polymers for gene delivery. In this article, progress and prospects of recombinant polymers used in controlled release will be reviewed. PMID:24956486

  8. Nanostructured Diclofenac Sodium Releasing Material

    Science.gov (United States)

    Nikkola, L.; Vapalahti, K.; Harlin, A.; Seppälä, J.; Ashammakhi, N.

    2008-02-01

Various techniques have been developed to produce second-generation biomaterials for tissue repair. These include extrusion, molding, salt leaching, spinning, etc., but success in regenerating tissues has been limited. It is important to develop a porous material, yet with a fibrous structure, for it to be biomimetic. In biological tissues, the extracellular matrix usually contains fibers at the nanoscale. To produce nanostructures, self-assembly or electrospinning can be used. Adding a drug release function to such a material may advance applications further for use in controlled tissue repair, turning the resulting device into a multifunctional porous, fibrous structure that supports cells and releases drug in order to control tissue reactions. A bioabsorbable poly(ɛ-caprolactone-co-D,L-lactide) 95/5 (PCL) was made into a diluted solution using a solvent, to which 2 wt% of diclofenac sodium (DS) was added. Nano-fibers were made by electrospinning onto a substrate. The microstructure of the resulting nanomat was studied using SEM, and drug release profiles with UV/VIS spectroscopy. Thickness of the electrospun nanomat was about 2 mm. SEM analysis showed that polymeric nano-fibers containing drug particles form a highly interconnected porous nanostructure. Average diameter of the nano-fibers was 130 nm. There was a high burst peak in drug release, which decreased to low levels after one day. The polymer used has a slow degradation rate, and though the nanomat was highly porous with a large surface area, the drug release rate is slow. It is feasible to develop a nano-fibrous porous structure of bioabsorbable polymer loaded with a test drug. Drug release is targeted at improving the properties of the biomaterial for use in controlled tissue repair and regeneration.

  9. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, the neglect of this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows to explicitly estimate the magnitude of model deficiency. Both features are missing in available evaluation methods so far. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. 
A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the
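As a rough illustration of the Monte Carlo machinery this record refers to, here is a minimal random-walk Metropolis-Hastings sampler. The Gaussian proposal and the standard-normal toy target are stand-ins chosen for this sketch; the thesis' specific proposal distribution and the actual nuclear-data posterior are not reproduced here.

```python
import math
import random

def metropolis_hastings(log_target, x0, step, n_samples, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x, log_p = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)     # symmetric proposal
        log_p_new = log_target(proposal)
        # Accept with probability min(1, target(proposal)/target(x)).
        if math.log(rng.random()) < log_p_new - log_p:
            x, log_p = proposal, log_p_new
        samples.append(x)
    return samples

# Toy target: standard normal, as a stand-in for an evaluation posterior.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, step=1.0,
                              n_samples=50_000)
burned = samples[5_000:]                         # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

Efficiency of such a scheme is driven almost entirely by how well the proposal matches the target, which is why a purpose-built proposal distribution (as in the thesis) can outperform a generic one.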

  10. Limited Releases of Krsko NPP

    International Nuclear Information System (INIS)

    Breznik, B.; Kovac, A.

    2001-01-01

    Full text: Krsko Nuclear Power Plant is about 700 MW Pressurised Water Reactor plant located in Slovenia close to the border with Croatia. The authorised limit for the radioactive releases is basically set to 50 μSv effective dose per year to the members of the public. There is also additional limitation of total activities released in a year and concentration. The poster presents the effluents of the year 2000 and evaluated dose referring to the limits and to the natural and other sources of radiation around the plant. (author)

  11. Fuel pin bowing and related investigation of WWER-440 control rod influence on power release inside of neighbouring fuel pins

    International Nuclear Information System (INIS)

    Mikus, J.

    2005-01-01

The purpose of this work is to investigate the influence of the WWER-440 control rod (CR) on the spatial power distribution, especially from the viewpoint of the values and gradients that could result in static and cyclic loads with consequences such as fuel pin bowing. As is known, a CR can cause power peaks in the peripheral fuel pins of adjacent operating assemblies because of the butt-joint design of the absorbing adapter to the CR fuel part, that is, the presence of a water cavity resulting in a flash-up of thermal neutrons. As a consequence, besides the well-known peaks in the axial power distribution, power gradients can occur inside the mentioned fuel pins. Because of the complicated geometry and material composition of the CR, detailed calculations concerning both phenomena are complicated as well. It is therefore useful to acquire appropriate experimental data to investigate this influence and compare it with calculations. Since detailed power distributions cannot be obtained in an NPP, the needed information is provided by experiments on research reactors. For measurements inside fuel pins, special (e.g. track) detectors placed between fuel pellets are used. Such work is relatively complicated and time-consuming, so an evaluation based on mathematical modelling and numerical approximation was proposed, by means of which, using the measured power release in selected fuel pins, information about the power release inside one of these fuel pins can be obtained. For this purpose, an experiment on the light-water, zero-power research reactor LR-0 was carried out, and axial power distribution measurements were performed in a WWER-440-type core near an authentic CR model. Application of the above evaluation method is demonstrated on one investigated fuel pin neighbouring the CR by means of the following results: 1. Axial power distribution inside the investigated fuel pin in two opposite positions on its pellet surface that are situated to

  12. ACRR fission product release tests: ST-1 and ST-2

    International Nuclear Information System (INIS)

    Allen, M.D.; Stockman, H.W.; Reil, K.O.; Grimley, A.J.; Camp, W.J.

    1988-01-01

    Two experiments (ST-1 and ST-2) have been performed in the Annular Core Research Reactor (ACRR) at Sandia National Laboratories (SNLA) to obtain time-resolved data on the release of fission products from irradiated fuels under light water reactor (LWR) severe accident conditions. Both experiments were conducted in a highly reducing environment at maximum fuel temperatures of greater than 2400 K. These experiments were designed specifically to investigate the effect of increased total pressure on fission product release; ST-1 was performed at approximately 0.16 MPa and ST-2 was run at 1.9 MPa, whereas other parameters were matched as closely as possible. Release rate data were measured for Cs, I, Ba, Sr, Eu, Te, and U. The release rates were higher than predicted by existing codes for Ba, Sr, Eu, and U. Te release was very low, but Te did not appear to be sequestered by the zircaloy cladding; it was evenly distributed in the fuel. In addition, in posttest analysis a unique fuel morphology (fuel swelling) was observed which may have enhanced fission product release, especially in the high pressure test (ST-2). These data are compared with analytical results from the CORSOR correlation and the VICTORIA computer model
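The CORSOR correlation mentioned above expresses the fractional release rate of a fission product as an exponential function of fuel temperature, k = A·exp(B·T), so at constant temperature the fraction released after time t is F = 1 − exp(−k·t). The sketch below uses placeholder coefficients, not the published CORSOR values for any species.

```python
import math

def corsor_release_fraction(A, B, temp_c, minutes):
    """CORSOR-type release: rate k = A*exp(B*T) in 1/min, F = 1 - exp(-k*t)."""
    k = A * math.exp(B * temp_c)          # fractional release rate, 1/min
    return 1.0 - math.exp(-k * minutes)   # cumulative fraction released

# Placeholder coefficients for a volatile species (NOT published values).
A, B = 1e-4, 2e-3
f_low = corsor_release_fraction(A, B, temp_c=1500.0, minutes=30.0)
f_high = corsor_release_fraction(A, B, temp_c=2200.0, minutes=30.0)
```

The exponential temperature dependence is why the >2400 K fuel temperatures in ST-1/ST-2 dominate the release; note that a correlation of this form has no pressure term, which is one reason measured releases can deviate from CORSOR predictions.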

  13. Product consistency leach tests of Savannah River Site radioactive waste glasses

    International Nuclear Information System (INIS)

    Bibler, N.E.; Bates, J.K.

    1990-01-01

The product consistency test (PCT) is a glass leach test developed at the Savannah River Site (SRS) to confirm the durability of radioactive nuclear waste glasses that will be produced in the Defense Waste Processing Facility. The PCT is a seven-day, crushed-glass leach test in deionized water at 90 °C. Final leachates are filtered and acidified prior to analysis. To demonstrate the reproducibility of the PCT when performed remotely, SRS and Argonne National Laboratory have performed the PCT on samples of two radioactive glasses. The tests were also performed to compare the releases of the radionuclides with the major nonradioactive glass components and to determine if radiation from the glass was affecting the results of the PCT. The test was performed in triplicate at each laboratory. For the major soluble elements in the glass, B, Li, Na, and Si, each investigator obtained relative precisions in the range 2-5% in the triplicate tests. This range indicates good precision for the PCT when performed remotely with master-slave manipulators in a shielded-cell environment. When the results of the two laboratories were compared to each other, the agreement was within 20%. Normalized concentrations for the nonradioactive and radioactive elements in the PCT leachates measured at both facilities indicated that the radionuclides were released from the glass more slowly than the major soluble elements in the glass. For both laboratories, the normalized releases for both glasses were in the general order Li ∼ B ∼ Na > Si > Cs-137 > Sb-125 < Sr-90. The normalized releases for the major soluble elements and the final pH values in the tests with radioactive glass are consistent with those for nonradioactive glasses with similar compositions. This indicates that there was no significant effect of radiation on the results of the PCT
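For a record like this, the normalized release of an element is typically obtained by dividing its leachate concentration by its mass fraction in the glass, NC_i = C_i / f_i, and the 2-5% "relative precision" quoted above is the percent relative standard deviation of triplicate results. A sketch with hypothetical triplicate boron data (concentrations and mass fraction invented for illustration):

```python
def normalized_concentration(c_mg_per_l, mass_fraction):
    """PCT normalized concentration NC_i = C_i / f_i, returned in g/L."""
    return (c_mg_per_l / 1000.0) / mass_fraction

def relative_precision(values):
    """Percent relative standard deviation (sample std / mean * 100)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * (var ** 0.5) / mean

# Hypothetical triplicate boron leachate concentrations (mg/L) and a
# hypothetical boron mass fraction in the glass:
b_triplicate = [19.2, 19.8, 20.1]
f_boron = 0.025
nc_b = [normalized_concentration(c, f_boron) for c in b_triplicate]
rsd_percent = relative_precision(b_triplicate)
```

Dividing by the mass fraction is what makes releases of elements present at very different levels in the glass (e.g. B versus Cs-137) directly comparable, which is how the Li ∼ B ∼ Na > Si > radionuclide ordering above is established.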

  14. Radioimmunological and clinical studies with luteinizing hormone releasing hormone (LRH)

    International Nuclear Information System (INIS)

    Dahlen, H.G.

    1986-01-01

A radioimmunoassay for luteinizing hormone releasing hormone (LRH) has been established, tested and applied. Optimal conditions for its performance with regard to incubation time, incubation temperature, and concentration of antiserum and radiolabelled LRH have been established. The specificity of the LRH immunoassay was investigated. Problems with direct radioimmunoassay measurement of LRH in plasma are encountered. The LRH distribution in various tissues of the rat is investigated. By means of a system for continuous monitoring of LH and FSH in women, the lowest effective dose of LRH causing a significant release of LH and FSH could be established. (Auth.)

  15. Impact of industrial nuclear releases into the English Channel

    International Nuclear Information System (INIS)

    Germain, P.; Guegueniat, P.

    1992-01-01

The nuclear fuel reprocessing plant at La Hague is the main source of releases of weakly radioactive waste into the English Channel; there are also some contributions from nuclear power stations along the coast. Indicator species, seawater samples and sediments are used to study the distribution and transfer mechanisms of radionuclides in Channel waters. The observed pattern of radiolabelled zones is in good agreement with a hydrodynamic model for the Channel. The variations of activity with time are discussed in relation to releases from La Hague

  16. History of the CERN Web Software Public Releases

    CERN Document Server

    Fluckiger, Francois; CERN. Geneva. IT Department

    2016-01-01

    This note is an extended version of the article “Licencing the Web” (http://home.web.cern.ch/topics/birthweb/licensing-web) published by CERN, Nov 2013, in the “Birth of the Web” series of articles (http://home.cern/topics/birth-web). It describes the successive steps of the public release of the CERN Web software, from public domain to open source, and explains their rationale. It provides in annexes historical documents including release announcement and texts of the licences used by CERN and MIT in public software distributions.

  17. Experimental investigation of cavitation induced air release

    Directory of Open Access Journals (Sweden)

    Kowalski Karoline

    2017-01-01

Variations in cross-sectional areas may lead to pressure drops below a critical value, such that cavitation and air release are provoked in hydraulic systems. Due to a relatively slow dissolution of gas bubbles, the performance of hydraulic systems will be affected on long time scales by the gas phase. Therefore predictions of air production rates are desirable to describe the system characteristics. Existing investigations on generic geometries such as micro-orifice flows show an outgassing process due to hydrodynamic cavitation which takes place on time scales far shorter than diffusion processes. The aim of the present investigation is to find a correlation between global, hydrodynamic flow characteristics and cavitation induced undissolved gas fractions generated behind generic flow constrictions such as an orifice or venturi tube. Experimental investigations are realised in a cavitation channel that enables an independent adjustment of the pressure level upstream and downstream of the orifice. Released air fractions are determined by means of shadowgraphy imaging. First results indicate that an increased cavitation activity leads to a rapid increase in undissolved gas volume only in the choking regime. The frequency distribution of generated gas bubble size seems to depend only indirectly on the cavitation intensity driven by an increase of downstream coalescence events due to a more densely populated bubbly flow.

  18. Experimental investigation of cavitation induced air release

    Science.gov (United States)

    Kowalski, Karoline; Pollak, Stefan; Hussong, Jeanette

    Variations in cross-sectional areas may lead to pressure drops below a critical value, such that cavitation and air release are provoked in hydraulic systems. Due to a relatively slow dissolution of gas bubbles, the performance of hydraulic systems will be affected on long time scales by the gas phase. Therefore predictions of air production rates are desirable to describe the system characteristics. Existing investigations on generic geometries such as micro-orifice flows show an outgassing process due to hydrodynamic cavitation which takes place on time scales far shorter than diffusion processes. The aim of the present investigation is to find a correlation between global, hydrodynamic flow characteristics and cavitation induced undissolved gas fractions generated behind generic flow constrictions such as an orifice or venturi tube. Experimental investigations are realised in a cavitation channel that enables an independent adjustment of the pressure level upstream and downstream of the orifice. Released air fractions are determined by means of shadowgraphy imaging. First results indicate that an increased cavitation activity leads to a rapid increase in undissolved gas volume only in the choking regime. The frequency distribution of generated gas bubble size seems to depend only indirectly on the cavitation intensity driven by an increase of downstream coalescence events due to a more densely populated bubbly flow.

  19. Poisson distribution

    NARCIS (Netherlands)

    Hallin, M.; Piegorsch, W.; El Shaarawi, A.

    2012-01-01

The random variable X taking values 0, 1, 2, …, x, … with probabilities p_λ(x) = e^(−λ) λ^x / x!, where λ > 0, is called a Poisson variable, and its distribution a Poisson distribution, with parameter λ. The Poisson distribution with parameter λ can be obtained as the limit of the binomial distribution with parameters n and p, as n → ∞ and p → 0 in such a way that np → λ.
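The limit statement can be checked numerically: with p = λ/n, the binomial pmf approaches the Poisson pmf as n grows.

```python
import math

def poisson_pmf(lam, x):
    """p_lambda(x) = exp(-lam) * lam**x / x!"""
    return math.exp(-lam) * lam ** x / math.factorial(x)

def binomial_pmf(n, p, x):
    """C(n, x) * p**x * (1-p)**(n-x)"""
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

lam = 3.0
# As n grows with p = lam/n (so np = lam), the worst-case pmf gap shrinks.
errors = []
for n in (10, 100, 10_000):
    err = max(abs(binomial_pmf(n, lam / n, x) - poisson_pmf(lam, x))
              for x in range(10))
    errors.append(err)
```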

  20. Privacy, Time Consistent Optimal Labour Income Taxation and Education Policy

    OpenAIRE

    Konrad, Kai A.

    1999-01-01

    Incomplete information is a commitment device for time consistency problems. In the context of time consistent labour income taxation privacy reduces welfare losses and increases the effectiveness of public education as a second best policy.

  1. FMCG companies specific distribution channels

    Directory of Open Access Journals (Sweden)

    Ioana Barin

    2009-12-01

Distribution includes all activities undertaken by the producer, alone or in cooperation, from the completion of the finished products or services until they are in the possession of consumers. Distribution consists of the following major components: distribution channels or marketing channels, which together form a distribution network, and logistics or physical distribution. To be achieved effectively, distribution of goods requires a number of activities and operational processes related to the transit of goods from producer to consumer, under the best conditions, using existing distribution channels and the logistics system. One of the essential functions of distribution is performing acts of sale, through which, with the actual movement of goods, their change of ownership takes place, that is, the successive transfer of ownership from producer to consumer. This is an itinerary in the economic cycle of goods, called the distribution channel.

  2. Distributed Visualization

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed Visualization allows anyone, anywhere, to see any simulation, at any time. Development focuses on algorithms, software, data formats, data systems and...

  3. Savannah River Site radioiodine atmospheric releases and offsite maximum doses

    International Nuclear Information System (INIS)

    Marter, W.L.

    1990-01-01

Radioisotopes of iodine have been released to the atmosphere from the Savannah River Site since 1955. The releases, mostly from the 200-F and 200-H Chemical Separations areas, consist of the isotopes I-129 and I-131. Small amounts of I-131 and I-133 have also been released from reactor facilities and the Savannah River Laboratory. This reference memorandum was issued to summarize our current knowledge of releases of radioiodines and resultant maximum offsite doses. This memorandum supplements the reference memorandum by providing more detailed supporting technical information. Doses reported in this memorandum from consumption of the milk containing the highest I-131 concentration following the 1961 I-131 release incident are about 1% higher than reported in the reference memorandum. This is the result of using unrounded concentrations of I-131 in milk in this memorandum. It is emphasized here that this technical report does not constitute a dose reconstruction in the same sense as the dose reconstruction effort currently underway at Hanford. This report uses existing published data for radioiodine releases and existing transport and dosimetry models

  4. SU-F-19A-08: Optimal Time Release Schedule of In-Situ Drug Release During Permanent Prostate Brachytherapy

    International Nuclear Information System (INIS)

    Cormack, R; Ngwa, W; Makrigiorgos, G; Tangutoori, S; Rajiv, K; Sridhar, S

    2014-01-01

Purpose: Permanent prostate brachytherapy spacers can be used to deliver sustained doses of radiosensitizing drug directly to the target, in order to enhance the radiation effect. Implantable nanoplatforms for chemo-radiation therapy (INCeRTs) have a maximum drug capacity and can be engineered to control the drug release schedule. The optimal schedule for sensitization during continuous low-dose-rate irradiation is unknown. This work studies the optimal release schedule of drug for both traditional sensitizers and those that work by suppressing DNA repair processes. Methods: Six brachytherapy treatment plans were used to model the anatomy and implant geometry and to calculate the spatial distribution of radiation dose and drug concentrations for a range of drug diffusion parameters. Three-state partial differential equations (cells healthy, damaged, or dead) modeled the effect of continuous radiation (radiosensitivities α, β) and cellular repair (repair time t_r) on a cell population. Radiosensitization was modeled as a concentration-dependent change in α, β, or t_r, with variable release duration under the constraint of fixed total drug release. Average cell kill was used to measure effectiveness. Sensitization by means of both enhanced damage and reduced repair was studied. Results: Optimal release duration is dependent on the concentration of radiosensitizer compared to the saturation concentration (c_sat) above which additional sensitization does not occur. Long-duration drug release when enhancing α or β maximizes cell death when drug concentrations are generally over c_sat. Short-term release is optimal for concentrations below saturation. Sensitization by suppressing repair has a similar though less distinct trend that is more affected by the radiation dose distribution. Conclusion: Models of sustained local radiosensitization show potential to increase the effectiveness of radiation in permanent prostate brachytherapy. INCeRTs with high drug capacity produce the greatest
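A drastically simplified version of such a three-state model (spatial diffusion omitted, so ODEs rather than the paper's PDEs) can be integrated explicitly; the rate constants, the exponentially decaying dose rate, and the way sensitization scales α below are placeholders, not the paper's parameters.

```python
import math

def simulate(alpha=0.2, t_repair=1.0, dose_rate0=0.1, decay=0.01,
             sens=1.0, dt=0.01, t_end=200.0):
    """Euler integration of a simplified three-state (healthy/damaged/dead)
    cell model under exponentially decaying low-dose-rate irradiation.
    `sens` mimics a radiosensitizer scaling the damage coefficient."""
    h, d, k = 1.0, 0.0, 0.0          # population fractions
    t = 0.0
    while t < t_end:
        rate = sens * alpha * dose_rate0 * math.exp(-decay * t)  # hit rate
        dh = -rate * h + d / t_repair            # damage vs. repair
        dd = rate * h - d / t_repair - rate * d  # a second hit kills
        dk = rate * d
        h, d, k = h + dt * dh, d + dt * dd, k + dt * dk
        t += dt
    return h, d, k

h0, d0, k0 = simulate(sens=1.0)   # no radiosensitizer
h1, d1, k1 = simulate(sens=2.0)   # drug doubles the effective alpha
```

Because cell kill here requires a second hit before repair, doubling the damage coefficient roughly quadruples the killed fraction, illustrating why concentration-dependent sensitization (up to c_sat) can strongly change the optimal release schedule.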

  5. Generalized contexts and consistent histories in quantum mechanics

    International Nuclear Information System (INIS)

    Losada, Marcelo; Laura, Roberto

    2014-01-01

We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions that involve properties at different times.

  6. Personality and Situation Predictors of Consistent Eating Patterns

    OpenAIRE

Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.

    2015-01-01

Introduction: A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studi...

  7. Energy Release in Solar Flares,

    Science.gov (United States)

    1982-10-01

Plasma Research, Stanford University; P. Kaufmann, CRAA/CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico, São Paulo, SP, Brasil; D.F...three phases of energy release in solar flares (Sturrock, 1980). However, a recent article by Feldman et al. (1982) points to a significant

  8. Lignin based controlled release coatings

    NARCIS (Netherlands)

    Mulder, W.J.; Gosselink, R.J.A.; Vingerhoeds, M.H.; Harmsen, P.F.H.; Eastham, D.

    2011-01-01

    Urea is a commonly used fertilizer. Due to its high water-solubility, misuse easily leads to excess nitrogen levels in the soil. The aim of this research was to develop an economically feasible and biodegradable slow-release coating for urea. For this purpose, lignin was selected as coating

  9. Controlled Release from Zein Matrices

    NARCIS (Netherlands)

Bouman, Jacob; Belton, Peter; Venema, Paul; van der Linden, Erik; de Vries, Renko; Qi, Sheng

    2016-01-01

Purpose: In earlier studies, the corn protein zein was found to be suitable as a sustained-release agent, yet the range of drugs with which zein has been studied remains small. Here, zein is used as a sole excipient for drugs differing in hydrophobicity and isoelectric point: indomethacin,

  10. Dry release of suspended nanostructures

    DEFF Research Database (Denmark)

    Forsén, Esko Sebastian; Davis, Zachary James; Dong, M.

    2004-01-01

    , the technique enables long time storage and transportation of produced devices without the risk of stiction. By combining the dry release method with a plasma deposited anti-stiction coating both fabrication induced stiction, which is mainly caused by capillary forces originating from the dehydration...

  11. Two Impossibility Results on the Converse Consistency Principle in Bargaining

    OpenAIRE

    Youngsub Chun

    1999-01-01

We present two impossibility results on the converse consistency principle in the context of bargaining. First, we show that there is no solution satisfying Pareto optimality, contraction independence, and converse consistency. Next, we show that there is no solution satisfying Pareto optimality, strong individual rationality, individual monotonicity, and converse consistency.

  12. Personality consistency analysis in cloned quarantine dog candidates

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2017-01-01

In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups using the modified Puppy Aptitude Test, which consists of ten subtests to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and that some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned dogs. This study implies that personality consistency could be one way to analyse traits of puppies.

  13. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...

  14. 26 CFR 1.338-8 - Asset and stock consistency.

    Science.gov (United States)

    2010-04-01

    ... that are controlled foreign corporations. (6) Stock consistency. This section limits the application of... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Asset and stock consistency. 1.338-8 Section 1... (CONTINUED) INCOME TAXES Effects on Corporation § 1.338-8 Asset and stock consistency. (a) Introduction—(1...

  15. Stronger Consistency and Semantics for Low-Latency Geo-Replicated Storage

    Science.gov (United States)

    2013-06-01

Wallach, Mike Burrows, Tushar Chandra, Andrew Fikes, and Robert E. Gruber. Bigtable: A distributed storage system for structured data. ACM TOCS, 26(2...propagation for weakly consistent replication. In SOSP, October 1997. [60] Larry Peterson, Andy Bavier, and Sapan Bhatia. VICCI: A programmable cloud

  16. Controlled release of diuron from an alginate-bentonite formulation: water release kinetics and soil mobility study.

    Science.gov (United States)

    Fernández-Pérez, M; Villafranca-Sánchez, M; González-Pradas, E; Flores-Céspedes, F

    1999-02-01

The herbicide diuron was incorporated in alginate-based granules to obtain controlled-release (CR) properties. The standard formulation (alginate-herbicide-water) was modified by the addition of different sorbents. The effect on the diuron release rate caused by incorporating natural and acid-treated bentonites in the alginate formulation was studied by immersing the granules in water under static conditions. The release of diuron was diffusion-controlled. The time taken for 50% of the active ingredient to be released into water, T(50), was calculated for comparison of the formulations. The addition of bentonite to the alginate-based formulation produced the highest T(50) values, indicating the slowest release of diuron. The mobility of technical and formulated diuron was compared using soil columns. The use of alginate-based CR formulations containing bentonite produced a reduced vertical distribution of the active ingredient compared to the technical product and the commercial formulation. Sorption capacities of the various soil constituents for diuron were also determined using batch experiments.

  17. Released air during vapor and air cavitation

    Energy Technology Data Exchange (ETDEWEB)

    Jablonská, Jana, E-mail: jana.jablonska@vsb.cz; Kozubková, Milada, E-mail: milada.kozubkova@vsb.cz [VŠB-Technical University of Ostrava, Faculty of Mechanical Engineering, Department of Hydromechanics and Hydraulic Equipment, 17. listopadu 15, 708 33 Ostrava-Poruba (Czech Republic)

    2016-06-30

Cavitation today is a very important problem that is addressed by both experimental and mathematical methods. The article deals with the generation of cavitation in a convergent-divergent nozzle of rectangular cross section. Measurements of pressure, flow rate, temperature, and the amount of dissolved air in the liquid, together with visualization of the cavitation area using a high-speed camera, were performed for different flow rates. The measurement results were generalized by dimensionless analysis, which allows easy detection of cavitation in the nozzle. For numerical simulation, a multiphase mathematical model of cavitation consisting of water and vapor was created. During verification, disagreement with the measurements at higher flow rates was found, so the model was extended to a three-phase mathematical model (water, vapor, and air) to account for the release of dissolved air. For the mathematical modeling, the multiphase RNG k-ε turbulence model for low-Reynolds-number flow with vapor and air cavitation was used. Subsequently, the sizes of the cavitation area were verified. In the article, the inlet pressure and loss coefficient are evaluated as functions of the amount of air added to the mathematical model. On the basis of this approach, a methodology may be created to estimate the amount of released air added at the inlet to the modeled area.
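Dimensionless detection of cavitation onset is commonly based on the cavitation number σ = (p − p_v)/(½ρv²); the article's own dimensionless analysis may differ, and all numeric values in this sketch are assumed:

```python
# Sketch: cavitation number sigma = (p - p_v) / (0.5 * rho * v^2),
# a standard dimensionless group for detecting cavitation onset.
# Lower sigma => cavitation more likely (sigma ~ 1 is a typical threshold).
# All numeric values here are assumed for illustration.
def cavitation_number(p, p_vapor, rho, v):
    """p: local static pressure [Pa], p_vapor: vapor pressure [Pa],
    rho: density [kg/m^3], v: characteristic velocity [m/s]."""
    return (p - p_vapor) / (0.5 * rho * v ** 2)

rho_water = 998.0    # kg/m^3 at ~20 C
p_vapor   = 2339.0   # Pa, water vapor pressure at ~20 C

# Doubling the throat velocity quarters the cavitation number,
# which is why cavitation appears first at higher flow rates.
print(cavitation_number(101325.0, p_vapor, rho_water, 10.0))
print(cavitation_number(101325.0, p_vapor, rho_water, 20.0))
```

The 1/v² scaling matches the abstract's observation that disagreement between the two-phase model and experiment appeared at the higher flow rates, where dissolved air release becomes significant.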

  18. Energy market review releases draft report

    International Nuclear Information System (INIS)

    Anon

    2002-01-01

The Energy Market Review's draft report makes recommendations consistent with the Australian Gas Association's (AGA) submissions in a number of areas. In particular, it has endorsed: 1. the need for an independent review of the gas access regime, to address the deficiencies in current access regulation identified by the Productivity Commission's Review of the National Access Regime; 2. the need for greater upstream gas market competition; 3. the principle that significant regulatory decisions should be subject to clear merits and judicial review; and 4. the need to avoid restrictions on retail energy prices. The report also endorses the need for a 'technology neutral' approach to greenhouse emissions abatement policy. It states that 'many of the current measures employed to reduce greenhouse gas emissions are poorly targeted', and that they 'target technologies or fuel types rather than greenhouse gas abatement.' Additionally, it explicitly recognises the key conclusions of the AGA's recently released Research Paper, Reducing Greenhouse Emissions from Water Heating: Natural Gas as a Cost-effective Option. The draft report recommends the development of an economy-wide emissions trading system, to achieve a more cost-effective approach to greenhouse abatement.

  19. Estimate of the instant release fraction for UO2 and MOX fuel at t=0

    International Nuclear Information System (INIS)

    Johnson, L.; Poinssot, C; Ferry, C.; Lovera, P.

    2004-07-01

The Spent Fuel Stability Project of the European Union aims to develop a model predicting the radionuclide release rate from spent fuel as a function of time under geological disposal conditions. In the first phase of the project, an important aspect of the model development is defining the instant release fraction (IRF), which represents the fraction of the inventory of safety-relevant radionuclides that may be rapidly released from the fuel and fuel assembly materials at the time of canister breaching. The locations of these preferentially released radionuclides, their quantities, the evidence for their release, and proposed estimates of the IRF for the key safety-relevant radionuclides, for the case of fuel shortly after discharge from the reactor, are the subjects of the present report. Spent fuel assemblies comprise several materials, including uranium oxide, Zircaloy, and the various steels or nickel alloys used in the structural components of fuel assemblies. Information on the distribution of both activation products and fission products in all these materials must be taken into account in deriving IRF values. Information is presented on the radionuclide distributions in the various materials, and IRF values for key radionuclides are proposed. The IRF increases with burnup, particularly above 50 GWd/tIHM. The estimated IRF values are functions of the numbers of spent fuel assemblies in the various burnup intervals. Use of bounding fission gas release (FGR) values leads to overestimates of the derived IRF values. The problem of uncertainties must be given attention because there is considerable scatter in the FGR data, as well as in data from fission product leaching studies. The following approaches and definitions are adopted in this report: a) Best estimate, based on a good understanding of the mechanism and a good quality database; b) Bounding or pessimistic estimate based on data and process understanding that provides a maximum for the range of derived
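The dependence of the estimated IRF on how many assemblies fall in each burnup interval can be sketched as an inventory-weighted average. The interval boundaries, assembly counts, and per-interval IRF values below are hypothetical placeholders, not the report's data:

```python
# Sketch: inventory-weighted instant release fraction (IRF) across
# burnup intervals. All numbers are invented for illustration; the
# only feature taken from the abstract is that IRF rises with burnup,
# particularly above 50 GWd/tIHM.
intervals = [
    # (burnup interval GWd/tIHM, assemblies, assumed IRF for one nuclide)
    ("<40",   1200, 0.02),
    ("40-50",  800, 0.04),
    (">50",    300, 0.08),
]

def weighted_irf(intervals):
    """Average IRF weighted by the number of assemblies per interval."""
    total = sum(n for _, n, _ in intervals)
    return sum(n * irf for _, n, irf in intervals) / total

print(round(weighted_irf(intervals), 4))
```

Shifting assemblies into the high-burnup interval raises the weighted IRF, which is why the report frames its estimates as functions of the burnup distribution of the inventory rather than a single fuel-average value.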

  20. 28 CFR 2.83 - Release planning.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Release planning. 2.83 Section 2.83... Release planning. (a) All grants of parole shall be conditioned on the development of a suitable release... parole date for purposes of release planning for up to 120 days without a hearing. If efforts to...