Two Impossibility Results on the Converse Consistency Principle in Bargaining
Youngsub Chun
1999-01-01
We present two impossibility results on the converse consistency principle in the context of bargaining. First, we show that there is no solution satisfying Pareto optimality, contraction independence, and converse consistency. Next, we show that there is no solution satisfying Pareto optimality, strong individual rationality, individual monotonicity, and converse consistency.
Planck 2013 results. XXXI. Consistency of the Planck data
DEFF Research Database (Denmark)
Ade, P. A. R.; Arnaud, M.; Ashdown, M.
2014-01-01
The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved ... in the HFI channels would result in shifts in the posterior distributions of parameters of less than 0.3σ except for As, the amplitude of the primordial curvature perturbations at 0.05 Mpc-1, which changes by about 1σ. We extend these comparisons to include the sky maps from the complete nine-year mission ...
Risk aversion vs. the Omega ratio : Consistency results
Balder, Sven; Schweizer, Nikolaus
This paper clarifies when the Omega ratio and related performance measures are consistent with second order stochastic dominance and when they are not. To avoid consistency problems, the threshold parameter in the ratio should be chosen as the expected return of some benchmark – as is commonly done
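The threshold dependence discussed in this abstract can be made concrete with a small numerical sketch. The Python function below is an illustrative implementation (not the paper's code) of the empirical Omega ratio: the average gain above a threshold divided by the average shortfall below it. Setting the threshold to a benchmark's expected return follows the recommendation summarized above; the sample returns are hypothetical.

```python
def omega_ratio(returns, threshold):
    """Empirical Omega ratio: mean gain above the threshold divided by
    the mean shortfall below it."""
    n = len(returns)
    mean_gain = sum(max(r - threshold, 0.0) for r in returns) / n
    mean_loss = sum(max(threshold - r, 0.0) for r in returns) / n
    if mean_loss == 0.0:
        return float("inf")  # no observations below the threshold
    return mean_gain / mean_loss

# Threshold set to a (hypothetical) benchmark's expected return of zero.
ratio = omega_ratio([0.10, -0.05, 0.02, 0.03], 0.0)  # roughly 3
```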
Planck 2013 results. XXXI. Consistency of the Planck data
Ade, P A R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A.J; Barreiro, R.B; Battaner, E; Benabed, K; Benoit-Levy, A; Bernard, J.P; Bersanelli, M; Bielewicz, P; Bond, J.R; Borrill, J; Bouchet, F.R; Burigana, C; Cardoso, J.F; Catalano, A; Challinor, A; Chamballu, A; Chiang, H.C; Christensen, P.R; Clements, D.L; Colombi, S; Colombo, L.P.L; Couchot, F; Coulais, A; Crill, B.P; Curto, A; Cuttaia, F; Danese, L; Davies, R.D; Davis, R.J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Desert, F.X; Dickinson, C; Diego, J.M; Dole, H; Donzelli, S; Dore, O; Douspis, M; Dupac, X; Ensslin, T.A; Eriksen, H.K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Gonzalez-Nuevo, J; Gorski, K.M.; Gratton, S.; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F.K; Hanson, D; Harrison, D; Henrot-Versille, S; Herranz, D; Hildebrandt, S.R; Hivon, E; Hobson, M; Holmes, W.A.; Hornstrup, A; Hovest, W.; Huffenberger, K.M; Jaffe, T.R; Jaffe, A.H; Jones, W.C; Keihanen, E; Keskitalo, R; Knoche, J; Kunz, M; Kurki-Suonio, H; Lagache, G; Lahteenmaki, A; Lamarre, J.M; Lasenby, A; Lawrence, C.R; Leonardi, R; Leon-Tavares, J; Lesgourgues, J; Liguori, M; Lilje, P.B; Linden-Vornle, M; Lopez-Caniego, M; Lubin, P.M; Macias-Perez, J.F; Maino, D; Mandolesi, N; Maris, M; Martin, P.G; Martinez-Gonzalez, E; Masi, S; Matarrese, S; Mazzotta, P; Meinhold, P.R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschenes, M.A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Norgaard-Nielsen, H.U; Noviello, F; Novikov, D; Novikov, I; Oxborrow, C.A; Pagano, L; Pajot, F; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Pearson, T.J; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Pratt, G.W; Prunet, S; Puget, J.L; Rachen, J.P; Reinecke, M; Remazeilles, M; 
Renault, C; Ricciardi, S.; Ristorcelli, I; Rocha, G.; Roudier, G; Rubino-Martin, J.A; Rusholme, B; Sandri, M; Scott, D; Stolyarov, V; Sudiwala, R; Sutton, D; Suur-Uski, A.S; Sygnet, J.F; Tauber, J.A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L.A; Wandelt, B.D; Wehus, I K; White, S D M; Yvon, D; Zacchei, A; Zonca, A
2014-01-01
The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse Galactic emission and for strong unresolved sources. Difference maps covering angular scales from 8°...
How oxygen gave rise to eukaryotic sex
Hörandl, Elvira; Speijer, Dave
2018-01-01
... years ago. The large amount of ROS coming from a bacterial endosymbiont gave rise to DNA damage and vast increases in host genome mutation rates. Eukaryogenesis and chromosome evolution represent adaptations to oxidative stress. The host, an archaeon, most probably already had repair mechanisms
Survey and selection of assessment methodologies for GAVE options
International Nuclear Information System (INIS)
Weterings, R.
1999-05-01
The Dutch government is interested in the possibilities for a market introduction of new gaseous and liquid energy carriers. To this purpose the GAVE-programme was recently set up. This study is carried out within the framework of the GAVE-programme and aims at the selection of methodologies for assessing the technological, economic, ecological and social perspectives of these new energy options (so-called GAVE-options). Based on the results of these assessments the Dutch ministries of Housing, Planning and Environment (VROM) and Economic Affairs (EZ) will decide at the end of 1999 about starting demonstration projects of promising energy carriers
Energy Technology Data Exchange (ETDEWEB)
Weterings, R. [ed.] [TNO Milieu, Energie en Procesinnovatie TNO-MEP, Apeldoorn (Netherlands)]
1999-05-01
The Dutch government is interested in the possibilities for a market introduction of new gaseous and liquid energy carriers. To this purpose the GAVE-programme was recently set up. This study is carried out within the framework of the GAVE-programme and aims at the selection of methodologies for assessing the technological, economic, ecological and social perspectives of these new energy options (so-called GAVE-options). Based on the results of these assessments the Dutch ministries of Housing, Planning and Environment (VROM) and Economic Affairs (EZ) will decide at the end of 1999 about starting demonstration projects of promising energy carriers.
Posterior consistency for Bayesian inverse problems through stability and regression results
International Nuclear Information System (INIS)
Vollmer, Sebastian J
2013-01-01
In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)
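The contraction behaviour described above can be seen in the simplest conjugate setting. The sketch below is a toy finite-dimensional analogue (not the paper's infinite-dimensional setup): we observe y = u + noise under a Gaussian prior on u, and as the noise level diminishes the posterior variance shrinks and the posterior mean concentrates at the observed truth, mirroring posterior consistency.

```python
def gaussian_posterior(y, noise_var, prior_mean=0.0, prior_var=1.0):
    """Posterior of u given a single observation y = u + e, e ~ N(0, noise_var),
    under the conjugate prior u ~ N(prior_mean, prior_var)."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)  # precisions add
    post_mean = post_var * (prior_mean / prior_var + y / noise_var)
    return post_mean, post_var

# As noise_var -> 0, the posterior concentrates around the data value y.
truth = 2.0
posteriors = [gaussian_posterior(truth, nv) for nv in (1.0, 1e-2, 1e-4)]
```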
New Results on a Stochastic Duel Game with Each Force Consisting of Heterogeneous Units
2013-02-01
The study of duel models dates back to the 1910s, when Lanchester (1916) proposed differential equations that govern the strength of each force through time ... which gave rise to what later became known as Lanchester models. A stream of works extended the Lanchester models—which are deterministic in nature ... Kress, M. and Talmor, I. (1999). A new look at the 3:1 rule of combat through Markov stochastic Lanchester models. The Journal of the Operational ...
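The deterministic Lanchester equations referenced above can be sketched numerically. The snippet below is an illustration of the classical square law only, not the report's stochastic duel model: it integrates dR/dt = -a·B, dB/dt = -b·R with a forward-Euler scheme until one force is annihilated. The force sizes and attrition coefficients are hypothetical.

```python
def lanchester_square(r0, b0, a, b, dt=1e-3):
    """Forward-Euler integration of the Lanchester square law:
    dR/dt = -a*B (attrition of R by B), dB/dt = -b*R, until one force hits zero."""
    R, B = float(r0), float(b0)
    while R > 0.0 and B > 0.0:
        R, B = R - a * B * dt, B - b * R * dt  # simultaneous update
    return max(R, 0.0), max(B, 0.0)

# The square-law invariant b*R^2 - a*B^2 predicts the survivor strength:
# with r0=300, b0=100, a=b=0.01, the larger force should win with
# about sqrt(300**2 - 100**2) units remaining.
survivors = lanchester_square(300, 100, 0.01, 0.01)
```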
International Nuclear Information System (INIS)
Schrader, Heinrich
2000-01-01
Calibration in terms of activity of the ionization-chamber secondary standard measuring systems at the PTB is described. The measurement results of a Centronic IG12/A20, a Vinten ISOCAL IV and a radionuclide calibrator chamber for nuclear medicine applications are discussed, their energy-dependent efficiency curves established and the consistency checked using recently evaluated radionuclide decay data. Criteria for evaluating and transferring calibration factors (or efficiencies) are given
First results of GERDA Phase II and consistency with background models
Agostini, M.; Allardt, M.; Bakalyarov, A. M.; Balata, M.; Barabanov, I.; Baudis, L.; Bauer, C.; Bellotti, E.; Belogurov, S.; Belyaev, S. T.; Benato, G.; Bettini, A.; Bezrukov, L.; Bode1, T.; Borowicz, D.; Brudanin, V.; Brugnera, R.; Caldwell, A.; Cattadori, C.; Chernogorov, A.; D'Andrea, V.; Demidova, E. V.; Di Marco, N.; Domula, A.; Doroshkevich, E.; Egorov, V.; Falkenstein, R.; Frodyma, N.; Gangapshev, A.; Garfagnini, A.; Gooch, C.; Grabmayr, P.; Gurentsov, V.; Gusev, K.; Hakenmüller, J.; Hegai, A.; Heisel, M.; Hemmer, S.; Hofmann, W.; Hult, M.; Inzhechik, L. V.; Janicskó Csáthy, J.; Jochum, J.; Junker, M.; Kazalov, V.; Kihm, T.; Kirpichnikov, I. V.; Kirsch, A.; Kish, A.; Klimenko, A.; Kneißl, R.; Knöpfle, K. T.; Kochetov, O.; Kornoukhov, V. N.; Kuzminov, V. V.; Laubenstein, M.; Lazzaro, A.; Lebedev, V. I.; Lehnert, B.; Liao, H. Y.; Lindner, M.; Lippi, I.; Lubashevskiy, A.; Lubsandorzhiev, B.; Lutter, G.; Macolino, C.; Majorovits, B.; Maneschg, W.; Medinaceli, E.; Miloradovic, M.; Mingazheva, R.; Misiaszek, M.; Moseev, P.; Nemchenok, I.; Palioselitis, D.; Panas, K.; Pandola, L.; Pelczar, K.; Pullia, A.; Riboldi, S.; Rumyantseva, N.; Sada, C.; Salamida, F.; Salathe, M.; Schmitt, C.; Schneider, B.; Schönert, S.; Schreiner, J.; Schulz, O.; Schütz, A.-K.; Schwingenheuer, B.; Selivanenko, O.; Shevzik, E.; Shirchenko, M.; Simgen, H.; Smolnikov, A.; Stanco, L.; Vanhoefer, L.; Vasenko, A. A.; Veresnikova, A.; von Sturm, K.; Wagner, V.; Wegmann, A.; Wester, T.; Wiesinger, C.; Wojcik, M.; Yanovich, E.; Zhitnikov, I.; Zhukov, S. V.; Zinatulina, D.; Zuber, K.; Zuzel, G.
2017-01-01
GERDA (GERmanium Detector Array) is an experiment searching for neutrinoless double beta decay (0νββ) of 76Ge, located at the Laboratori Nazionali del Gran Sasso of INFN (Italy). GERDA operates bare high-purity germanium detectors submersed in liquid argon (LAr). Phase II of data taking started in December 2015 and is currently ongoing. In Phase II, 35 kg of germanium detectors enriched in 76Ge, including thirty newly produced Broad Energy Germanium (BEGe) detectors, are operating to reach an exposure of 100 kg·yr within about 3 years of data taking. The design goal of Phase II is to reduce the background by one order of magnitude, reaching a sensitivity of T_{1/2}^{0ν} = O(10^26) yr. To achieve the necessary background reduction, the setup was complemented with a LAr veto. Analysis of the Phase II background spectrum demonstrates consistency with the background models. Furthermore, the 226Ra and 232Th contamination levels are consistent with screening results. In the first Phase II data release we found no hint of a 0νββ decay signal and place a limit on this process of T_{1/2}^{0ν} > 5.3 × 10^25 yr (90% C.L., sensitivity 4.0 × 10^25 yr). First results of GERDA Phase II are presented.
Directory of Open Access Journals (Sweden)
Yamashiro T
2015-02-01
Tsuneo Yamashiro,1 Tetsuhiro Miyara,1 Osamu Honda,2 Noriyuki Tomiyama,2 Yoshiharu Ohno,3 Satoshi Noma,4 Sadayuki Murayama1 On behalf of the ACTIve Study Group 1Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, Nishihara, Okinawa, Japan; 2Department of Radiology, Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 3Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Hyogo, Japan; 4Department of Radiology, Tenri Hospital, Tenri, Nara, Japan Purpose: To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods: Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered back-projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th-percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results: Overall, scan series without AIDR3D had higher LAA% and lower 15th-percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not
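The two emphysema indices used in this study can be illustrated with a short sketch. The Python function below is illustrative only (commercial workstations implement their own conventions): it computes LAA% as the percentage of lung voxels below -950 HU and the 15th percentile of the voxel HU distribution, here using a simple nearest-rank rule; the sample HU values are hypothetical.

```python
def emphysema_indices(lung_hu, threshold=-950.0, pct=15):
    """Return (LAA%, pct-th percentile) for a flat list of lung-voxel HU values."""
    voxels = sorted(lung_hu)
    n = len(voxels)
    laa_percent = 100.0 * sum(1 for v in voxels if v < threshold) / n
    # Nearest-rank percentile; real software may interpolate instead.
    k = min(n - 1, int(round(pct / 100.0 * (n - 1))))
    return laa_percent, voxels[k]

# Toy example: 2 of 6 voxels fall below -950 HU.
laa, p15 = emphysema_indices([-980, -960, -940, -900, -700, -500])
```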
Energy Technology Data Exchange (ETDEWEB)
Bosma, W.J.P. [Arthur D. Little International, Rotterdam (Netherlands)]
1999-12-01
The final report (part 2) contains the detailed findings of the analysis, evaluation, and integration of Novem GAVE options and aims at the reader who is interested in the detailed findings, as well as an overview of the results. For readers who are mainly interested in the high-level results, and are comfortable with Dutch, there is a short text summary of our results, entitled 'Analyse en evaluatie van GAVE-ketens, management summary' (part 1). These appendices are for readers who are interested in the underlying data and detailed assumptions. 70 refs.
Energy Technology Data Exchange (ETDEWEB)
Bosma, W.J.P. [Arthur D. Little International, Rotterdam (Netherlands)]
1999-12-01
This report contains the detailed findings of the analysis, evaluation, and integration of Novem GAVE options. This main report is meant for the reader who is interested in the detailed findings, as well as an overview of the results. For readers who are mainly interested in the high-level results, and are comfortable with Dutch, there is a short text summary of our results, entitled 'Analyse en evaluatie van GAVE-ketens, management summary' (part 1). Readers who are interested in the underlying data and detailed assumptions are encouraged to consult the appendix to this report, entitled 'Analyse en evaluatie van GAVE-ketens, appendices' (part 3)
Consistent deformations of dual formulations of linearized gravity: A no-go result
International Nuclear Information System (INIS)
Bekaert, Xavier; Boulanger, Nicolas; Henneaux, Marc
2003-01-01
The consistent, local, smooth deformations of the dual formulation of linearized gravity involving a tensor field in the exotic representation of the Lorentz group with Young symmetry type (D-3,1) (one column of length D-3 and one column of length 1) are systematically investigated. The rigidity of the Abelian gauge algebra is first established. We next prove a no-go theorem for interactions involving at most two derivatives of the fields
Directory of Open Access Journals (Sweden)
Kazuharu Bamba
2014-10-01
We reconstruct scalar field theories to realize inflation compatible with the BICEP2 result as well as the Planck data. In particular, we examine the chaotic inflation model, the natural (or axion) inflation model, and an inflationary model with a hyperbolic inflaton potential. We take an explicit approach to finding a scalar field model of inflation in which any observations can be explained in principle.
National Research Council Canada - National Science Library
Tompkins, Christina P; Fath, Denise M; Hamilton, Tracey A; Kan, Robert K
2006-01-01
The present study was conducted to optimize the operating procedure for the EZ-Retriever microwave oven to produce consistent and reproducible staining results with microtubule-associated protein 2 (MAP-2...
Energy Technology Data Exchange (ETDEWEB)
Van den Heuvel, E.J.M.T.
1999-12-01
This main report contains a summary of the detailed findings of the analysis, evaluation, and integration of Novem GAVE options. The details are presented in the final report (part 2) in the form of copies of overhead sheets. That report aims at the reader who is interested in the detailed findings, as well as an overview of the results. For readers who are mainly interested in the high-level results, and are comfortable with Dutch, there is this short text summary of our results. Readers who are interested in the underlying data and detailed assumptions are encouraged to consult the appendix to this report, entitled 'Analyse en evaluatie van GAVE-ketens, appendices' (part 3)
Triassic marine reptiles gave birth to live young.
Cheng, Yen-Nien; Wu, Xiao-Chun; Ji, Qiang
2004-11-18
Sauropterygians form the largest and most diverse group of ancient marine reptiles that lived throughout nearly the entire Mesozoic era (from 250 to 65 million years ago). Although thousands of specimens of this group have been collected around the world since the description of the first plesiosaur in 1821 (ref. 3), no direct evidence has been found to determine whether any sauropterygians came on shore to lay eggs (oviparity) like sea turtles, or gave birth in the water to live young (viviparity) as ichthyosaurs and mosasauroids (marine lizards) did. Viviparity has been proposed for plesiosaur, pachypleurosaur and nothosaur sauropterygians, but until now no concrete evidence has been advanced. Here we report two gravid specimens of Keichousaurus hui Young from the Middle Triassic of China. These exquisitely preserved specimens not only provide the first unequivocal evidence of reproductive mode and sexual dimorphism in sauropterygians, but also indicate that viviparity could have been expedited by the evolution of a movable pelvis in pachypleurosaurs. By extension, this has implications for the reproductive pattern of other sauropterygians and Mesozoic marine reptiles that possessed a movable pelvis.
Energy Technology Data Exchange (ETDEWEB)
Diepenmaat, H.B. [Actors Procesmanagement, Zeist (Netherlands)
2000-02-01
The focus is on GAVE as an integral multi-actor process, as a collaboration process in which the parties have to find each other. The reason for this multi-actor approach is that the parties themselves have to realise the importance of sustainable energy. A multi-actor approach uses this crucial fact as the starting point. By means of the use of specific multi-actor methods and insights, it is possible during the collaboration process to develop a clear idea of the content of the joint path and the intended collaborative future, while paying specific attention to the individual roles of the players. Based on this, a well-considered and promising realisation path can be followed. The objective of this study is to illustrate - on the basis of two GAVE options - how the multi-actor approach can support the GAVE programme. In doing this, emphasis is placed on examples, the demonstration of added value of such an approach, and making recommendations for the future. Two experiments have been carried out on the basis of the Trinity approach. The Trinity approach is a set of methods specifically developed for supporting change processes which involve many parties (i.e. multi-actor processes). Trinity helps those involved to obtain a clear picture of both the route to be followed and the future situation to be aimed at. The term 'picture' must be understood literally: an important aspect of Trinity is the collaboration and communication with multi-actor models (also referred to as actorprints). By means of a diagram technique (figures and arrows) these models demonstrate the cohesion between the roles and activities of the various parties involved. The modelling process is instrumental in this; the actual result is that the image that is developed is formed and backed up by the participants in the process. Within these experiments, we have, so far, worked mainly in the shadow of the current GAVE process. A forceful participative use of the actor approach is
International Nuclear Information System (INIS)
Buck, John W.; McDonald, John P.; Taira, Randal Y.
2002-01-01
To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders
International Nuclear Information System (INIS)
Chirico, Robert D.; Kazakov, Andrei F.
2015-01-01
Highlights: • Heat capacities were measured for the temperature range (5 to 520) K. • The enthalpy of combustion was measured and the enthalpy of formation was derived. • Thermodynamic-consistency analysis resolved inconsistencies in literature enthalpies of sublimation. • An inconsistency in literature enthalpies of combustion was resolved. • Application of computational chemistry in consistency analysis was demonstrated successfully. - Abstract: Heat capacities and phase-transition properties for xanthone (IUPAC name 9H-xanthen-9-one and Chemical Abstracts registry number [90-47-1]) are reported for the temperature range 5 < T/K < 524. Statistical calculations were performed and thermodynamic properties for the ideal gas were derived based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. These results are combined with sublimation pressures from the literature to allow critical evaluation of inconsistent enthalpies of sublimation for xanthone, also reported in the literature. Literature values for the enthalpy of combustion of xanthone are re-assessed, a revision is recommended for one result, and a new value for the enthalpy of formation of the ideal gas is derived. Comparisons with thermophysical properties reported in the literature are made for all other reported and derived properties, where possible
Gamayunov, K. V.; Khazanov, G. V.; Liemohn, M. W.; Fok, M.-C.; Ridley, A. J.
2009-01-01
Further development of our self-consistent model of interacting ring current (RC) ions and electromagnetic ion cyclotron (EMIC) waves is presented. This model incorporates large scale magnetosphere-ionosphere coupling and treats self-consistently not only EMIC waves and RC ions, but also the magnetospheric electric field, RC, and plasmasphere. Initial simulations indicate that the region beyond geostationary orbit should be included in the simulation of the magnetosphere-ionosphere coupling. Additionally, a self-consistent description, based on first principles, of the ionospheric conductance is required. These initial simulations further show that in order to model the EMIC wave distribution and wave spectral properties accurately, the plasmasphere should also be simulated self-consistently, since its fine structure requires as much care as that of the RC. Finally, an effect of the finite time needed to reestablish a new potential pattern throughout the ionosphere and to communicate between the ionosphere and the equatorial magnetosphere cannot be ignored.
International Nuclear Information System (INIS)
Schoenherr, Marek
2011-01-01
With the constantly increasing precision of experimental data acquired at the current collider experiments Tevatron and LHC, the theoretical uncertainty on the prediction of multiparticle final states has to decrease accordingly in order to have meaningful tests of the underlying theories such as the Standard Model. A pure leading-order calculation, defined in the perturbative expansion of said theory in the interaction constant, represents the classical limit to such a quantum field theory and was already found to be insufficient at past collider experiments, e.g. LEP or HERA. Such a leading-order calculation can be systematically improved in various limits. If the typical scales of a process are large and the respective coupling constants are small, the inclusion of fixed-order higher-order corrections then yields quickly converging predictions with much reduced uncertainties. In certain regions of the phase space, still well within the perturbative regime of the underlying theory, a clear hierarchy of the inherent scales, however, leads to large logarithms occurring at every order in perturbation theory. In many cases these logarithms are universal and can be resummed to all orders, leading to precise predictions in these limits. Multiparticle final states now exhibit both small and large scales, necessitating a description using both resummed and fixed-order results. This thesis presents the consistent combination of two such resummation schemes with fixed-order results. The main objective is therefore to identify and properly treat terms that are present in both formulations in a process- and observable-independent manner. In the first part the resummation scheme introduced by Yennie, Frautschi and Suura (YFS), resumming large logarithms associated with the emission of soft photons in massive QED, is combined with fixed-order next-to-leading-order matrix elements. The implementation of a universal algorithm is detailed and results are studied for various precision
van der Wielen, Nele; Falkingham, Jane; Channon, Andrew Amos
2018-04-23
Ghana is currently undergoing a profound demographic transition, with large increases in the number of older adults in the population. Older adults require greater levels of healthcare as illness and disability increase with age. Ghana therefore provides an important and timely case study of policy implementation aimed at improving equal access to healthcare in the context of population ageing. This paper examines the determinants of National Health Insurance (NHIS) enrolment in Ghana, using two different surveys and distinguishing between younger and older adults. Two surveys are used in order to investigate consistency in insurance enrolment. The comparison between age groups is aimed at understanding whether determinants differ for older adults. Previous studies have mainly focused on the enrolment of young and middle aged adults; thus by widening the focus to include older adults and taking into account differences in their demographic and socio-economic characteristics this paper provides a unique contribution to the literature. Using data from the 2007-2008 Study on Global Ageing and Adult Health (SAGE) and the 2012-2013 Ghanaian Living Standards Survey (GLSS) the determinants of NHIS enrolment among younger adults (aged 18-49) and older adults (aged 50 and over) are compared. Logistic regression explores the socio-economic and demographic determinants of NHIS enrolment and multinomial logistic regression investigates the correlates of insurance drop out. Similar results for people aged 18-49 and people aged 50 plus were revealed, with older adults having a slightly lower probability of dropping out of insurance coverage compared to younger adults. Both surveys confirm that education and wealth increase the likelihood of NHIS affiliation. Further, residential differences in insurance coverage are found, with greater NHIS coverage in urban areas. The findings give assurance that both datasets (SAGE and GLSS) are suitable for research on insurance affiliation
Christensen, J. H.; Larsen, M. A. D.; Christensen, O. B.; Drews, M.
2017-12-01
For more than 20 years, coordinated efforts to apply regional climate models to downscale GCM simulations for Europe have been pursued by an ever increasing group of scientists. This endeavor showed its first results during EU framework supported projects such as RACCS and MERCURE. Here, the foundation for today's advanced worldwide CORDEX approach was laid out by a core of six research teams, who conducted some of the first coordinated RCM simulations with the aim to assess regional climate change for Europe. However, it was realized at this stage that model bias in GCMs as well as RCMs made this task very challenging. As an immediate outcome, the idea was conceived to make an even more coordinated effort by constructing a well-defined and structured set of common simulations; this led to the PRUDENCE project (2001-2004). Additional coordinated efforts involving ever increasing numbers of GCMs and RCMs followed in ENSEMBLES (2004-2009) and the ongoing Euro-CORDEX (officially commenced 2011) efforts. Along with the overall coordination, simulations have increased their standard resolution from 50 km (PRUDENCE) to about 12 km (Euro-CORDEX), moved from time-slice simulations (PRUDENCE) to transient experiments (ENSEMBLES and CORDEX), and from one driving model and emission scenario (PRUDENCE) to several (Euro-CORDEX). So far, this wealth of simulations has been used to assess the potential impacts of future climate change in Europe, providing a baseline change as defined by a multi-model mean change with associated uncertainties calculated from model spread in the ensemble. But how has the overall picture of state-of-the-art regional climate change projections changed over this period of almost two decades? Here we compare across scenarios, model resolutions and model vintage the results from PRUDENCE, ENSEMBLES and Euro-CORDEX. By appropriate scaling we identify robust findings about the projected future of European climate expressed by temperature and precipitation changes
Groenendijk, Peter; van der Sleen, Peter; Vlam, Mart; Bunyavejchewin, Sarayudh; Bongers, Frans; Zuidema, Pieter A
2015-10-01
The important role of tropical forests in the global carbon cycle makes it imperative to assess changes in their carbon dynamics for accurate projections of future climate-vegetation feedbacks. Forest monitoring studies conducted over the past decades have found evidence for both increasing and decreasing growth rates of tropical forest trees. The limited duration of these studies restricted analyses to decadal scales, and it is still unclear whether growth changes occurred over longer time scales, as would be expected if CO2 fertilization stimulated tree growth. Furthermore, studies have so far dealt with changes in biomass gain at the forest-stand level, but insights into species-specific growth changes - which ultimately determine community-level responses - are lacking. Here, we analyse species-specific growth changes on a centennial scale, using growth data from tree-ring analysis for 13 tree species (~1300 trees) from three sites distributed across the tropics. We used an established growth-trend detection method (regional curve standardization) and a new one (size-class isolation), and explicitly assessed the influence of biases on the trend detection. In addition, we assessed whether aggregated trends were present within and across study sites. We found evidence for decreasing growth rates over time for 8-10 species, whereas increases were noted for two species and one showed no trend. Additionally, we found evidence for weak aggregated growth decreases at the site in Thailand and when analysing all sites simultaneously. The observed growth reductions suggest deteriorating growth conditions, perhaps due to warming. However, other causes cannot be excluded, such as recovery from large-scale disturbances or changing forest dynamics. Our findings contrast with the growth patterns that would be expected if elevated CO2 stimulated tree growth. These results suggest that commonly assumed growth increases of tropical forests may not occur, which could lead to erroneous
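The regional curve standardization (RCS) step described above can be sketched numerically; this is a minimal illustration on synthetic data, with function and variable names of our own invention rather than the authors' code:

```python
import numpy as np

def rcs_indices(ring_widths, cambial_ages):
    """Regional curve standardization, schematically: the mean ring width
    at each cambial age across all trees gives the 'regional curve' of
    expected age-related growth; dividing each measurement by that
    expectation leaves age-independent growth indices whose trend over
    calendar time can then be tested."""
    ages = np.asarray(cambial_ages)
    widths = np.asarray(ring_widths, dtype=float)
    unique_ages = np.unique(ages)
    regional_curve = np.array([widths[ages == a].mean() for a in unique_ages])
    expected = regional_curve[np.searchsorted(unique_ages, ages)]
    return widths / expected
```

With two synthetic trees measured at cambial ages 0-2, a tree growing faster than the regional mean yields indices above 1 at every age, and a slower tree indices below 1.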
Energy Technology Data Exchange (ETDEWEB)
Xu, Bin [School of Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Research Center of Applied Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Lin, Boqiang, E-mail: bqlin@xmu.edu.cn [Collaborative Innovation Center for Energy Economics and Energy Policy, China Institute for Studies in Energy Policy, Xiamen University, Xiamen, Fujian 361005 (China)
2017-03-15
China is currently the world's largest carbon dioxide (CO₂) emitter. Moreover, total energy consumption and CO₂ emissions in China will continue to increase due to the rapid growth of industrialization and urbanization. Vigorously developing the high-tech industry therefore becomes an inevitable choice for reducing CO₂ emissions now and in the future. However, ignoring the existing nonlinear links between economic variables, most scholars use traditional linear models to explore the impact of the high-tech industry on CO₂ emissions from an aggregate perspective. Few studies have focused on nonlinear relationships and regional differences in China. Based on panel data for 1998-2014, this study uses a nonparametric additive regression model to explore the nonlinear effect of the high-tech industry from a regional perspective. The estimated results show that the residual sums of squares (SSR) of the nonparametric additive regression model in the eastern, central and western regions are 0.693, 0.054 and 0.085, respectively, which are much smaller than those of the traditional linear regression model (3.158, 4.227 and 7.196). This verifies that the nonparametric additive regression model has a better fit. Specifically, the high-tech industry produces an inverted U-shaped nonlinear impact on CO₂ emissions in the eastern region, but a positive U-shaped nonlinear effect in the central and western regions. Therefore, the nonlinear impact of the high-tech industry on CO₂ emissions in the three regions should be given adequate attention in developing effective abatement policies. - Highlights: • The nonlinear effect of the high-tech industry on CO₂ emissions was investigated. • The high-tech industry yields an inverted U-shaped effect in the eastern region. • The high-tech industry has a positive U-shaped nonlinear effect in other regions. • The linear impact
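The model comparison at the heart of this abstract, residual sum of squares (SSR) of a linear fit versus a nonparametric smoother, can be sketched on synthetic data; the kernel smoother below is a generic stand-in for one component of a nonparametric additive model, not the authors' estimator or data:

```python
import numpy as np

def ssr_linear(x, y):
    # least-squares line; the residual sum of squares measures lack of fit
    a, b = np.polyfit(x, y, 1)
    return float(np.sum((y - (a * x + b)) ** 2))

def ssr_kernel(x, y, bandwidth=0.3):
    # Nadaraya-Watson kernel smoother: a generic nonparametric regressor
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    fitted = (w * y[None, :]).sum(axis=1) / w.sum(axis=1)
    return float(np.sum((y - fitted) ** 2))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 3.0, 200)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(200)  # inverted-U-like nonlinearity
```

On a genuinely nonlinear relation such as this one, the kernel SSR comes out far below the linear SSR, mirroring the paper's comparison of 0.693 vs. 3.158 (eastern region).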
DEFF Research Database (Denmark)
Staunstrup, Jørgen
1998-01-01
This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
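The "syntactical check of parameter types" mentioned above can be illustrated in Python with `inspect`; the interface and implementations here are hypothetical examples, not the paper's formalism:

```python
import inspect

def signatures_consistent(expected, implementation):
    """Syntactic interface check: the implementation must expose the same
    parameter names and type annotations as the agreed interface."""
    e = inspect.signature(expected)
    i = inspect.signature(implementation)
    return (list(e.parameters) == list(i.parameters)
            and [p.annotation for p in e.parameters.values()]
            == [p.annotation for p in i.parameters.values()])

# Hypothetical agreed interface and two separately developed components:
def read_sensor(channel: int, timeout: float) -> bytes: ...
def good_impl(channel: int, timeout: float) -> bytes: return b""
def bad_impl(channel: str) -> bytes: return b""
```

Here `good_impl` passes the check while `bad_impl` is rejected; a semantic check, as the abstract notes, would go further than matching names and types.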
International Nuclear Information System (INIS)
von Barth, U.; Holm, B.
1996-01-01
With the aim of properly understanding the basis for and the utility of many-body perturbation theory as applied to extended metallic systems, we have calculated the electronic self-energy of the homogeneous electron gas within the GW approximation. The calculation has been carried out in a self-consistent way; i.e., the one-electron Green function obtained from Dyson's equation is the same as that used to calculate the self-energy. The self-consistency is restricted in the sense that the screened interaction W is kept fixed and equal to that of the random-phase approximation for the gas. We have found that the final results are marginally affected by the broadening of the quasiparticles, and that their self-consistent energies are still close to their free-electron counterparts, as they are in non-self-consistent calculations. The reduction in strength of the quasiparticles and the development of satellite structure (plasmons) gives, however, a markedly smaller dynamical self-energy, leading to, e.g., a smaller reduction in the quasiparticle strength as compared with non-self-consistent results. The relatively bad description of plasmon structure within the non-self-consistent GW approximation is marginally improved. A first attempt at including W in the self-consistency cycle leads to an even broader, structureless satellite spectrum, in disagreement with experiment. © 1996 The American Physical Society.
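The self-consistency requirement, that the Green function entering the self-energy equals the one produced by Dyson's equation, can be illustrated with a toy scalar fixed-point iteration; the form of the self-energy and the coupling constant are invented for illustration and bear no quantitative relation to the electron-gas GW equations:

```python
def self_consistent_g(g0, coupling, tol=1e-12, max_iter=1000):
    """Iterate G = G0 + G0 * Sigma(G) * G with a toy Sigma(G) = coupling * G
    until the G fed into Sigma equals the G coming out of Dyson's equation,
    i.e. until the cycle is self-consistent."""
    g = g0
    for _ in range(max_iter):
        sigma = coupling * g           # toy self-energy built from current G
        g_new = g0 + g0 * sigma * g    # Dyson's equation
        if abs(g_new - g) < tol:
            return g_new
        g = g_new
    raise RuntimeError("no self-consistent solution found")
```

For weak coupling the map is a contraction and the iteration converges to the fixed point; a non-self-consistent ("one-shot") calculation would instead stop after the first pass with Sigma built from G0.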
Postpartum Care Services and Birth Features of The Women Who Gave Birth in Burdur in 2009
Directory of Open Access Journals (Sweden)
Binali Catak
2011-10-01
AIM: This study aimed to evaluate the postpartum care services and delivery characteristics of women who gave birth in Burdur in 2009. MATERIAL AND METHODS: The study uses the "Birth and Postpartum Care" data of the survey "Birth, Postpartum Care Services, and Nutritional Status of Children of Women Giving Birth in Burdur in 2009". The population of this cross-sectional study comprised women who gave birth in Burdur in 2009. To determine the population, a list of women who gave birth in 2009 was requested from family physicians; the reported number of women was 2318. The sample size representing the population was calculated as 1179. The data were collected in face-to-face interviews and analyzed with the SPSS package program. RESULTS: The mean age of the women was 27.1 (± 5.5), with an average household size of 4.3 (± 1.2). 22.1% of the women live in extended families and 64.4% live in villages. 8.0% of the women were related to their husbands, 52.8% had arranged marriages and 1.3% had no official marriage. One in every four women is a housewife, 1.8% have no formal education, 76.4% have no social insurance and 7.1% have no health insurance. The average number of pregnancies was 2.1 (± 1.2) and the average number of children 1.8 (± 0.8). The rates of spontaneous abortion, induced abortion, stillbirth and death of children under 5 years of age were 16.4%, 6.6%, 2.7% and 3.4%, respectively. 99.8% of the women gave birth in hospital, 67.3% had medical supervision and 62.8% had cesarean births. The average hospital stay after birth was 1.9 (± 3.1) days. 4.8% of the women did not receive postpartum care (DSB) after being discharged from the hospital. Of the women who received DSB services, 2.2% received the service at home from a family physician / family health staff and 33.9% from an obstetrician in practice. 92.2% of the women 1 time, 15
Childbirth care: the oral history of women who gave birth from the 1940s to 1980s
Leister, Nathalie; Riesco, Maria Luiza Gonzalez
2013-01-01
This study's objective was to gain a greater understanding of the changes that took place in the childbirth care model from the experience of women who gave birth in the State of Sao Paulo, Brazil from the 1940s to the 1980s. This is a descriptive study conducted with 20 women using the Thematic Oral History method. Data were collected through unstructured interviews. The theme extracted from the interviews was "The experience of childbirth". The results indicate a time and generational demar...
Energy Technology Data Exchange (ETDEWEB)
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Riley, W. T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Best, D. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2015-09-03
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the January, March, and April 2015 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.
Energy Technology Data Exchange (ETDEWEB)
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Best, D. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2015-07-07
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the August and October 2014 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.
Image recognition and consistency of response
Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.
2012-02-01
Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we evaluated only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response for images they did not recognize than for those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
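Fisher's exact test used above operates on a 2×2 table (recognized vs. not recognized × consistent vs. inconsistent response); a stdlib-only sketch of the two-sided test, with made-up counts, is:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p(x):  # P(first cell = x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)
```

For the table [[3, 1], [1, 3]] (e.g. 3 of 4 recognized images answered consistently vs. 1 of 4 unrecognized) the two-sided p-value is 34/70 ≈ 0.486, i.e. no evidence of association, consistent in spirit with the null result reported above.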
Ishigaki, Masafumi; Kawamata, Ryota; Ouchi, Masami; Oguri, Masamune; Shimasaku, Kazuhiro; Ono, Yoshiaki
2018-02-01
We present UV luminosity functions of dropout galaxies at z ∼ 6-10 with the complete Hubble Frontier Fields data. We obtain a catalog of ∼450 dropout-galaxy candidates (350, 66, and 40 at z ∼ 6-7, 8, and 9, respectively), with UV absolute magnitudes that reach ∼ −14 mag, ∼2 mag deeper than the Hubble Ultra Deep Field detection limits. We carefully evaluate number densities of the dropout galaxies by Monte Carlo simulations, including all lensing effects such as magnification, distortion, and multiplication of images as well as detection completeness and contamination effects in a self-consistent manner. We find that UV luminosity functions at z ∼ 6-8 have steep faint-end slopes, α ∼ −2, and likely steeper slopes, α ≲ −2, at z ∼ 9-10. We also find that the evolution of UV luminosity densities shows a non-accelerated decline beyond z ∼ 8 in the case of M_trunc = −15, but an accelerated one in the case of M_trunc = −17. We examine whether our results are consistent with the Thomson scattering optical depth from the Planck satellite and the ionized hydrogen fraction Q_HII at z ≲ 7 based on the standard analytic reionization model. We find that reionization scenarios exist that consistently explain all of the observational measurements with the allowed parameters f_esc = 0.17 (+0.07/−0.03) and M_trunc > −14.0 for log(ξ_ion / [erg⁻¹ Hz]) = 25.34, where f_esc is the escape fraction, M_trunc is the faint limit of the UV luminosity function, and ξ_ion is the conversion factor of the UV luminosity to the ionizing photon emission rate. The length of the reionization period is estimated to be Δz = 3.9 (+2.0/−1.6) (for 0.1 < Q_HII < 0.99), consistent with the recent estimate from Planck.
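The sensitivity of the UV luminosity density to the faint-end slope α and the truncation magnitude M_trunc can be sketched by integrating a Schechter-type luminosity function; the parameter values below are illustrative, not the paper's fits:

```python
import math

def luminosity_density(alpha, m_trunc, m_star=-20.9, phi_star=1e-3, n=20000):
    """Integrate L * phi(L) for a Schechter function phi(L) dL =
    phi_star * (L/L*)^alpha * exp(-L/L*) d(L/L*), from the luminosity
    corresponding to the truncation magnitude M_trunc up to 10 L*
    (trapezoid rule). Units are schematic: L in units of L*."""
    l_min = 10 ** (-0.4 * (m_trunc - m_star))  # faint limit in units of L*
    xs = [l_min + (10.0 - l_min) * i / n for i in range(n + 1)]
    ys = [phi_star * x ** (alpha + 1) * math.exp(-x) for x in xs]
    h = (10.0 - l_min) / n
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
```

With α near −2 the integrand behaves like 1/L at the faint end, so pushing M_trunc fainter (−17 to −15) keeps adding luminosity density; this is why the choice of M_trunc changes the inferred evolution beyond z ∼ 8.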
A Case of Resistant SUNCT that Gave Response to Steroid Treatment
Directory of Open Access Journals (Sweden)
Hakan Levent Gül
2014-03-01
Short-lasting unilateral neuralgiform headache attacks with conjunctival injection and tearing are known as SUNCT. Although these headaches are seldom reported, the prevalence is possibly higher than known. It is important to recognize this uncommon disorder, since its management differs from that of the common primary headaches. To date, treatment with various types of drugs has been reported. We report here a 30-year-old male patient with normal neurological examination, blood tests and neuroimaging. Our patient gave no response to indomethacin, gabapentin and carbamazepine treatments. This case is an example of SUNCT treated with steroids.
International Nuclear Information System (INIS)
Johnson, S. G.; Adamic, M. L.; DiSanto, T.; Warren, A. R.; Cummings, D. G.; Foulkrod, L.; Goff, K. M.
1999-01-01
The ceramic waste form produced from the electrometallurgical treatment of sodium bonded spent fuel from the Experimental Breeder Reactor-II was tested using two immersion tests with separate and distinct purposes. The product consistency test is used to assess the consistency of the waste forms produced and thus is an indicator of a well-controlled process. The toxicity characteristic leaching procedure is used to determine whether a substance is to be considered hazardous by the Environmental Protection Agency. The proposed high level waste repository will not be licensed to receive hazardous waste; thus, any waste forms destined to be placed there cannot be of a hazardous nature as defined by the Resource Conservation and Recovery Act. Results are presented from the first four fully radioactive ceramic waste forms produced and from seven ceramic waste forms produced from cold surrogate materials. The fully radioactive waste forms weigh approximately 2 kg and were produced with salt used to treat 100 driver subassemblies of spent fuel.
Energy Technology Data Exchange (ETDEWEB)
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States)
2015-12-01
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for 14 simulated high level waste glasses fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions. The measured chemical composition data are reported and compared with the targeted values for each component of each glass. All of the measured sums of oxides for the study glasses fell within the interval of 96.9 to 100.8 wt %, indicating recovery of all components. Comparisons of the targeted and measured chemical compositions showed that the measured values for the glasses met the targeted concentrations within 10% for those components present at more than 5 wt %. The PCT results were normalized to both the targeted and measured compositions of the study glasses. Several of the glasses exhibited increases in normalized concentrations (NC_i) after the canister centerline cooled (CCC) heat treatment. Five of the glasses, after the CCC heat treatment, had NC_B values that exceeded that of the Environmental Assessment (EA) benchmark glass. These results can be combined with additional characterization, including X-ray diffraction, to determine the cause of the higher release rates.
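PCT releases are conventionally normalized by the mass fraction of the element in the glass, NC_i = c_i / f_i, which is what makes releases comparable across glasses of different composition; a minimal sketch with invented values:

```python
def normalized_concentration(leachate_g_per_l, mass_fraction):
    """NC_i (g/L) = elemental concentration measured in the PCT leachate
    (g/L) divided by the mass fraction of element i in the glass
    (dimensionless), per the usual PCT normalization convention."""
    if not 0.0 < mass_fraction <= 1.0:
        raise ValueError("mass fraction must lie in (0, 1]")
    return leachate_g_per_l / mass_fraction

# e.g. boron (hypothetical numbers): 0.02 g/L in leachate, glass 4 wt % boron
nc_b = normalized_concentration(0.02, 0.04)
```

Normalizing to the targeted versus the measured composition, as in the report, amounts to using two different values of `mass_fraction` for the same leachate concentration.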
International Nuclear Information System (INIS)
Nam, Boda; Hwang, Jung Hwa; Lee, Young Mok; Park, Jai Soung; Jou, Sung Shick; Kim, Young Bae
2015-01-01
We compared clinical and quantitative CT measurement parameters between chronic obstructive pulmonary disease (COPD) patients with and without consistent clinical symptoms and pulmonary function results. This study included 60 patients with a clinical diagnosis of COPD who underwent chest CT scans and pulmonary function tests. These 60 patients were classified into typical and atypical groups, which were further sub-classified into 4 groups based on their dyspnea score and the results of pulmonary function tests [typical 1: mild dyspnea and pulmonary function impairment (PFI); typical 2: severe dyspnea and PFI; atypical 1: mild dyspnea and severe PFI; atypical 2: severe dyspnea and mild PFI]. Quantitative measurements of the CT data for emphysema, bronchial wall thickness and air-trapping were performed using software analysis. Comparative statistical analysis was performed between the groups. The CT emphysema index correlated well with the results of the pulmonary function tests (typical 1 vs. atypical 1, p = 0.032), and the bronchial wall area ratio correlated with the dyspnea score (typical 1 vs. atypical 2, p = 0.033). The CT air-trapping index also correlated with the results of the pulmonary function tests (typical 1 vs. atypical 1, p = 0.012) and the dyspnea score (typical 1 vs. atypical 2, p < 0.001), and was found to be the most significant parameter between the typical and atypical groups. Quantitative CT measurements for emphysema and airways correlated well with the dyspnea score and pulmonary function results in patients with COPD. Air-trapping was the most significant parameter between the typical and atypical groups of COPD patients.
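A common definition of the quantitative CT emphysema index is the percentage of lung voxels below an attenuation threshold (often −950 HU); the sketch below assumes that definition on a synthetic array, not the study's specific software:

```python
import numpy as np

def emphysema_index(hu_values, threshold=-950):
    """Percentage of lung voxels with attenuation below the threshold;
    a standard CT densitometry definition of the emphysema index
    (low-attenuation area percentage)."""
    hu = np.asarray(hu_values)
    return 100.0 * np.count_nonzero(hu < threshold) / hu.size

# Synthetic Hounsfield-unit values for a handful of lung voxels:
lung = np.array([-980, -960, -940, -900, -870, -990])
```

On this toy array three of six voxels fall below −950 HU, giving an index of 50%; the air-trapping index is computed analogously on expiratory scans with a different threshold (commonly −856 HU).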
International Nuclear Information System (INIS)
Cook, J W S; Chapman, S C; Dendy, R O; Brady, C S
2011-01-01
We present particle-in-cell (PIC) simulations of minority energetic protons in deuterium plasmas, which demonstrate a collective instability responsible for emission near the lower hybrid frequency and its harmonics. The simulations capture the lower hybrid drift instability in a parameter regime motivated by tokamak fusion plasma conditions, and show further that the excited electromagnetic fields collectively and collisionlessly couple free energy from the protons to directed electron motion. This results in an asymmetric tail antiparallel to the magnetic field. We focus on obliquely propagating modes excited by energetic ions, whose ring-beam distribution is motivated by population inversions related to ion cyclotron emission, in a background plasma with a temperature similar to that of the core of a large tokamak plasma. A fully self-consistent electromagnetic relativistic PIC code representing all vector field quantities and particle velocities in three dimensions as functions of a single spatial dimension is used to model this situation, by evolving the initial antiparallel travelling ring-beam distribution of 3 MeV protons in a background 10 keV Maxwellian deuterium plasma with realistic ion-electron mass ratio. These simulations provide a proof-of-principle for a key plasma physics process that may be exploited in future alpha channelling scenarios for magnetically confined burning plasmas.
Didi, Jennifer; Lemée, Ludovic; Gibert, Laure; Pons, Jean-Louis; Pestel-Caron, Martine
2014-10-01
Staphylococcus lugdunensis is an emerging virulent coagulase-negative staphylococcus responsible for severe infections similar to those caused by Staphylococcus aureus. To understand its potentially pathogenic capacity and gain further detailed knowledge of the molecular traits of this organism, 93 isolates from various geographic origins were analyzed by multi-virulence-locus sequence typing (MVLST), targeting seven known or putative virulence-associated loci (atlLR2, atlLR3, hlb, isdJ, SLUG_09050, SLUG_16930, and vwbl). The polymorphisms of the putative virulence-associated loci were moderate and comparable to those of the housekeeping genes analyzed by multilocus sequence typing (MLST). However, the MVLST scheme generated 43 virulence types (VTs) compared to 20 sequence types (STs) based on MLST, indicating that MVLST was significantly more discriminating (Simpson's index [D], 0.943). No hypervirulent lineage or cluster specific to carriage strains was defined. The results of multilocus sequence analysis of known and putative virulence-associated loci are consistent with a clonal population structure for S. lugdunensis, suggesting a coevolution of these genes with housekeeping genes. Indeed, the ratio of nonsynonymous to synonymous substitutions (dN/dS), Tajima's D test, and single-likelihood ancestor counting (SLAC) analysis suggest that all virulence-associated loci were under negative selection, even atlLR2 (AtlL protein) and SLUG_16930 (FbpA homologue), for which the dN/dS ratios were higher. In addition, this analysis of virulence-associated loci allowed us to propose a trilocus sequence typing scheme based on the intragenic regions of atlLR3, isdJ, and SLUG_16930, which is more discriminant than MLST for studying short-term epidemiology and further characterizing the lineages of the rare but highly pathogenic S. lugdunensis. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
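The discriminatory power quoted above (Simpson's index D = 0.943) is computed from the distribution of isolates over types; a sketch with invented counts:

```python
def simpsons_index(type_counts):
    """Simpson's index of diversity / discriminatory power:
    D = 1 - sum n_j (n_j - 1) / (N (N - 1)),
    the probability that two isolates drawn at random without
    replacement belong to different types."""
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))
```

If every isolate has its own type, D = 1 (perfect discrimination); if all isolates share one type, D = 0. A scheme that splits the same isolates into more, smaller types (as MVLST does relative to MLST here) yields a higher D.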
Structural Consistency, Consistency, and Sequential Rationality.
Kreps, David M; Ramey, Garey
1987-01-01
Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...
Directory of Open Access Journals (Sweden)
Seiya Nishiyama
2009-01-01
The maximally-decoupled method has been considered as a theory that applies the basic idea of an integrability condition to certain multiple parametrized symmetries. The method is regarded as a mathematical tool to describe a symmetry of a collective submanifold in which a canonicity condition makes the collective variables an orthogonal coordinate system. For this aim we adopt a concept of curvature unfamiliar in the conventional time-dependent (TD) self-consistent field (SCF) theory. Our basic idea lies in the introduction of a sort of Lagrangian manner, familiar from fluid dynamics, to describe a collective coordinate system. This manner enables us to take a one-form which is linearly composed of a TD SCF Hamiltonian and infinitesimal generators induced by collective variable differentials of a canonical transformation on a group. The integrability condition of the system reads curvature C = 0. Our method is constructed so as to manifest the structure of the group under consideration. To go beyond the maximally-decoupled method, we have aimed to construct an SCF theory, i.e., a υ (external parameter)-dependent Hartree-Fock (HF) theory. Toward such an ultimate goal, the υ-HF theory has been reconstructed on an affine Kac-Moody algebra along the lines of soliton theory, using infinite-dimensional fermions. An infinite-dimensional fermion operator is introduced through a Laurent expansion of finite-dimensional fermion operators with respect to the degrees of freedom of the fermions related to a υ-dependent potential with a Υ-periodicity. A bilinear equation for the υ-HF theory has been transcribed onto the corresponding τ-function using the regular representation for the group and the Schur polynomials. The υ-HF SCF theory on an infinite-dimensional Fock space F∞ leads to a dynamics on an infinite-dimensional Grassmannian Gr∞ and may describe more precisely such a dynamics on the group manifold. A finite-dimensional Grassmannian is identified with a Gr
One of the many visiting theoreticians, R P Feynman, who gave lectures at CERN during the year
CERN PhotoLab
1970-01-01
Visiting CERN in January was R P Feynman, who has recently been working on strong interaction theory. On 8 January, he packed the lecture theatre, as usual, when he gave a talk on inelastic hadron collisions and is here caught in a typically graphic pose.
"It Gave Me My Life Back": An Evaluation of a Specialist Legal Domestic Abuse Service.
Lea, Susan J; Callaghan, Lynne
2016-05-01
Community-based advocacy services are important in enabling victims to escape domestic abuse and rebuild their lives. This study evaluated a domestic abuse service. Two phases of research were conducted following case-file analysis (n = 86): surveys (n = 22) and interviews (n = 12) with victims, and interviews with key individuals (n = 12) based in related statutory and community organizations. The findings revealed the holistic model of legal, practical, mental health-related, and advocacy components resulted in a range of benefits to victims and enhanced interagency partnership working. Core elements of a successful needs-led, victim-centered service could be distilled. © The Author(s) 2015.
Thyroid function in mothers who gave birth to neonates with transient congenital hypothyroidism
International Nuclear Information System (INIS)
Karam, G.A.; Hakimi, H.; Rezaeian, M.; Gafarzadeh, A.; Rashidinejad, H.; Khaksari, M.
2009-01-01
Objective: To determine the thyroid status of mothers of newborns with primary congenital hypothyroidism. Methodology: Thyroid function tests were carried out on 80 mothers of hypothyroid newborns and on 80 mothers of non-hypothyroid newborns as controls. Results: Comparison of mean values revealed that mothers of congenitally hypothyroid infants had lower triiodothyronine resin uptake (T3RU) concentrations than the control population. The higher value of the free thyroxine index (FTI) in the case group showed a tendency toward significance. In the proportional frequency distributions, T3RU and triiodothyronine (T3) showed significant differences, and FTI showed a tendency toward significance. There were no significant differences in thyroid-stimulating hormone (TSH), thyroxine (T4) or anti-thyroid peroxidase antibodies (anti-TPO) between the two groups. Conclusions: These results indicate that at least some cases of primary congenital hypothyroidism were attributable to maternal thyroid disease. Therefore, we recommend that every pregnant woman be assessed for thyroid function in regions with a high prevalence of thyroid disease. (author)
Herschel CHESS discovery of the fossil cloud that gave birth to the Trapezium and Orion KL
López-Sepulcre, A.; Kama, M.; Ceccarelli, C.; Dominik, C.; Caux, E.; Fuente, A.; Alonso-Albi, T.
2013-01-01
Context: The Orion A molecular complex is a nearby (420 pc), very well studied stellar nursery that is believed to contain examples of triggered star formation. Aims: As part of the Herschel guaranteed-time key programme CHESS, we present the discovery of a diffuse gas component in the foreground of the intermediate-mass protostar OMC-2 FIR 4, located in the Orion A region. Methods: Making use of the full HIFI spectrum of OMC-2 FIR 4 obtained in CHESS, we detected several ground-state lines from OH+, H2O+, HF, and CH+, all of them seen in absorption against the dust continuum emission of the protostar's envelope. We derived column densities for each species, as well as an upper limit to the column density of the undetected H3O+. In order to model and characterise the foreground cloud, we used the Meudon PDR code to run a homogeneous grid of models that spans a reasonable range of densities, visual extinctions, cosmic-ray ionisation rates and far-ultraviolet (FUV) radiation fields, and studied the implications of adopting the Orion Nebula extinction properties instead of the standard interstellar medium ones. Results: The detected absorption lines peak at a velocity of 9 km s⁻¹, which is blue-shifted by 2 km s⁻¹ with respect to the systemic velocity of OMC-2 FIR 4 (V_LSR = 11.4 km s⁻¹). The results of our modelling indicate that the foreground cloud is composed of predominantly neutral diffuse gas (n_H = 100 cm⁻³) and is heavily irradiated by an external source of FUV that most likely arises from the nearby Trapezium OB association. The cloud is 6 pc thick and bears many similarities with the so-called C+ interface between Orion-KL and the Trapezium cluster, 2 pc south of OMC-2 FIR 4. Conclusions: We conclude that the foreground cloud we detected is an extension of the C+ interface seen in the direction of Orion KL, and interpret it to be the remains of the parental cloud of OMC-1, which extends from OMC-1 up to OMC-2.
DEFF Research Database (Denmark)
Thomsen, Christa; Nielsen, Anne Ellerup
2006-01-01
This chapter first outlines theory and literature on CSR and Stakeholder Relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders ...
Serfon, Cedric; The ATLAS collaboration
2016-01-01
One of the biggest challenges in large-scale data management systems is ensuring consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software used by the ATLAS Distributed Data Management system has been extended to automatically handle lost or unregistered files (aka dark data). This system automatically detects these inconsistencies and takes actions, such as recovery or deletion of unneeded files, in a central manner. In this talk, we present this system, explain its internals and give some results.
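The catalog-versus-storage comparison at the heart of such a system reduces to set differences between two file listings. A minimal sketch in Python follows; the file names and the recovery policy are illustrative assumptions, not Rucio's actual API:

```python
def find_inconsistencies(catalog: set[str], storage: set[str]):
    """Compare the global file catalog against a storage-element listing.

    dark data: files present on storage but unknown to the catalog
    lost files: files the catalog claims but storage no longer holds
    """
    dark = storage - catalog
    lost = catalog - storage
    return dark, lost

# Hypothetical listings, for illustration only.
catalog = {"data/f1.root", "data/f2.root", "data/f3.root"}
storage = {"data/f1.root", "data/f3.root", "tmp/orphan.root"}

# dark -> candidates for deletion or re-registration;
# lost -> candidates for recovery or declaration as bad replicas.
dark, lost = find_inconsistencies(catalog, storage)
```

In practice the listings contain millions of entries, so a real implementation compares sorted dumps streamed from disk rather than in-memory sets, but the set algebra is the same.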
Surfactant modified clays’ consistency limits and contact angles
Directory of Open Access Journals (Sweden)
S Akbulut
2012-07-01
This study aimed to prepare a surfactant-modified clay (SMC) and to research the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants in order to modify its engineering properties. Seven surfactants (trimethylglycine, hydroxyethylcellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethylammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when zwitterionic, nonionic, and anionic surfactant percentages increased. However, cationic SMC was transformed from CH (high-plasticity clay) to MH (high-plasticity silt) class soils, according to the Unified Soil Classification System (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language; however, the semantics of UML models is defined in natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. This verification is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
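To illustrate the flavour of such inter-diagram consistency rules (the rule and all names below are hypothetical, not the paper's actual rule set), here is a minimal check that every message in a sequence diagram names an operation of the receiving class:

```python
# Class-diagram view: class name -> set of declared operations.
class_operations = {
    "Order":   {"create", "cancel"},
    "Invoice": {"issue"},
}

# Sequence-diagram view: (receiver class, message name) pairs.
sequence_messages = [
    ("Order",   "create"),
    ("Invoice", "issue"),
    ("Invoice", "void"),   # no such operation -> inconsistency
]

def check_consistency(ops, messages):
    """Return every message that does not match a declared operation."""
    return [(cls, msg) for cls, msg in messages
            if msg not in ops.get(cls, set())]

violations = check_consistency(class_operations, sequence_messages)
```

Real MDA tooling applies dozens of such rules over a shared model repository, but each rule reduces to a membership or reachability check of this kind.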
Bitcoin Meets Strong Consistency
Decker, Christian; Seidel, Jochen; Wattenhofer, Roger
2014-01-01
The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...
Consistent classical supergravity theories
International Nuclear Information System (INIS)
Muller, M.
1989-01-01
This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included
Consistency of orthodox gravity
Energy Technology Data Exchange (ETDEWEB)
Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)
1997-01-01
A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of the argument to more complex systems.
Quasiparticles and thermodynamical consistency
International Nuclear Information System (INIS)
Shanenko, A.A.; Biro, T.S.; Toneev, V.D.
2003-01-01
A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)
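As a sketch of the starting point, in generic notation rather than the authors' own: the finite-temperature Hellmann-Feynman theorem relates the parameter dependence of the grand thermodynamic potential to a thermal average,

```latex
\Omega(T,\mu;\lambda) \;=\; -\,T \,\ln \operatorname{Tr}\, e^{-(\hat H(\lambda)-\mu \hat N)/T},
\qquad
\left.\frac{\partial \Omega}{\partial \lambda}\right|_{T,\mu}
\;=\; \Big\langle \frac{\partial \hat H(\lambda)}{\partial \lambda} \Big\rangle .
```

In a quasiparticle model the medium-dependent parameters, e.g. an effective mass m*(T, μ), play the role of λ; requiring the pressure, entropy and energy densities derived from Ω to obey the standard thermodynamic identities then yields consistency relations of the kind the abstract refers to.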
Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan
2016-01-01
The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
Geometrically Consistent Mesh Modification
Bonito, A.
2010-01-01
A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.
Consistency argued students of fluid
Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma
2017-01-01
Problem solving for physics concepts through consistency arguments can improve students' thinking skills and is an important thing in science. The study aims to assess the consistency of students' argumentation on fluid material. The population of this study comprises college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results of the study gave an average argumentation consistency of 4.85% correctly consistent, 29.93% incorrectly consistent, and 65.23% inconsistent. These data point to a lack of understanding of the fluid material, which ideally, under fully consistent argumentation, would support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies, in order to obtain a positive change in the consistency of argumentation.
International Nuclear Information System (INIS)
Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias
2002-01-01
We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account written by one of the main figures in the field; paperback edition of a successful work on the philosophy of quantum mechanics.
DEFF Research Database (Denmark)
Kirk, O; Gerstoft, J; Pedersen, C
2001-01-01
OBJECTIVES: To assess predictors for discontinuation and treatment-limiting adverse drug reactions (TLADR) among patients starting their first protease inhibitor (PI). METHODS: Data on patients starting a PI regimen (indinavir, ritonavir, ritonavir/saquinavir and saquinavir hard gel) in a randomized ... Low body weight and initiation of ritonavir relative to other PIs were associated with an increased risk of TLADRs. Very consistent results were found in a randomized trial and an observational cohort.
International Nuclear Information System (INIS)
Mountricha, E.
2012-01-01
The subject of this thesis is the search for the Standard Model Higgs boson through its decay into four leptons with the ATLAS experiment at CERN. The theory postulating the Higgs boson is presented and the constraints from the theory and from direct and indirect searches are quoted. The ATLAS experiment and its components are described, and the Detector Control System for the operation and monitoring of the power supplies of the Monitored Drift Tubes is detailed. The electron and muon reconstruction and identification are summarized. Studies on the muon fake rates, on the effect of pileup on the isolation of the muons, and on muon efficiencies of the isolation and impact parameter requirements are presented. The analysis of the Higgs decay to four leptons is detailed, with emphasis on the background estimation, the methods employed and the control regions used. The results of the search using the 2011 √s = 7 TeV data are presented, which have led to hints of the observation of the Higgs boson. The optimization performed for the search for a low-mass Higgs boson is described and the effect on the 2011 data is shown. The analysis is then performed on the 2012 √s = 8 TeV data collected up to July and the results are presented, including the combination with the 2011 data. These latest results have led to the observation of a new particle consistent with the Standard Model Higgs. (author) [fr]
Directory of Open Access Journals (Sweden)
Salabura Piotr
2017-01-01
The HADES experiment at GSI is the only high-precision experiment probing nuclear matter in the beam energy range of a few AGeV. Pion, proton and ion beams are used to study rare dielectron and strangeness probes in order to diagnose the properties of strongly interacting matter in this energy regime. Selected results from p + A and A + A collisions are presented and discussed.
The Principle of Energetic Consistency
Cohn, Stephen E.
2009-01-01
A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
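The energy splitting described above rests on a one-line identity. Writing x̄ = E[x|y] for the conditional mean and P for the conditional covariance (generic notation assumed for this sketch):

```latex
\mathbb{E}\big[\,\|x\|^2 \,\big|\, y\,\big]
\;=\; \|\bar{x}\|^2 \;+\; \mathbb{E}\big[\,\|x-\bar{x}\|^2 \,\big|\, y\,\big]
\;=\; \|\bar{x}\|^2 \;+\; \operatorname{tr} P .
```

When the dynamical variables are the natural energy variables, ‖x‖² is (proportional to) the total energy, so conservation of E[‖x‖² | y] between observations forces the sum of the mean's energy and the total variance tr P to be constant: the principle of energetic consistency in its simplest form.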
Consistent force fields for saccharides
DEFF Research Database (Denmark)
Rasmussen, Kjeld
1999-01-01
Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x-ray ... -anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field ...
Consistency and Communication in Committees
Inga Deimen; Felix Ketelaar; Mark T. Le Quement
2013-01-01
This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...
Dynamically consistent oil import tariffs
International Nuclear Information System (INIS)
Karp, L.; Newbery, D.M.
1992-01-01
The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and the resulting tariff is found to rise at the rate of interest. This tariff has an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff is characterized and found to differ markedly from the time-inconsistent open-loop tariff. It is shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs
Energy Technology Data Exchange (ETDEWEB)
Weterings, R.A.P.M.; Koppejan, J. [TNO Milieu, Energie- en Processinnovatie TNO-MEP, Apeldoorn (Netherlands); Bergsma, G.C. [Centrum voor Energiebesparing en schone technologie CE, Delft (Netherlands); Meeusen-van Onna, M.J.G. [Landbouw Economisch Instituut LEI, Den Haag (Netherlands)
1999-12-01
The Netherlands agency for energy and the environment (Novem) commissioned a consortium to carry out the ABC (Dutch abbreviation for Waste and Biomass Conversion) project in three separate studies: (A) a scenario study of the availability of biomass and waste for energy generation in the Netherlands; (B) a 'three-level assessment' of biomass availability on national, European and global levels; and (C) a scenario study of the feasibility/profitability of energy crops in the Netherlands. The results of the ABC project are published in two separate reports. This present report gives the results of the combined scenario study of availability (A) and the three-level assessment (B). The results of the energy crops study (C) are presented elsewhere. The goal of the present project is to gain insight into the current availability of biomass and waste flows for energy generation, and into the driving forces and constraints that can affect their availability up to the year 2020. First, it is examined whether the availability of biomass and waste is or could become problematic. This is an important aspect for market parties that want to invest in energy from biomass and waste. Second, it is examined what additional policy measures the Dutch government would need to take to achieve the set policy goal of savings of fossil fuels. The combined scenario study of waste availability (in the Netherlands) and biomass availability (in the Netherlands, the European Union, and worldwide) for energy generation started off with a Definition Phase. In this phase, the project's framework and key issues were formulated and relevant sources of information were outlined. On the basis of these sources, a Quick Scan was carried out to map the existing information as well as any gaps in knowledge and uncertainties about the availability of biomass and waste. In the subsequent In-depth Phase the results of the Quick Scan were submitted to a number of national experts for comment.
Consistence of Network Filtering Rules
Institute of Scientific and Technical Information of China (English)
SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian
2004-01-01
The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow in geometric progression accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
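One classic instance of such semantic inconsistency is rule shadowing, where an earlier rule with a different action covers every packet a later rule would match, so the later rule never fires. A minimal sketch in Python, with rule fields simplified to source and destination networks (a hypothetical illustration, not the paper's set-theoretic formalization):

```python
import ipaddress
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    action: str   # "accept" or "deny"
    src: str      # source network in CIDR notation
    dst: str      # destination network in CIDR notation

def covers(a: Rule, b: Rule) -> bool:
    """True if rule a matches every packet that rule b matches."""
    return (ipaddress.ip_network(b.src).subnet_of(ipaddress.ip_network(a.src))
            and ipaddress.ip_network(b.dst).subnet_of(ipaddress.ip_network(a.dst)))

def find_shadowed(rules):
    """A later rule is shadowed if an earlier rule with a different
    action already covers all of its packets."""
    shadowed = []
    for j, rj in enumerate(rules):
        for ri in rules[:j]:
            if covers(ri, rj) and ri.action != rj.action:
                shadowed.append(rj)
                break
    return shadowed

rules = [
    Rule("deny",   "10.0.0.0/8",   "0.0.0.0/0"),  # broad deny first
    Rule("accept", "10.1.0.0/16",  "0.0.0.0/0"),  # never fires: shadowed
]
shadowed = find_shadowed(rules)
```

Full checkers also detect redundancy (same action, covered match set) and correlation (partially overlapping match sets), but all three reduce to the same subset tests over match sets.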
Van Reenen, Samantha Lynne
2012-01-01
Pregnancy and childbirth are important life experiences in a woman’s psychosocial and psychological development. For many women, vaginal birth is still considered an integral part of being a woman and becoming a mother. Furthermore, it is thought to promote maternal well-being through helping women to match their expectations to experiences. For these women, a failed natural birth can be a psychological, psychosocial, and existential challenge that can result in significant ...
'His nerves gave way': Shell shock, history and the memory of the First World War in Britain.
Reid, Fiona
2014-06-01
During the First World War soldiers suffered from a wide range of debilitating nervous complaints as a result of the stresses and strains of modern warfare. These complaints--widely known as shell shock--were the subject of much medical-military debate during the war and became emblematic of the war and its sufferings afterwards. One hundred years after the war, the diagnosis of PTSD has not resolved the issues initially raised by First World War shell shock. The stigma of mental illness remains strong and it is still difficult to commemorate and remember the mental wounds of war in a culture which tends to glorify or glamorise military heroes.
A gift to schools' literature teaching
DEFF Research Database (Denmark)
Larsen, Peter Stein
2015-01-01
While Caspar Eric's debut poetry collection, "7/11", from last year was marked by a style in which a breathless stream of references was made to Facebook, YouTube, iPhone, mp3, Macbook, Twitter and Netflix, rather than the reader learning anything about what was stirring in ...
Geomagnetism gave me my bearings
Indian Academy of Sciences (India)
Lawrence
days at the University of Delhi, but also had the opportunity to attend summer ... versity in the United States, I worked on a problem in condensed-matter physics. ... diately after my Ph.D. made me open to the idea of taking up research in an ...
Measuring process and knowledge consistency
DEFF Research Database (Denmark)
Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders
2007-01-01
When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire ...
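The knowledge-consistency measurement can be caricatured in a few lines: several stakeholders answer the same questions about the product rules, and consistency is scored as the share of questions answered unanimously. The questions and the scoring rule below are illustrative assumptions, not the paper's actual instrument:

```python
# Hypothetical questionnaire data: question -> answers from different respondents.
answers = {
    "Q1: variant A allows motor X":    ["yes", "yes", "yes"],
    "Q2: option B requires option C":  ["yes", "no",  "yes"],
    "Q3: process step 4 is optional":  ["no",  "no",  "no"],
}

def knowledge_consistency(answers: dict) -> float:
    """Fraction of questions on which all respondents agree."""
    unanimous = sum(1 for a in answers.values() if len(set(a)) == 1)
    return unanimous / len(answers)

score = knowledge_consistency(answers)  # 2 of 3 questions are unanimous
```

A low score before implementation is exactly the warning signal the abstract describes: the rule base about to be encoded reflects only one part of the practice.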
Aaron, Grant J; Friesen, Valerie M; Jungjohann, Svenja; Garrett, Greg S; Neufeld, Lynnette M; Myatt, Mark
2017-05-01
Background: Large-scale food fortification (LSFF) of commonly consumed food vehicles is widely implemented in low- and middle-income countries. Many programs have monitoring information gaps and most countries fail to assess program coverage. Objective: The aim of this work was to present LSFF coverage survey findings (overall and in vulnerable populations) from 18 programs (7 wheat flour, 4 maize flour, and 7 edible oil programs) conducted in 8 countries between 2013 and 2015. Methods: A Fortification Assessment Coverage Toolkit (FACT) was developed to standardize the assessments. Three indicators were used to assess the relations between coverage and vulnerability: 1) poverty, 2) poor dietary diversity, and 3) rural residence. Three measures of coverage were assessed: 1) consumption of the vehicle, 2) consumption of a fortifiable vehicle, and 3) consumption of a fortified vehicle. Individual program performance was assessed based on the following: 1) achieving overall coverage ≥50%, 2) achieving coverage of ≥75% in ≥1 vulnerable group, and 3) achieving equity in coverage for ≥1 vulnerable group. Results: Coverage varied widely by food vehicle and country. Only 2 of the 18 LSFF programs assessed met all 3 program performance criteria. The 2 main program bottlenecks were a poor choice of vehicle and failure to fortify a fortifiable vehicle (i.e., absence of fortification). Conclusions: The results highlight the importance of sound program design and routine monitoring and evaluation. There is strong evidence of the impact and cost-effectiveness of LSFF; however, impact can only be achieved when the necessary activities and processes during program design and implementation are followed. The FACT approach fills an important gap in the availability of standardized tools. The LSFF programs assessed here need to be re-evaluated to determine whether to further invest in the programs, whether other vehicles are appropriate, and whether other approaches are needed.
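The three program performance criteria can be expressed as a simple predicate. The numbers below and the operationalization of "equity" (vulnerable-group coverage at least as high as in the rest of the population) are assumptions made for this sketch, not values from the surveys:

```python
def meets_criteria(overall: float, vulnerable: float, non_vulnerable: float) -> bool:
    """overall: % of population consuming the fortified vehicle;
    vulnerable / non_vulnerable: coverage (%) in a vulnerable group
    and in its complement."""
    c1 = overall >= 50                 # 1) overall coverage >= 50%
    c2 = vulnerable >= 75              # 2) >= 75% coverage in a vulnerable group
    c3 = vulnerable >= non_vulnerable  # 3) equity in coverage (assumed definition)
    return c1 and c2 and c3

# Hypothetical program: 62% overall, 78% in the poorest quintile, 70% elsewhere.
ok = meets_criteria(overall=62, vulnerable=78, non_vulnerable=70)
```

Encoding the criteria this way makes it easy to see why only 2 of 18 programs passed: a program can have respectable overall coverage yet fail on the vulnerable-group threshold or on equity.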
Tarella, Corrado; Rutella, Sergio; Gualandi, Francesca; Melazzini, Mario; Scimè, Rosanna; Petrini, Mario; Moglia, Cristina; Ulla, Marco; Omedé, Paola; Bella, Vincenzo La; Corbo, Massimo; Silani, Vincenzo; Siciliano, Gabriele; Mora, Gabriele; Caponnetto, Claudia; Sabatelli, Mario; Chiò, Adriano
2010-01-01
The aim of this study was to evaluate and characterize the feasibility and safety of bone marrow-derived cell (BMC) mobilization following repeated courses of granulocyte-colony stimulating factor (G-CSF) in patients with amyotrophic lateral sclerosis (ALS). Between January 2006 and March 2007, 26 ALS patients entered a multicenter trial that included four courses of BMC mobilization at 3-month intervals. In each course, G-CSF (5 microg/kg b.i.d.) was administered for four consecutive days; 18% mannitol was also given. Mobilization was monitored by flow cytometry analysis of circulating CD34(+) cells and by in vitro colony assay for clonogenic progenitors. Co-expression by CD34(+) cells of CD133, CD90, CD184, CD117 and CD31 was also assessed. Twenty patients completed the four-course schedule. One patient died and one refused to continue the program before starting the mobilization courses; four discontinued the study protocol because of disease progression. Overall, 89 G-CSF courses were delivered. There were two severe adverse events: one prolactinoma and one deep vein thrombosis. There were no discontinuations as a result of toxic complications. Circulating CD34(+) cells were monitored during 85 G-CSF courses and were always markedly increased; the range of median peak values was 41-57/microL, with no significant differences among the four G-CSF courses. Circulating clonogenic progenitor levels paralleled CD34(+) cell levels. Most mobilized CD34(+) cells co-expressed stem cell markers, with a significant increase in CD133 co-expression. It is feasible to deliver repeated courses of G-CSF to mobilize a substantial number of CD34(+) cells in patients with ALS; mobilized BMC include immature cells with potential clinical usefulness.
Coordinating user interfaces for consistency
Nielsen, Jakob
2001-01-01
In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys
Directory of Open Access Journals (Sweden)
Kim Han S
2005-11-01
Background: This study assesses the consistency of responses among women regarding their beliefs about the mechanisms of action of birth control methods, beliefs about when human life begins, the intention to use or not use birth control methods that they believe may act after fertilization or implantation, and their reported use of specific methods. Methods: A questionnaire was administered in family practice and obstetrics and gynecology clinics in Salt Lake City, Utah, and Tulsa, Oklahoma. Participants included women ages 18–50 presenting for any reason and women under age 18 presenting for family planning or pregnancy care. Analyses were based on key questions addressing beliefs about whether specific birth control methods may act after fertilization, beliefs about when human life begins, intention to use a method that may act after fertilization, and reported use of specific methods. The questionnaire contained no information about the mechanism of action of any method of birth control. Responses were considered inconsistent if actual use contradicted intentions, if one intention contradicted another, or if intentions contradicted beliefs. Results: Of all respondents, 38% gave consistent responses about intention to not use or to stop use of any birth control method that acted after fertilization, while 4% gave inconsistent responses. The corresponding percentages for birth control methods that work after implantation were 64% consistent and 2% inconsistent. Of all respondents, 34% reported they believed that life begins at fertilization and would not use any birth control method that acts after fertilization (a consistent response), while 3% reported they believed that life begins at fertilization but would use a birth control method that acts after fertilization (inconsistent). For specific methods of birth control, less than 1% of women gave inconsistent responses. A majority of women (68% or greater) responded accurately about the
Choice, internal consistency, and rationality
Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu
2010-01-01
The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...
International Nuclear Information System (INIS)
Rafelski, J.
1979-01-01
After an introductory overview of the bag model the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the virial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI)
Consistency Results for the ROC Curves of Fused Classifiers
National Research Council Canada - National Science Library
Bjerkaas, Kristopher
2004-01-01
An established performance quantifier is the Receiver Operating Characteristic (ROC) curve, which allows one to view the probability of detection versus the probability of false alarm in one graph...
Time-consistent and market-consistent evaluations
Pelsser, A.; Stadje, M.A.
2014-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Market-consistent actuarial valuation
Wüthrich, Mario V
2016-01-01
This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
Consistent guiding center drift theories
International Nuclear Information System (INIS)
Wimmel, H.K.
1982-04-01
Various guiding-center drift theories are presented that are optimized with respect to consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of guiding-center drift theories are presented. (orig.)
Weak consistency and strong paraconsistency
Directory of Open Access Journals (Sweden)
Gemma Robles
2009-11-01
In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ (“E contradictione quodlibet”) rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.
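The ECQ rule referred to above can be stated schematically (standard notation, assumed rather than quoted from the abstract):

```latex
% Ex contradictione quodlibet: any formula B follows from a contradiction.
A \wedge \neg A \vdash B
% A paraconsistent logic is one in which this rule fails; the paper's weak
% consistency notions correspondingly weaken "absence of any contradiction".
```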
Glass consistency and glass performance
International Nuclear Information System (INIS)
Plodinec, M.J.; Ramsey, W.G.
1994-01-01
Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has to long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not by glass durability.
Time-consistent actuarial valuations
Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.
2016-01-01
Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an
Replica consistency in a Data Grid
International Nuclear Information System (INIS)
Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt
2004-01-01
A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented
Consistently violating the non-Gaussian consistency relation
International Nuclear Information System (INIS)
Mooij, Sander; Palma, Gonzalo A.
2015-01-01
Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations
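The consistency relation at issue can be written down for orientation. This is the standard single-field attractor (Maldacena) form, stated here as background rather than quoted from the abstract:

```latex
% Squeezed-limit consistency relation for single-field attractor inflation:
\lim_{k_3 \to 0} B_\zeta(k_1, k_2, k_3) = (1 - n_s)\, P_\zeta(k_1)\, P_\zeta(k_3)
% Non-attractor (e.g. ultra-slow-roll) backgrounds violate this form because
% \zeta keeps evolving outside the horizon; the abstract's point is that a
% modified relation, fixed by the background evolution, still holds.
```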
International Nuclear Information System (INIS)
Hazeltine, R.D.
1988-12-01
The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig.
Lagrangian multiforms and multidimensional consistency
Energy Technology Data Exchange (ETDEWEB)
Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)
2009-10-30
We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.
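The closure relation can be sketched schematically (notation assumed for illustration, not taken from the abstract):

```latex
% The action is a sum of a discrete Lagrangian 2-form over the plaquettes
% \sigma_{ij} of a surface \sigma embedded in the multidimensional lattice:
S[u; \sigma] = \sum_{\sigma_{ij} \in \sigma} \mathcal{L}_{ij}
% The closure relation states that, on solutions, the sum of \mathcal{L}
% over the faces of an elementary cube vanishes,
\Delta \mathcal{L} = 0,
% so that S is invariant under local deformations of the surface \sigma --
% the Lagrangian analogue of multidimensional consistency.
```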
Deep Feature Consistent Variational Autoencoder
Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping
2016-01-01
We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
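The feature-consistency idea can be sketched numerically. The `feature_loss` below replaces pixel-by-pixel loss with a sum of squared differences between hidden representations; the fixed random linear maps stand in for layers of a pretrained CNN such as VGG (an assumption for illustration only, the paper uses actual pretrained convolutional features):

```python
import numpy as np

def feature_loss(x, x_hat, extractors):
    """Deep-feature-consistent loss: sum of mean squared differences
    between hidden representations of input and reconstruction.
    `extractors` stands in for layers of a pretrained network; here they
    are fixed random linear maps, for illustration only."""
    loss = 0.0
    for W in extractors:
        f, f_hat = W @ x, W @ x_hat
        loss += np.mean((f - f_hat) ** 2)
    return loss

rng = np.random.default_rng(0)
extractors = [rng.normal(size=(16, 64)) for _ in range(3)]  # three "layers"
x = rng.normal(size=64)                           # flattened input image
loss_same = feature_loss(x, x, extractors)        # perfect reconstruction
loss_diff = feature_loss(x, x + 0.5, extractors)  # perturbed reconstruction
print(loss_same, loss_diff)
```

A perfect reconstruction incurs zero feature loss, while any perturbation is penalized through every "layer", which is what pushes the VAE output toward perceptually faithful reconstructions.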
Maintaining consistency in distributed systems
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group-oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
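The mutual exclusion constructs mentioned above can be illustrated with a minimal sketch, with Python's `threading.Lock` standing in for a semaphore or monitor:

```python
import threading

# A shared counter updated by several threads; the lock provides the
# mutual exclusion the text describes. Without it, concurrent
# read-modify-write cycles could interleave and lose updates.
counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:        # critical section: one thread at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
```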
Consistent histories and operational quantum theory
International Nuclear Information System (INIS)
Rudolph, O.
1996-01-01
In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail
Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D
This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
Decentralized Consistent Updates in SDN
Nguyen, Thanh Dang
2017-04-10
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
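A drastically simplified version of the update-ordering problem can be sketched as follows. This is not the ez-Segway mechanism itself, only the classic sufficient rule of installing a new path from the destination backwards, so that a switch is never updated before the suffix it forwards into:

```python
def update_order(new_path):
    """Return a per-switch schedule for installing `new_path` (a list of
    switches, source..destination): install next-hop rules from the
    destination backwards, so each updated switch forwards into an
    already-updated suffix and no blackhole or loop appears.
    A simplified sufficient rule, not the ez-Segway protocol."""
    hops = list(zip(new_path, new_path[1:]))  # (switch, next-hop) pairs
    return list(reversed(hops))

order = update_order(["s1", "s2", "s3", "s4"])
print(order)  # [('s3', 's4'), ('s2', 's3'), ('s1', 's2')]
```

ez-Segway decentralizes the scheduling of such dependencies to the switches themselves via message passing; the point of the sketch is only the dependency direction that makes an update consistent.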
Consistent Visual Analyses of Intrasubject Data
Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli
2010-01-01
Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…
Consistent feeding positions of great tit parents
Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.
2006-01-01
When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is
Consistent ranking of volatility models
DEFF Research Database (Denmark)
Hansen, Peter Reinhard; Lunde, Asger
2006-01-01
We show that the empirical ranking of volatility models can be inconsistent for the true ranking if the evaluation is based on a proxy for the population measure of volatility. For example, the substitution of a squared return for the conditional variance in the evaluation of ARCH-type models can...... variance in out-of-sample evaluations rather than the squared return. We derive the theoretical results in a general framework that is not specific to the comparison of volatility models. Similar problems can arise in comparisons of forecasting models whenever the predicted variable is a latent variable....
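The role of the volatility proxy can be illustrated with a small simulation: the squared return is unbiased for the conditional variance on average but extremely noisy observation by observation, which is what can distort loss-based rankings of forecasts (a sketch of the proxy problem, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
h = 0.5 + 0.5 * rng.uniform(size=n)      # "true" conditional variances
r = np.sqrt(h) * rng.standard_normal(n)  # returns drawn with those variances
proxy = r ** 2                           # squared-return proxy for h

# The proxy is unbiased (its mean matches the mean of h) but very noisy
# per observation; evaluating forecasts against it instead of h is what
# can make an empirical ranking inconsistent under non-robust losses.
print(h.mean(), proxy.mean())        # close to each other
print(np.mean((proxy - h) ** 2))     # large per-observation noise
```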
Self-consistent asset pricing models
Malevergne, Y.; Sornette, D.
2007-08-01
We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
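The self-consistency conditions have a simple numerical illustration: when the "market" return is built from the very assets being regressed on it, the weighted betas must sum to one and the weighted alphas to zero. This is a sketch with simulated returns, not the paper's S&P500 data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_assets, n_obs = 5, 1000
R = rng.normal(0.01, 0.05, size=(n_obs, n_assets))  # asset returns
w = np.full(n_assets, 1.0 / n_assets)               # market weights
rm = R @ w                                          # market built from the assets

# OLS slope and intercept of each asset on the market portfolio
betas = np.array([np.cov(R[:, i], rm)[0, 1] / np.var(rm, ddof=1)
                  for i in range(n_assets)])
alphas = R.mean(axis=0) - betas * rm.mean()

# Self-consistency: since rm = sum_i w_i r_i, covariance is linear in its
# first argument, so sum_i w_i beta_i = 1 and sum_i w_i alpha_i = 0 exactly.
print(w @ betas, w @ alphas)
```

These are sample analogues of the "normality" conditions the abstract mentions; they hold as identities whenever the proxy portfolio is composed exactly of the regressed assets.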
Guided color consistency optimization for image mosaicking
Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li
2018-01-01
This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among images under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color difference between images is large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose a histogram extreme point matching algorithm that is robust to image geometric misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics are transferred to the others via the paths of graph analysis. Thus, the final results of global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones sufficiently demonstrate that the proposed approach achieves as good or even better results compared with the state-of-the-art approaches.
Towards thermodynamical consistency of quasiparticle picture
International Nuclear Information System (INIS)
Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest
2003-01-01
The purpose of the present article is to call attention to some realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. A particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics
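A common form of the thermodynamical consistency condition for quasiparticle models with a medium-dependent mass, stated here as background rather than quoted from the abstract, is:

```latex
% Pressure with a temperature-dependent quasiparticle mass m(T) and a
% bag-like function B(T):
p(T) = p_{\mathrm{id}}\bigl(T, m(T)\bigr) - B(T)
% Consistency (so that s = \partial p / \partial T reproduces the
% quasiparticle entropy) fixes B(T) through
\frac{\partial p_{\mathrm{id}}}{\partial m^2}\,
\frac{\mathrm{d} m^2}{\mathrm{d} T} = \frac{\mathrm{d} B}{\mathrm{d} T}
```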
Toward thermodynamic consistency of quasiparticle picture
International Nuclear Information System (INIS)
Biro, T.S.; Toneev, V.D.; Shanenko, A.A.
2003-01-01
The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics
International Nuclear Information System (INIS)
Shepard, J.R.
1991-01-01
The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data
Personalized recommendation based on unbiased consistence
Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao
2015-08-01
Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
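The unidirectional mass-diffusion step the letter builds on can be sketched on a toy user-item network: resource placed on a user's collected items spreads to users and back to items, conserving total mass. This illustrates the baseline (ProbS-style) diffusion, not the authors' bidirectional algorithm:

```python
import numpy as np

# Toy bipartite network: rows = users, columns = items;
# a[u, i] = 1 if user u has collected item i.
a = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)

def mass_diffusion_scores(a, user):
    """One round of mass diffusion for `user`: unit resource on the
    user's collected items, spread items -> users -> items, dividing
    by item and user degrees at each step."""
    k_item = a.sum(axis=0)        # item degrees
    k_user = a.sum(axis=1)        # user degrees
    f = a[user].copy()            # initial resource on collected items
    g = a @ (f / k_item)          # step 1: items -> users
    return a.T @ (g / k_user)     # step 2: users -> items (final scores)

scores = mass_diffusion_scores(a, user=0)
print(scores)
# Total resource is conserved through both diffusion steps:
print(scores.sum(), a[0].sum())
```

Items not yet collected by the user are then ranked by score; the letter's contribution is to run the diffusion in both directions and combine the results.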
Consistency relations in effective field theory
Energy Technology Data Exchange (ETDEWEB)
Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)
2017-06-01
The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression of EFT relative to SPT results, which scales as the square of the wavenumber k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed-limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.
Self-consistency in Capital Markets
Benbrahim, Hamid
2013-03-01
Capital markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.
Directory of Open Access Journals (Sweden)
Teferra Alemayehu
2012-07-01
Background: Reduction of maternal mortality is a global priority, particularly in developing countries including Ethiopia, where the maternal mortality ratio is one of the highest in the world. The key to reducing the maternal mortality ratio and improving maternal health is increasing attendance by skilled health personnel throughout pregnancy and delivery. However, delivery service use is significantly lower in Amhara Regional State, Ethiopia. Therefore, this study aimed to assess factors affecting institutional delivery service utilization among mothers who gave birth in the last 12 months in Sekela District, Amhara Region, Ethiopia. Methods: A community-based cross-sectional study was conducted among mothers with a birth in the last 12 months during August 2010. A multistage sampling technique was used to select 371 participants. A pretested, structured questionnaire was used to collect data. Bivariate and multivariate data analysis was performed using SPSS version 16.0 software. Results: The study indicated that 12.1% of the mothers delivered in health facilities. Of the 87.9% of mothers who gave birth at home, 80.0% were assisted by family members and relatives. The common reasons for home delivery were closer attention from family members and relatives (60.9%), home delivery being the usual practice (57.7%), unexpected labour (33.4%), not being sick or having no problem at the time of delivery (21.6%), and family influence (14.4%). Being an urban resident (AOR [95% CI] = 4.6 [1.91, 10.9]), ANC visit during last pregnancy (AOR [95% CI] = 4.26 [1.1, 16.4]), maternal education level (AOR [95% CI] = 11.98 [3.36, 41.4]) and knowledge of mothers on pregnancy and delivery services (AOR [95% CI] = 2.97 [1.1, 8.6]) had significant associations with institutional delivery service utilization. Conclusions: Very low institutional delivery service utilization was observed in the study area. The majority of births at home were assisted by family members and relatives. ANC visit and lack of
Directory of Open Access Journals (Sweden)
Luisa Matilde Salamanca-Duque
2014-09-01
Introduction: ADHD is one of the most common diagnoses in child psychiatry; its early diagnosis is of great importance for intervention in the family, school and social environment. Based on the International Classification of Functioning, Disability and Health (ICF), a questionnaire was designed to assess activity limitations and participation restrictions in children with ADHD. The questionnaire was called “CLARP-ADHD Parent and Teacher Version”. Objective: To determine the degree of internal consistency of the CLARP-ADHD questionnaire, and its concurrent validity with the “Strengths and Difficulties Questionnaire (SDQ) parent and teacher version”. Material and Methods: A sample of 203 children aged 6 to 12 with ADHD, currently attending school in five Colombian cities. The questionnaires were applied to parents and teachers. The internal consistency analysis was performed with Cronbach's coefficient, and concurrent validity was assessed using the Spearman correlation coefficient, with multiple and unique predictors through multiple linear regression as well as simple regression models. Results: High internal consistency was found for the global questionnaires and for each of their domains. The CLARP-ADHD for parents yielded an internal consistency of 0.83, and the CLARP-ADHD for teachers one of 0.93. Concurrent validity was found between the CLARP-ADHD and the SDQ parent and teacher versions; concurrence was also found between the CLARP-ADHD for teachers and the SDQ for teachers, as well as between the CLARP-ADHD for parents and the CLARP-ADHD for teachers, given by p values of p < 0.001.
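The internal consistency statistic used here, Cronbach's coefficient, can be computed from the standard formula in a few lines. The data below are hypothetical, standing in for the CLARP-ADHD item scores:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5 respondents x 4 questionnaire items (not the study's data)
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [1, 2, 1, 1],
                   [3, 3, 4, 3]])
print(round(cronbach_alpha(scores), 2))
```

Items that move together inflate the variance of the totals relative to the sum of item variances, which is what drives alpha toward 1 for internally consistent scales such as the 0.83 and 0.93 reported above.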
A consistent interpretation of quantum mechanics
International Nuclear Information System (INIS)
Omnes, Roland
1990-01-01
Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character based upon Griffiths' consistent histories. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and accordingly to prove the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, though one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation, but they can be proved never to give rise to any logical inconsistency or paradox. (author)
Salomon, Jérôme; Schnitzler, Alexis; Ville, Yves; Laffont, Isabelle; Perronne, Christian; Denys, Pierre; Bernard, Louis
2009-05-01
Pregnancies in spinal cord-injured (SCI) patients present unique clinical challenges. Because of the neurogenic bladder and the use of intermittent catheterization, chronic bacteriuria and recurrent urinary tract infection (UTI) are common. During pregnancy the prevalence of UTI increases dramatically. Recurrent UTI requires multiple courses of antibiotics and increases the risks of abortion, prematurity, and low birth weight. A weekly oral cyclic antibiotic (WOCA) program was recently described for the prevention of UTI in SCI patients. The aim was to test the impact of WOCA in six SCI pregnant women (four paraplegic, two tetraplegic). This was a prospective observational study. WOCA consists of the alternate administration of one of two antibiotics once per week. We observed a significant reduction of UTI (from 6 UTI/patient/year before pregnancy to 0.4 during pregnancy under WOCA). WOCA appears to be a promising option for UTI prophylaxis in SCI pregnant women.
Directory of Open Access Journals (Sweden)
Ana Ashtalkovska Gajtanoska
2017-08-01
Full Text Available This text examines the experiences of several women ethnologists/anthropologists with regard to women’s inheritance rights and the traditional practices used in the contemporary context in Macedonia. Women’s inheritance rights and traditional norms, which, according to the ideal model, recommend that a woman cannot be an heir to immovable property, are among the main associations of patriarchy in the Balkans. The women interlocutors in this research consistently maintain the thesis that the term “patriarchy” is inadequate for describing in general terms the status of women in the radically divided periods of the traditional past or the contemporary context. This situation therefore poses a great methodological challenge in the context of the research theme, when the experiences from the everyday life of the researchers seem to contradict their theses with regard to patriarchy.
Orthology and paralogy constraints: satisfiability and consistency.
Lafond, Manuel; El-Mabrouk, Nadia
2014-01-01
A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e., can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm for the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
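For the special case of a full set of relations, satisfiability has a well-known graph characterization: the orthology graph must be a cograph, i.e. contain no induced path on four vertices (P4). A brute-force sketch of that check, intended only as an illustration rather than the paper's polynomial-time algorithms:

```python
from itertools import permutations

def has_induced_p4(adj):
    """Return True if the undirected graph (dict vertex -> set of neighbours)
    contains an induced path on four vertices a-b-c-d.
    Brute force: only suitable for small gene families."""
    for a, b, c, d in permutations(adj, 4):
        if (b in adj[a] and c in adj[b] and d in adj[c]
                and c not in adj[a] and d not in adj[a] and d not in adj[b]):
            return True
    return False

def full_orthology_satisfiable(adj):
    # A full orthology relation can be realised by some event-labelled gene
    # tree iff its orthology graph is a cograph, i.e. has no induced P4.
    return not has_induced_p4(adj)

# Orthology graph on four genes forming an induced path w-x-y-z:
# no event-labelled gene tree induces exactly these relations.
p4 = {"w": {"x"}, "x": {"w", "y"}, "y": {"x", "z"}, "z": {"y"}}
print(full_orthology_satisfiable(p4))  # False
```

The gene names here are hypothetical; for partial relation sets (the general case studied in the paper), this simple check no longer suffices and the graph sandwich machinery is needed.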
Consistency Checking of Web Service Contracts
DEFF Research Database (Denmark)
Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter
2008-01-01
Behavioural properties are analyzed for web service contracts formulated in the Business Process Execution Language (BPEL) and the Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult-to-analyze fault…
Evaluating Temporal Consistency in Marine Biodiversity Hotspots.
Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large-scale, public monitoring dataset collected over an eight-year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight-year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for more than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of the biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
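The per-cell diversity metric and threshold-based hotspot designation described above can be sketched as follows; the grid cells and abundance counts are invented for illustration, and the simple top-fraction rule stands in for the paper's spatial frequency distribution method:

```python
import math

def shannon_diversity(counts):
    """Shannon's H' from species abundance counts in one grid cell."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def hotspot_cells(cell_diversity, top_fraction=0.1):
    """Flag cells whose diversity falls in the upper tail of the
    distribution (here simply the top `top_fraction` of cells)."""
    ranked = sorted(cell_diversity, key=cell_diversity.get, reverse=True)
    n_hot = max(1, int(len(ranked) * top_fraction))
    return set(ranked[:n_hot])

# Hypothetical trawl counts per grid cell (species abundances).
cells = {
    "A": [10, 10, 10, 10],  # even community, high H'
    "B": [37, 1, 1, 1],     # dominated community, low H'
    "C": [20, 15, 5],
    "D": [40],              # single species, H' = 0
}
diversity = {cell: shannon_diversity(c) for cell, c in cells.items()}
print(hotspot_cells(diversity, top_fraction=0.25))  # {'A'}
```

Repeating such a designation per survey year and counting how often each cell is flagged gives the kind of temporal-consistency measure the study evaluates.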
Self-consistent gravitational self-force
International Nuclear Information System (INIS)
Pound, Adam
2010-01-01
I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.
Consistent mutational paths predict eukaryotic thermostability
Directory of Open Access Journals (Sweden)
van Noort Vera
2013-01-01
Full Text Available Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability, and we validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
Consistency of extreme flood estimation approaches
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
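The first of the three approaches (ordinary extreme value statistics) can be illustrated with a minimal Gumbel (EV1) fit by the method of moments. The annual peak discharges below are invented for illustration, and this simple fit is not the SCHADEX procedure:

```python
import math
import statistics

def gumbel_return_level(annual_maxima, return_period):
    """Fit a Gumbel (EV1) distribution to annual maximum flows by the
    method of moments and return the flood quantile for return period T
    (exceeded on average once every T years)."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi       # scale parameter
    mu = mean - 0.5772156649 * beta           # location (Euler-Mascheroni constant)
    p = 1 - 1 / return_period                 # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual peak discharges (m^3/s) for a small catchment.
peaks = [120, 95, 140, 110, 160, 130, 105, 150, 125, 115]
q100 = gumbel_return_level(peaks, 100)
```

By construction the 100-year quantile extrapolates beyond the largest observed peak, which is exactly why cross-checking such statistical estimates against simulation-based and deterministic methods, as the study does, is informative.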
Consistent biokinetic models for the actinide elements
International Nuclear Information System (INIS)
Leggett, R.W.
2001-01-01
The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood, and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to the bone surface and depict a one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and in the interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with the results of comparative biokinetic data on the different actinide elements. (author)
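The generic framework described above (burial of surface activity into bone volume, recycling back to blood, excretion linked to retention) can be illustrated with a toy compartment model. The compartments and transfer coefficients below are invented for illustration and are not ICRP parameter values:

```python
# Toy recycling biokinetic model: blood, bone surface, bone volume, excreta.
# Transfer coefficients (per day) are illustrative only.
rates = {
    ("blood", "surface"): 0.3,
    ("blood", "excreta"): 0.1,
    ("surface", "volume"): 0.05,   # gradual burial of surface activity
    ("surface", "blood"): 0.02,    # recycling back to blood
    ("volume", "blood"): 0.001,
}

def simulate(days, dt=0.1):
    """Forward-Euler integration of first-order transfers, starting with
    unit activity in blood."""
    state = {"blood": 1.0, "surface": 0.0, "volume": 0.0, "excreta": 0.0}
    for _ in range(int(days / dt)):
        flows = {(s, d): state[s] * k * dt for (s, d), k in rates.items()}
        for (s, d), q in flows.items():
            state[s] -= q
            state[d] += q
    return state

final = simulate(365)
```

Unlike the older one-directional models criticized above, the ("surface", "blood") and ("volume", "blood") entries let buried and surface-bound activity return to circulation, which is what ties excretion to retention in the updated framework.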
Crawford, Jacob E; Alves, Joel M; Palmer, William J; Day, Jonathan P; Sylla, Massamba; Ramasamy, Ranjan; Surendran, Sinnathamby N; Black, William C; Pain, Arnab; Jiggins, Francis M
2017-02-28
The mosquito Aedes aegypti is the main vector of dengue, Zika, chikungunya and yellow fever viruses. This major disease vector is thought to have arisen when the African subspecies Ae. aegypti formosus evolved from being zoophilic and living in forest habitats into a form that specialises on humans and resides near human population centres. The resulting domestic subspecies, Ae. aegypti aegypti, is found throughout the tropics and largely blood-feeds on humans. To understand this transition, we have sequenced the exomes of mosquitoes collected from five populations from around the world. We found that Ae. aegypti specimens from an urban population in Senegal in West Africa were more closely related to populations in Mexico and Sri Lanka than they were to a nearby forest population. We estimate that the populations in Senegal and Mexico split just a few hundred years ago, and we found no evidence of Ae. aegypti aegypti mosquitoes migrating back to Africa from elsewhere in the tropics. The out-of-Africa migration was accompanied by a dramatic reduction in effective population size, resulting in a loss of genetic diversity and rare genetic variants. We conclude that a domestic population of Ae. aegypti in Senegal and domestic populations on other continents are more closely related to each other than to other African populations. This suggests that an ancestral population of Ae. aegypti evolved to become a human specialist in Africa, giving rise to the subspecies Ae. aegypti aegypti. The descendants of this population are still found in West Africa today, and the rest of the world was colonised when mosquitoes from this population migrated out of Africa. This is the first report of an African population of Ae. aegypti aegypti mosquitoes that is closely related to Asian and American populations. As the two subspecies differ in their ability to vector disease, their existence side by side in West Africa may have important implications for disease transmission.
Consistency Anchor Formalization and Correctness Proofs
Miguel, Correia; Bessani, Alysson
2014-01-01
This report contains the formal proofs of the techniques for increasing the consistency of cloud storage presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...
Directory of Open Access Journals (Sweden)
Ehsan Jozaghi
2014-03-01
Full Text Available Although numerous studies on heroin-assisted treatment (HAT) have been published in leading international journals, little attention has been given to HAT's clients, their stories, and what constitutes the most influential factor in the treatment process. The present study investigates the role of HAT in transforming the lives of injection drug users (IDUs) in Vancouver, Canada. This qualitative study focuses on 16 in-depth interviews with patients from the randomized trials of HAT. Interviews were transcribed verbatim and analyzed thematically using NVivo 10 software. The findings revealed a positive change in many respects: the randomized trials reduced criminal activity, sex work, and illicit drug use. In addition, the trials improved the health and social functioning of their clients, with some participants acquiring work or volunteer positions. Many of the participants have been able to reconnect with their family members, which was not possible before the program. Furthermore, the relationship between the staff and patients at the project appears to have transformed the behavior of participants. Attending HAT in Vancouver has been particularly effective in creating a unique microenvironment where IDUs who have attended HAT have been able to form a collective identity advocating for their rights. The results of this research point to the need for continuation of the project beyond the current study, leading toward a permanent program.
Non linear self consistency of microtearing modes
International Nuclear Information System (INIS)
Garbet, X.; Mourgues, F.; Samain, A.
1987-01-01
The self-consistency of a microtearing turbulence is studied in non-linear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via Ampere's law results from the combined action of the radial electric field, in the frame where the island chains are static, and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i², where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible.
Thermodynamically consistent model calibration in chemical kinetics
Directory of Open Access Journals (Sweden)
Goutsias John
2011-05-01
Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
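One standard form of the thermodynamic constraints mentioned above is the Wegscheider cycle condition: around any closed reaction loop, the product of forward rate constants must equal the product of reverse rate constants. A minimal check of that condition, with invented rate constants (this is a sketch of the constraint itself, not the TCMC algorithm):

```python
import math

def wegscheider_consistent(cycle, tol=1e-9):
    """cycle: list of (k_forward, k_reverse) pairs for the reactions that
    form a closed loop. Detailed balance requires prod(kf) / prod(kr) == 1,
    checked here in log space for numerical robustness."""
    log_ratio = sum(math.log(kf) - math.log(kr) for kf, kr in cycle)
    return abs(log_ratio) < tol

# Hypothetical 3-reaction loop A <-> B <-> C <-> A.
consistent_cycle = [(2.0, 1.0), (3.0, 4.0), (2.0, 3.0)]  # 2*3*2 / (1*4*3) = 1
broken_cycle = [(2.0, 1.0), (3.0, 4.0), (2.0, 1.0)]      # ratio = 3, infeasible
print(wegscheider_consistent(consistent_cycle),
      wegscheider_consistent(broken_cycle))  # True False
```

In a constrained-optimization calibration, equalities of this form (one per independent cycle) are exactly the kind of constraint the fitted rate constants must satisfy.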
Plants Africa gave to the World
Directory of Open Access Journals (Sweden)
G. Kunkel
1983-11-01
Full Text Available Although the flora of Africa is rather poor in plant species when compared to the floras of Tropical America or South-east Asia, this vast continent is the home of a wide range of plants useful to Man. Many of these have become famous in cultivation around the world. Coffee now provides an important source of income for certain countries, and the Yams yield one of the world's staple foods. The Oil Palm and Cola trees are widely cultivated in Africa itself and elsewhere. African Mahoganies and Ironwoods are much sought-after timber trees of excellent quality. Numerous grasses and pulses are well-known for their food value, and some of the native Cucurbitaceae are appreciated additions to our vegetable diet. African plants have also made their contribution to horticulture, ranging from world-famous trees such as the African or Gabon Tulip tree and many of the South African species of Proteaceae to the multitude of East and South African succulents. The present paper provides a survey of the most important of these useful plants and will emphasize the need for further research for forestry and agricultural as well as horticultural purposes, especially as far as some still little-known but potentially important plant species are concerned.
Foods The Indians Gave Us. Coloring Book.
Hail, Raven
This children's coloring book devotes a page to each of twenty of the most familiar American Indian plant foods: avocado, green beans, black walnuts, cocoa, corn, peanuts, pecans, chile peppers, pineapples, popcorn, potatoes, pumpkins, squash, strawberries, sugar maple, sunflowers, sweet potatoes, tapioca, tomatoes, and vanilla. Illustrating each…
They gave their names to science
National Research Council Canada - National Science Library
Halacy, D. S
1967-01-01
...: Ernst Mach and his number, Gregor Mendel and his laws, Christian Johann Doppler and his effect, Hans Geiger and his radiation counter, Nicholas Sadi Carnot and thermodynamics, Gustave Gaspard de...
A Consistent Phylogenetic Backbone for the Fungi
Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt
2012-01-01
The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST-encoded data, a common practice in phylogenomic analyses, introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
Consistency based correlations for tailings consolidation
Energy Technology Data Exchange (ETDEWEB)
Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering
2010-07-01
The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. Managing these containment facilities for several decades beyond mine closure poses unique challenges that result from the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases, causing public concern over the conventional practice of tailings disposal. Therefore, to minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency, which captures physicochemical interactions. The paper discussed the large-strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large-strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility, whereas the liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.
The Consistency Between Clinical and Electrophysiological Diagnoses
Directory of Open Access Journals (Sweden)
Esra E. Okuyucu
2009-09-01
Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests originated, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists: 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients had findings related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. There was no statistically significant difference in the agreement between referral and electrophysiological diagnoses across the referring clinics (p = 0.794), but there were statistically significant differences in the degree of support for different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG
A new approach to hull consistency
Directory of Open Access Journals (Sweden)
Kolev Lubomir
2016-06-01
Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists of treating several equations simultaneously with respect to the same number of variables.
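The scalar narrowing step behind hull consistency can be sketched for a single constraint x + y = c over intervals: the domain of x is intersected with c - y. This is an illustrative fragment only, not the poster's simultaneous multi-equation method:

```python
def hull_narrow_sum(x, y, total):
    """Narrow the interval x = (lo, hi) using the constraint x + y = total,
    where y and total are also intervals: x is intersected with total - y."""
    lo = max(x[0], total[0] - y[1])
    hi = min(x[1], total[1] - y[0])
    if lo > hi:
        raise ValueError("empty interval: constraint set is inconsistent")
    return (lo, hi)

# x + y = 5 with x in [0, 10] and y in [2, 3] forces x into [2, 3].
x = hull_narrow_sum((0.0, 10.0), (2.0, 3.0), (5.0, 5.0))
print(x)  # (2.0, 3.0)
```

Iterating such narrowing steps over every equation and variable is the scalar scheme the poster describes; its proposed generalization applies the contraction to several equations and variables at once.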
A consistent thermodynamic database for cement minerals
International Nuclear Information System (INIS)
Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.
2010-01-01
Document available in extended abstract form only. In the context of waste confinement and, more specifically, waste from the nuclear industry, concrete is used both as a confinement and as a building material. The exposure to high temperatures makes its geochemical behaviour difficult to predict over long periods of time. The present work aims to elucidate the temperature dependency of the thermodynamic functions related to minerals from the concrete or associated with some of its degradation products. Addressing these functions precisely is a key issue in order to investigate correctly the cement/clay interaction from a geochemical point of view. A large set of experimental data has been collected for the chemical systems CaO-SiO2-H2O, SO3-Al2O3-CaO-CO2-Cl-H2O and SiO2-Al2O3-CaO-H2O, including iron- and magnesium-bearing phases. Data include calorimetric measurements, when available, and results from equilibration experiments. The stability of C-S-H phases was considered as a specific issue, as those phases may appear both as amorphous and as crystalline minerals. In addition, the composition of amorphous minerals is still under debate. The phase diagram of crystalline phases was refined, providing the thermodynamic functions of most of the main minerals. Then, we were able to build up a polyhedral model from the refined properties. The composition and the equilibrium constants of amorphous C-S-H at room temperature were derived from a large set of equilibration data. Finally, the thermodynamic functions were completed by using the polyhedral model, both for amorphous and crystalline phases, in some cases. A verification test, based on reaction enthalpies derived from experimental data, indicates that predicted values for amorphous C-S-H are in close agreement with experimental data. For phases other than C-S-H, we proceeded for each mineral in the following way: - the equilibrium constant at 25 °C is selected from a single experimental
Evaluating the hydrological consistency of evaporation products
Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew
2017-01-01
… Interestingly, after imposing a simple lag in GRACE data to account for delayed surface runoff or baseflow components, an improved match in terms of degree correlation was observed in the Niger River basin. Significant improvements to the degree correlations (from ∼ 0 to about 0.6) were also found in the Colorado River basin for both the CSIRO-PML and GLEAM products, while MOD16 showed only half of that improvement. In other basins, the variability in the temporal pattern of degree correlations remained considerable and hindered any clear differentiation between the evaporation products. Even so, it was found that a constant lag of 2 months provided a better fit compared to other alternatives, including a zero lag. From a product assessment perspective, no significant or persistent advantage could be discerned across any of the three evaporation products in terms of a sustained hydrological consistency with precipitation and water storage anomaly data. As a result, our analysis has implications in terms of the confidence that can be placed in independent retrievals of the hydrological cycle, raises questions on inter-product quality, and highlights the need for additional techniques to evaluate large-scale products.
Personality consistency analysis in cloned quarantine dog candidates
Directory of Open Access Journals (Sweden)
Jin Choi
2017-01-01
In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups following the modified Puppy Aptitude Test, which consists of ten subtests, to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and that some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned dogs. This study implies that personality consistency could be one of the ways to analyse traits of puppies.
Student Effort, Consistency, and Online Performance
Patron, Hilde; Lopez, Salvador
2011-01-01
This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…
Translationally invariant self-consistent field theories
International Nuclear Information System (INIS)
Shakin, C.M.; Weiss, M.S.
1977-01-01
We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables
Sticky continuous processes have consistent price systems
DEFF Research Database (Denmark)
Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan
Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...
Consistent-handed individuals are more authoritarian.
Lyle, Keith B; Grillo, Michael C
2014-01-01
Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.
Testing the visual consistency of web sites
van der Geest, Thea; Loorbach, N.R.
2005-01-01
Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to
Consistent spectroscopy for an extended gauge model
International Nuclear Information System (INIS)
Oliveira Neto, G. de.
1990-11-01
A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) group symmetry. By consistent spectroscopy is understood the determination of the quantum physical properties described by the model in a manner independent of the possible parametrizations adopted in their description. (L.C.J.A.)
Modeling and Testing Legacy Data Consistency Requirements
DEFF Research Database (Denmark)
Nytun, J. P.; Jensen, Christian Søndergaard
2003-01-01
An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers …
Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation
Lindell, Annukka K.
2017-01-01
Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora. PMID:28270790
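The consistency coefficient reported above (α = 0.72) is a Cronbach's alpha computed over repeated codings per participant. As an illustrative sketch of how such a coefficient is obtained (the function name and data layout are ours, not taken from the study's analysis code):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    using sample variances (ddof=1).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Each row: one participant's pose codings (e.g. 1 = left cheek, 0 = other)
# across repeated selfies. Perfectly consistent rows give alpha = 1.
data = [[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]]
print(cronbach_alpha(data))
```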
Self consistent field theory of virus assembly
Li, Siyu; Orland, Henri; Zandi, Roya
2018-04-01
The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions when GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work. First, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present our results corresponding to the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in case of GSDA but is equal to the inverse of the energy gap when using SCFT.
Consistency between GRUAN sondes, LBLRTM and IASI
Directory of Open Access Journals (Sweden)
X. Calbet
2017-06-01
Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.
Consistency in the World Wide Web
DEFF Research Database (Denmark)
Thomsen, Jakob Grauenkjær
Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how … the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.
Self-consistent areas law in QCD
International Nuclear Information System (INIS)
Makeenko, Yu.M.; Migdal, A.A.
1980-01-01
The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new manifestly gauge-invariant perturbation theory in the loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution.
Consistency of the MLE under mixture models
Chen, Jiahua
2016-01-01
The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...
Self-consistent calculation of atomic structure for mixture
International Nuclear Information System (INIS)
Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping
2000-01-01
Based on a relativistic Hartree-Fock-Slater self-consistent average atomic model, the atomic structure of mixtures is studied by summing up the component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed.
Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.
Edwards, H. P.; And Others
1982-01-01
Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)
Financial model calibration using consistency hints.
Abu-Mostafa, Y S
2001-01-01
We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
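A consistency hint of the kind described augments an ordinary curve-fitting error with a Kullback-Leibler penalty between a model-implied distribution and a target distribution. The sketch below is an illustrative reconstruction under stated assumptions: the function names and the simple additive weighting are ours, not the paper's EM-type algorithm or canonical-error balancing.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler distance KL(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def hinted_error(fit_residuals, model_dist, target_dist, weight=1.0):
    """Curve-fitting error augmented with a KL-based consistency hint.

    The hint pulls the calibrated parameters toward values whose implied
    distribution matches the target, turning curve fitting into calibration.
    """
    fit_error = float(np.mean(np.square(fit_residuals)))
    hint_error = kl_divergence(model_dist, target_dist)
    return fit_error + weight * hint_error
```

In a full calibration one would minimize `hinted_error` over the model parameters, with the hint weights balanced as the paper describes.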
Quasi-Particle Self-Consistent GW for Molecules.
Kaplan, F; Harding, M E; Seiler, C; Weigend, F; Evers, F; van Setten, M J
2016-06-14
We present the formalism and implementation of quasi-particle self-consistent GW (qsGW) and eigenvalue only quasi-particle self-consistent GW (evGW) adapted to standard quantum chemistry packages. Our implementation is benchmarked against high-level quantum chemistry computations (coupled-cluster theory) and experimental results using a representative set of molecules. Furthermore, we compare the qsGW approach for five molecules relevant for organic photovoltaics to self-consistent GW results (scGW) and analyze the effects of the self-consistency on the ground state density by comparing calculated dipole moments to their experimental values. We show that qsGW makes a significant improvement over conventional G0W0 and that partially self-consistent flavors (in particular evGW) can be excellent alternatives.
A methodology for the consistency analysis of regional energy consumption data
International Nuclear Information System (INIS)
Canavarros, Otacilio Borges; Silva, Ennio Peres da
1999-01-01
The article introduces a methodology for the consistency analysis of regional energy consumption data. The work is based on recent studies by several cited authors and addresses Brazilian energy matrices and regional energy balances. The results are compared and analyzed.
Proteolysis and consistency of Meshanger cheese
Jong, de L.
1978-01-01
Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of αs1-casein was proportional to the rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to the breakdown of
Developing consistent pronunciation models for phonemic variants
CSIR Research Space (South Africa)
Davel, M
2006-09-01
Pronunciation lexicons often contain pronunciation variants. This can create two problems: it can be difficult to define these variants in an internally consistent way, and it can also be difficult to extract generalised grapheme-to-phoneme rule sets…
Consistent Valuation across Curves Using Pricing Kernels
Directory of Open Access Journals (Sweden)
Andrea Macrina
2018-03-01
The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.
Consistent application of codes and standards
International Nuclear Information System (INIS)
Scott, M.A.
1989-01-01
The guidelines presented in the US Department of Energy General Design Criteria (DOE 6430.1A) and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well-defined approach to determining the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of load combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and the overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify a unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines
Consistency in multi-viewpoint architectural design
Dijkman, R.M.; Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.
Consistent Stochastic Modelling of Meteocean Design Parameters
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Sterndorff, M. J.
2000-01-01
Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...
On the existence of consistent price systems
DEFF Research Database (Denmark)
Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan
2014-01-01
We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...
Dynamic phonon exchange requires consistent dressing
International Nuclear Information System (INIS)
Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.
1976-01-01
It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies, and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful to introduce single-particle self-energy insertions in a consistent manner
Consistency of the postulates of special relativity
International Nuclear Information System (INIS)
Gron, O.; Nicola, M.
1976-01-01
In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated
Consistency of Network Traffic Repositories: An Overview
Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for
Consistency analysis of network traffic repositories
Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for
Student Effort, Consistency and Online Performance
Directory of Open Access Journals (Sweden)
Hilde Patron
2011-07-01
This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
Consistent thermodynamic properties of lipids systems
DEFF Research Database (Denmark)
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures … different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for the Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging the experimental databank of lipid systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based on the quality tests proposed for VLE data …
Consistency relation for cosmic magnetic fields
DEFF Research Database (Denmark)
Jain, R. K.; Sloth, M. S.
2012-01-01
If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.
Consistent Estimation of Partition Markov Models
Directory of Open Access Journals (Sweden)
Jesús E. García
2017-04-01
The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated? In order to answer these questions, we build a consistent strategy for model selection which consists of: given a size-n realization of the process, finding a model within the Partition Markov class with a minimal number of parts to represent the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian, then, eventually, when n goes to infinity, L will be retrieved. We show an application to modeling internet navigation patterns.
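The model-selection idea above, grouping states that share the same transition law, can be sketched as follows. This is an illustrative simplification, not the paper's estimator: a fixed total-variation threshold stands in for the paper's consistent selection criterion, and the function names are hypothetical.

```python
from collections import defaultdict

def transition_probs(seq, alphabet):
    """Empirical transition probabilities of a sequence over a finite alphabet."""
    counts = {a: defaultdict(int) for a in alphabet}
    for s, t in zip(seq, seq[1:]):
        counts[s][t] += 1
    probs = {}
    for a in alphabet:
        total = sum(counts[a].values())
        probs[a] = {b: (counts[a][b] / total if total else 0.0) for b in alphabet}
    return probs

def partition_states(probs, tol=0.05):
    """Greedily group states whose estimated transition rows are close.

    Two states land in the same part when the total-variation distance
    between their rows is at most `tol` (a stand-in for a consistent rule).
    """
    parts = []
    for state, row in probs.items():
        for part in parts:
            ref = probs[part[0]]
            tv = 0.5 * sum(abs(row[b] - ref[b]) for b in row)
            if tv <= tol:
                part.append(state)
                break
        else:
            parts.append([state])
    return parts

# States 0 and 1 both always move to 2, so they share a part; state 2 does not.
seq = [0, 2, 1, 2, 0, 2, 1, 2, 0]
print(partition_states(transition_probs(seq, [0, 1, 2])))
```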
Internal Branding and Employee Brand Consistent Behaviours
DEFF Research Database (Denmark)
Mazzei, Alessandra; Ravazzani, Silvia
2017-01-01
Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non… constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such a communication creates the organizational conditions adequate to sustain…
Self-consistent velocity dependent effective interactions
International Nuclear Information System (INIS)
Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.
1993-09-01
The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154Sm, of the first excited 2+ states of Sn isotopes and of the first excited 3- states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B(Eλ) values. (author)
Evaluating Temporal Consistency in Marine Biodiversity Hotspots
Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...
Cloud Standardization: Consistent Business Processes and Information
Directory of Open Access Journals (Sweden)
Razvan Daniel ZOTA
2013-01-01
Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Consistency Analysis of Nearest Subspace Classifier
Wang, Yi
2015-01-01
The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
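The NSS rule described above can be sketched under the usual PCA-subspace reading: estimate a low-dimensional subspace per class, then assign a point to the class with the smallest residual. The subspace dimension, helper names and toy data below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def fit_subspaces(X_by_class, dim):
    """Per-class orthonormal bases from PCA of mean-centered class samples."""
    bases = {}
    for label, X in X_by_class.items():
        mu = X.mean(axis=0)
        # Leading right-singular vectors span the estimated class subspace.
        _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
        bases[label] = (mu, Vt[:dim])
    return bases

def nss_predict(x, bases):
    """Assign x to the class whose affine subspace leaves the smallest residual."""
    best, best_err = None, np.inf
    for label, (mu, B) in bases.items():
        r = (x - mu) - B.T @ (B @ (x - mu))  # component orthogonal to the subspace
        err = np.linalg.norm(r)
        if err < best_err:
            best, best_err = label, err
    return best

rng = np.random.default_rng(0)
# Two classes living near different 1-D lines in the plane.
A = rng.normal(size=(100, 1)) @ np.array([[1.0, 0.1]])
B = rng.normal(size=(100, 1)) @ np.array([[0.1, 1.0]])
bases = fit_subspaces({"A": A, "B": B}, dim=1)
print(nss_predict(np.array([2.0, 0.2]), bases))
```

A test point lying along class A's line is matched to A because its residual against A's subspace is near zero while its residual against B's is large.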
Consistent probabilities in loop quantum cosmology
International Nuclear Information System (INIS)
Craig, David A; Singh, Parampreet
2013-01-01
A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)
Putting humans in ecology: consistency in science and management.
Hobbs, Larry; Fowler, Charles W
2008-03-01
Normal and abnormal levels of human participation in ecosystems can be revealed through the use of macro-ecological patterns. Such patterns also provide consistent and objective guidance that will lead to achieving and maintaining ecosystem health and sustainability. This paper focuses on the consistency of this type of guidance and management. Such management, in sharp contrast to current management practices, ensures that our actions as individuals, institutions, political groups, societies, and as a species are applied consistently across all temporal, spatial, and organizational scales. This approach supplants management of today, where inconsistency results from debate, politics, and legal and religious polarity. Consistency is achieved when human endeavors are guided by natural patterns. Pattern-based management meets long-standing demands for enlightened management that requires humans to participate in complex systems in consistent and sustainable ways.
Consistency of color representation in smart phones.
Dain, Stephen J; Kwan, Benjamin; Wong, Leslie
2016-03-01
One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers, especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone 5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones are white-LED backlit LCD and the Samsungs are OLED. The color gamut varies between models and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4s and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone 4s and ±0.002 for the others, although the spread of white points between models was u'v' ±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirrors the variation in the primaries. The variation in
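The u'v' coordinates quoted are CIE 1976 uniform-chromaticity-scale values, obtained from tristimulus values via u' = 4X/(X + 15Y + 3Z) and v' = 9Y/(X + 15Y + 3Z). A quick check (the D65 tristimulus values are standard, used here only as an example):

```python
def uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity from tristimulus values X, Y, Z."""
    d = X + 15 * Y + 3 * Z
    return 4 * X / d, 9 * Y / d

# D65 white point (X, Y, Z with Y normalized to 1).
u, v = uv_prime(0.9505, 1.0, 1.0890)
print(round(u, 4), round(v, 4))  # D65 is near u'=0.1978, v'=0.4683
```

On this scale a white-point spread of ±0.002 to ±0.004, as reported above, is a small fraction of the distance between display primaries.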
Dong, S.
2018-05-01
We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.
A method for consistent precision radiation therapy
International Nuclear Information System (INIS)
Leong, J.
1985-01-01
Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)
Gentzen's centenary the quest for consistency
Rathjen, Michael
2015-01-01
Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.
Two consistent calculations of the Weinberg angle
International Nuclear Information System (INIS)
Fairlie, D.B.
1979-01-01
The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea borrowed from monopole theory that the electromagnetic field is in the direction of the Higgs field. (Author)
Mantell, Joanne E.; Smit, Jennifer A.; Beksinska, Mags; Scorgie, Fiona; Milford, Cecilia; Balch, Erin; Mabude, Zonke; Smith, Emily; Adams-Skinner, Jessica; Exner, Theresa M.; Hoffman, Susie; Stein, Zena A.
2011-01-01
Young men in South Africa can play a critical role in preventing new human immunodeficiency virus (HIV) infections, yet are seldom targeted for HIV prevention. While reported condom use at last sex has increased considerably among young people, consistent condom use remains a challenge. In this study, 74 male higher education students gave their…
Cosmological consistency tests of gravity theory and cosmic acceleration
Ishak-Boushaki, Mustapha B.
2017-01-01
Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large scale structure, and the physical consistency of these probes with one another.
Consistent resolution of some relativistic quantum paradoxes
International Nuclear Information System (INIS)
Griffiths, Robert B.
2002-01-01
A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separated regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics.
Self-consistent model of confinement
International Nuclear Information System (INIS)
Swift, A.R.
1988-01-01
A model of the large-spatial-distance, zero-three-momentum limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency.
Subgame consistent cooperation a comprehensive treatise
Yeung, David W K
2016-01-01
Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions and the calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the agreed upon optimality principle at the beginning is maintained throughout the cooperation duration. It is due to the lack of this kind of guarantees that cooperative schemes fail to last till its end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this “classic” problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering the up-to-date state of the art analyses in this important topic. It sets out to provide the theory, solution tec...
Sludge characterization: the role of physical consistency
Energy Technology Data Exchange (ETDEWEB)
Spinosa, Ludovico; Wichmann, Knut
2003-07-01
Physical consistency is an important parameter in sewage sludge characterization as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed in sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviours (flowability) and that between solid and paste-like ones (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity ones. (author)
Analytical relativistic self-consistent-field calculations for atoms
International Nuclear Information System (INIS)
Barthelat, J.C.; Pelissier, M.; Durand, P.
1980-01-01
A new second-order representation of the Dirac equation is presented. This representation which is exact for a hydrogen atom is applied to approximate analytical self-consistent-field calculations for atoms. Results are given for the rare-gas atoms from helium to radon and for lead. The results compare favorably with numerical Dirac-Hartree-Fock solutions
Consistency of canonical formulation of Horava gravity
International Nuclear Information System (INIS)
Soo, Chopin
2011-01-01
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Self-consistent modelling of ICRH
International Nuclear Information System (INIS)
Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.
2001-01-01
The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)
Consistent evolution in a pedestrian flow
Guan, Junbiao; Wang, Kaihua
2016-03-01
In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states from 11 typical, differing initial conditions, which may be largely due to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who think rationally and independently, are two necessary factors for a short evacuation time.
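The snowdrift payoff structure that drives the cooperator/defector dynamics can be sketched as follows. The parameter values b and c are illustrative assumptions, not values taken from the paper:

```python
# Snowdrift game: benefit b exceeds cost c, and cooperators split the cost.
# Assumed illustrative values; the paper's parameterization may differ.
b, c = 1.0, 0.6

def payoff(me_coop, other_coop):
    """Payoff to 'me' for one pairwise snowdrift interaction."""
    if me_coop and other_coop:
        return b - c / 2      # both shovel, cost shared
    if me_coop and not other_coop:
        return b - c          # lone cooperator pays the full cost
    if not me_coop and other_coop:
        return b              # defector free-rides
    return 0.0                # both defect: stuck in the snowdrift

# Unlike the prisoner's dilemma, the best reply alternates:
# cooperate against a defector, defect against a cooperator,
# which sustains the mixed cooperator/defector states seen in the CA model.
assert payoff(True, False) > payoff(False, False)
assert payoff(False, True) > payoff(True, True)
```

This anti-coordination incentive is what lets cooperator and defector clusters coexist and settle into the consistent ratios reported above.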
Quasiparticle self-consistent GW method: a short summary
International Nuclear Information System (INIS)
Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios
2007-01-01
We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation. The method is formulated based on the idea of a self-consistent perturbation; the non-interacting Green function G0, which is the starting point for GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by GWA. After self-consistency is attained, we have G0, W (the screened Coulomb interaction) and G self-consistently. This G0 can be interpreted as the optimum non-interacting propagator for the quasiparticles. We will summarize some theoretical discussions to justify QSGW. Then we will survey results which have been obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic, and can be explained from the neglect of excitonic effects.
Self-consistent electrodynamic scattering in the symmetric Bragg case
International Nuclear Information System (INIS)
Campos, H.S.
1988-01-01
We have analyzed the symmetric Bragg case, introducing a model of self-consistent scattering for two elliptically polarized beams. The crystal is taken as a set of mathematical planes, each of them defined by a surface density of dipoles. We have treated the mesofield and the epifield differently from Ewald's theory, and we assumed a plane of dipoles and the associated fields as a self-consistent scattering unit. The exact analytical treatment, when applied to any two neighbouring planes, results in a general and self-consistent Bragg equation in terms of the amplitude and phase variations. The generalized solution for the set of N planes was obtained after introducing an absorption factor in the incident radiation, in two ways: (i) analytically, through a rule of field similarity, which says that incidence occurs on both faces of all the crystal planes, and through a matricial development with the Chebyshev polynomials; (ii) numerically, by calculating iteratively the reflectivity, the reflection phase, the transmissivity, the transmission phase and the energy. The results are shown through reflection and transmission curves, which are characteristic of both the kinematical and the dynamical theories. Conservation of energy, which results from Ewald's self-consistency principle, is used. In the absorption case, the results show that absorption is not the only cause of the asymmetric form of the reflection curves. The model contains the basic elements for a unified, microscopic, self-consistent, vectorial and exact formulation for interpreting X-ray diffraction in perfect crystals. (author)
An energetically consistent vertical mixing parameterization in CCSM4
DEFF Research Database (Denmark)
Nielsen, Søren Borg; Jochum, Markus; Eden, Carsten
2018-01-01
An energetically consistent stratification-dependent vertical mixing parameterization is implemented in the Community Climate System Model 4 and forced with energy conversion from the barotropic tides to internal waves. The structures of the resulting dissipation and diffusivity fields are compared… The ocean state, however, depends greatly on the details of the vertical mixing parameterizations, where the new energetically consistent parameterization results in low thermocline diffusivities and a sharper and shallower thermocline. It is also investigated whether the ocean state is more sensitive to a change in forcing…
Exploring the Consistent behavior of Information Services
Directory of Open Access Journals (Sweden)
Kapidakis Sarantos
2016-01-01
Computer services are normally assumed to work well all the time. This usually holds for crucial services like bank electronic services, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that help predict the consistency of their behavior and the quality of the harvesting, which is harder because of transient conditions, the many services involved and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others also fail, always or sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes in order to study their behavior in more detail.
[Consistent Declarative Memory with Depressive Symptomatology].
Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez
2012-12-01
Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The objective was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students, male and female, between 18 and 40 years old, were evaluated and distributed in two groups, with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) with a cut-off point of 20. There were no meaningful differences in free and voluntary recall between the groups with and without depressive symptomatology, even though both groups had granted a higher emotional value to the audio-visual test and had associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
Self-consistent nuclear energy systems
International Nuclear Information System (INIS)
Shimizu, A.; Fujiie, Y.
1995-01-01
A concept of self-consistent nuclear energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium elements through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium elements and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and for experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts, ''self-controllability'' and ''self-terminability'', are introduced to eliminate the criticality-related safety issues in fast reactors. (author)
Toward a consistent model for glass dissolution
International Nuclear Information System (INIS)
Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.
1994-01-01
Understanding of the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced, from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least 5 countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs
Dispersion sensitivity analysis & consistency improvement of APFSDS
Directory of Open Access Journals (Sweden)
Sangeeta Sharma Panda
2017-08-01
In-bore balloting motion simulation shows that a reduction in residual spin of about 5% results in a drastic 56% reduction in first maximum yaw. A correlation between first maximum yaw and residual spin is observed. Results of the data analysis are used in design modification of the existing ammunition. A number of designs were evaluated numerically before five designs were frozen for further study. These designs are critically assessed in terms of their comparative performance during the in-bore travel and external ballistics phases. Results are validated by free-flight trials of the finalised design.
Teferra, Alemayehu Shimeka; Alemu, Fekadu Mazengia; Woldeyohannes, Solomon Meseret
2012-07-31
Reduction of maternal mortality is a global priority, particularly in developing countries including Ethiopia, where the maternal mortality ratio is one of the highest in the world. The key to reducing the maternal mortality ratio and improving maternal health is increasing attendance by skilled health personnel throughout pregnancy and delivery. However, delivery service utilization is significantly lower in Amhara Regional State, Ethiopia. Therefore, this study aimed to assess factors affecting institutional delivery service utilization among mothers who gave birth in the last 12 months in Sekela District, Amhara Region, Ethiopia. A community-based cross-sectional study was conducted among mothers who had given birth in the last 12 months during August 2010. A multistage sampling technique was used to select 371 participants. A pre-tested, structured questionnaire was used to collect data. Bivariate and multivariate data analyses were performed using SPSS version 16.0 software. The study indicated that 12.1% of the mothers delivered in health facilities. Of the 87.9% of mothers who gave birth at home, 80.0% were assisted by family members and relatives. The common reasons for home delivery were closer attention from family members and relatives (60.9%), home delivery being the usual practice (57.7%), unexpected labour (33.4%), not being sick or having no problem at the time of delivery (21.6%) and family influence (14.4%). Being an urban resident (AOR [95% CI] = 4.6 [1.91, 10.9]), ANC visits during the last pregnancy (AOR [95% CI] = 4.26 [1.1, 16.4]), maternal education level (AOR [95% CI] = 11.98 [3.36, 41.4]) and knowledge of mothers on pregnancy and delivery services (AOR [95% CI] = 2.97 [1.1, 8.6]) had significant associations with institutional delivery service utilization. Very low institutional delivery service utilization was observed in the study area. The majority of the births at home were assisted by family members and relatives. ANC visits and lack of knowledge on pregnancy and delivery services were found to
View from Europe: stability, consistency or pragmatism
International Nuclear Information System (INIS)
Dunster, H.J.
1988-01-01
The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion
Self-consistent meson mass spectrum
International Nuclear Information System (INIS)
Balazs, L.A.P.
1982-01-01
A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula α(t) = max(S_1+S_2, S_3+S_4) - 1/2 + 2α̂'[s_a + (1/2)(t - Σ m_i²)] for any given hadronic process 1+2→3+4, where S_i and m_i are the spins and masses of i = 1,2,3,4, and √s_a is the effective mass of the lowest nonvanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector qq̄ states with q = u,d,s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u,d,c and q = u,d,b. Our only arbitrary parameters are m_ρ, m_K*, m_ψ, and m_Υ, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small m_π²/m_ρ² ratio arises quite naturally in the present scheme
Speed Consistency in the Smart Tachograph.
Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco
2018-05-16
In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging possible misconduct that entails fatigue and can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which could otherwise be manipulated to evade monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles in light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, to latencies between GNSS and odometry data, and to simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way. This makes fraud more difficult to implement than in the current version of the DT.
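As an illustration of the kind of cross-sensor check involved, the sketch below flags a motion conflict when GNSS and odometry speeds diverge on average; the thresholds, names and gap-handling rule are assumptions for illustration, not values from the ST regulation:

```python
import numpy as np

def speed_conflict(gnss_kmh, odo_kmh, tol_kmh=10.0, min_valid_frac=0.5):
    """Flag a motion conflict between GNSS and odometry speed series.

    Both series are sampled on a common time grid; NaN marks GNSS gaps
    (e.g. tunnels). If too few GNSS samples are valid, the test is
    declared inconclusive (returns False) rather than flagging fraud.
    """
    g = np.asarray(gnss_kmh, float)
    o = np.asarray(odo_kmh, float)
    valid = ~np.isnan(g)
    if valid.mean() < min_valid_frac:
        return False  # inconclusive: too many data gaps
    # Conflict if the mean absolute speed difference exceeds the tolerance
    return bool(np.abs(g[valid] - o[valid]).mean() > tol_kmh)

rng = np.random.default_rng(0)
gnss = 50.0 + rng.normal(0.0, 1.0, 60)   # true speed ~50 km/h, noisy GNSS
odo_honest = np.full(60, 50.0)           # odometer reporting truthfully
odo_scaled = odo_honest * 0.5            # simplistic manipulation: scaling
honest_flag = speed_conflict(gnss, odo_honest)   # no conflict expected
fraud_flag = speed_conflict(gnss, odo_scaled)    # conflict expected
```

The regulation's actual metrics and synchronization handling differ; the point is only that a coherent fraud must now manipulate both series at once to pass such a test.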
Autonomous Navigation with Constrained Consistency for C-Ranger
Directory of Open Access Journals (Sweden)
Shujing Zhang
2014-06-01
Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to achieve truly autonomous navigation. However, the consistency problem of the SLAM system has been largely ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which was developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
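The frozen-linearization-point idea can be sketched in a few lines; the helper below is illustrative (names and the period length are assumptions, not the authors' implementation):

```python
def linearization_points(estimates, period):
    """For each time step k, return the landmark estimate to linearize about.

    Local-consistency policy (sketch): within each local time period, the
    linearization point is frozen at the estimate available at the period's
    first step, instead of the latest estimate as in standard EKF-SLAM.
    """
    return [estimates[(k // period) * period] for k in range(len(estimates))]

# Six successive landmark estimates, local periods of length 3:
# standard EKF would linearize about [0, 1, 2, 3, 4, 5],
# the LC policy freezes to [0, 0, 0, 3, 3, 3]
points = linearization_points([0, 1, 2, 3, 4, 5], period=3)
```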
High-performance speech recognition using consistency modeling
Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth
1994-12-01
The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems arising when there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.
Application of consistent fluid added mass matrix to core seismic
International Nuclear Information System (INIS)
Koo, K. H.; Lee, J. H.
2003-01-01
In this paper, the application algorithm of a consistent fluid added mass matrix, including the coupling terms, to core seismic analysis is developed and installed in the SAC-CORE3.0 code. As an example, we assumed a 7-hexagon system of the LMR core and carried out a vibration modal analysis and a nonlinear time-history seismic response analysis using SAC-CORE3.0. The consistent fluid added mass matrix used is obtained with the finite element program of the FAMD (Fluid Added Mass and Damping) code. The results of the vibration modal analysis show that the core duct assemblies reveal strongly coupled vibration modes, which differ markedly from the in-air case. From the results of the time-history seismic analysis, it was verified that the effects of the coupling terms of the consistent fluid added mass matrix are significant in the impact responses and the dynamic responses
Self-consistency and coherent effects in nonlinear resonances
International Nuclear Information System (INIS)
Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.
2003-01-01
The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping
Cognitive consistency and math-gender stereotypes in Singaporean children.
Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu
2014-01-01
In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.
Consistent Price Systems in Multiasset Markets
Directory of Open Access Journals (Sweden)
Florian Maris
2012-01-01
markets with proportional transaction costs as discussed in the recent paper by Guasoni et al. (2008), where the CFS property is introduced and shown sufficient for CPSs for processes with a certain state space. The current paper extends the results in the work of Guasoni et al. (2008) to processes with a more general state space.
Using the Milieu: Treatment-Environment Consistency.
Szekais, Barbara
1985-01-01
Describes trial use of milieu and activity-based therapy in two adult day centers to increase client involvement in physical and social environments of treatment settings. Reports results from empirical observations and recommends further investigation of this treatment modality in settings for the elderly. (Author/NRB)
Consistency of the Takens estimator for the correlation dimension
Borovkova, S.; Burton, Robert; Dehling, H.
Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We
Brief Report: Consistency of Search Engine Rankings for Autism Websites
Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.
2012-01-01
The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
Massively parallel self-consistent-field calculations
International Nuclear Information System (INIS)
Tilson, J.L.
1994-01-01
The advent of supercomputers with many computational nodes, each with its own independent memory, makes possible extremely fast computations. The author's work, as part of the US High Performance Computing and Communications Program (HPCCP), is focused on the development of electronic structure techniques for the solution of Grand Challenge-size molecules containing hundreds of atoms. This effort has resulted in a fully scalable Direct-SCF program that is portable and efficient. This code, named NWCHEM, is built around a distributed-data model. The distributed data are managed by a software package called Global Arrays, developed within the HPCCP. Performance results are presented for Direct-SCF calculations of interest to the consortium
Valence nucleons in self-consistent fields
International Nuclear Information System (INIS)
Di Toro, M.; Lomnitz-Adler, J.
1978-01-01
An iterative approach to determine directly the best Hartree-Fock one-body density ρ is extended by expressing ρ in terms of a core and a valence part and allowing for general crossings of occupied and unoccupied levels in the valence part. Results are shown for ¹⁵²Sm and a microscopic analysis of the core structure of deformed light nuclei is carried out. (author)
Gender wage gap studies : consistency and decomposition
Kunze, Astrid
2006-01-01
This paper reviews the empirical literature on the gender wage gap, with particular attention given to the identification of the key parameters in human capital wage regression models. This is of great importance in the literature for two main reasons. First, the main explanatory variables in the wage model, i.e., measures of work experience and the time-out-of-work, are endogenous. As a result, applying traditional estimators may lead to inconsistent parameter estimates. Secon...
Self-consistent study of localization
International Nuclear Information System (INIS)
Brezini, A.; Olivier, G.
1981-08-01
The localization models of Abou-Chacra et al. and Kumar et al. are critically re-examined in the limit of weak disorder. By using an improved method of approximation, we have studied the displacement of the band edge and the mobility edge as a function of disorder, and compared the results of Abou-Chacra et al. and Kumar et al. in the light of the present approximation. (author)
Merging By Decentralized Eventual Consistency Algorithms
Directory of Open Access Journals (Sweden)
Ahmed-Nacer Mehdi
2015-12-01
Full Text Available The merge mechanism is an essential operation for version control systems. When each member of a collaborative development works on an individual copy of the project, software merging allows modifications made concurrently to be reconciled, as well as managing software change through branching. The collaborative system is in charge of proposing a merge result that includes the users' modifications. The users then have to check and adapt this result. The adaptation should be as effortless as possible; otherwise, the users may get frustrated and quit the collaboration. This paper aims to reduce conflicts during collaboration and improve productivity. It has three objectives: study the users' behavior during collaboration, evaluate the quality of textual merge results produced by specific algorithms, and propose a solution to improve the result quality produced by the default merge tool of distributed version control systems. Through a study of eight open-source repositories totaling more than 3 million lines of code, we observe the behavior of concurrent modifications during the merge procedure. We identify when the existing merge techniques under-perform, and we propose solutions to improve the quality of the merge. We finally compare with the traditional merge tool through a large corpus of collaborative editing.
An Explicit Consistent Geometric Stiffness Matrix for the DKT Element
Directory of Open Access Journals (Sweden)
Eliseu Lucena Neto
Full Text Available Abstract A large number of references dealing with the geometric stiffness matrix of the DKT finite element exist in the literature, nearly all of which adopt an inconsistent form. While such a matrix may be part of the element to treat nonlinear problems in general, it is of crucial importance for linearized buckling analysis. The present work seems to be the first to obtain an explicit expression for this matrix in a consistent way. Numerical results on linear buckling of plates assess the element performance either with the proposed explicit consistent matrix or with the most commonly used inconsistent matrix.
Quantum consistency of open string theories
International Nuclear Information System (INIS)
Govaerts, J.
1989-01-01
We discuss how Virasoro anomalies in open string theories uniquely select the gauge group SO(2^(D/2)) independently of any regularisation, although the cancellation of these anomalies does not occur in tachyonic theories, and regulators can always be chosen to make these theories (one-loop) finite for any SO(n) and USp(n) gauge group. The discussion is mainly restricted to open bosonic strings. These results open new perspectives for the recent suggestion made by Sagnotti, the generalisations of which allow for the construction of new open string theories in less than ten dimensions. (orig.)
Assessing atmospheric bias correction for dynamical consistency using potential vorticity
International Nuclear Information System (INIS)
Rocheta, Eytan; Sharma, Ashish; Evans, Jason P
2014-01-01
Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases, as dynamical consistency between the 'corrected' fields is not maintained. Use of these bias-corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency: the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques: an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exist between the multiple variables being corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias-correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, errors which are reduced when using the proposed alternative. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to the need for a dynamically consistent atmospheric bias correction technique which yields fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications. (letter)
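A minimal univariate correction of the kind the first approach applies to each field separately can be sketched as follows; this is an illustrative mean-and-variance scaling, not the paper's exact scheme, and applied per variable it leaves inter-variable relationships, and hence PV, uncontrolled:

```python
import numpy as np

def mean_variance_correct(model, obs):
    """Rescale a model time series so its mean and standard deviation
    match observations. Applied independently to each atmospheric
    variable, this is exactly the kind of per-variable correction that
    does not preserve dynamical consistency between fields."""
    m = np.asarray(model, float)
    o = np.asarray(obs, float)
    return (m - m.mean()) * (o.std() / m.std()) + o.mean()

# Model series with a warm bias and inflated variance vs. observations
obs = np.array([1.0, 2.0, 3.0, 4.0])
model = np.array([12.0, 14.0, 16.0, 18.0])
corrected = mean_variance_correct(model, obs)
# corrected now has the observed mean and standard deviation
```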
Validation of consistency of Mendelian sampling variance.
Tyrisevä, A-M; Fikse, W F; Mäntysaari, E A; Jakobsen, J; Aamand, G P; Dürr, J; Lidauer, M H
2018-03-01
Experiences from international sire evaluation indicate that the multiple-trait across-country evaluation method is sensitive to changes in genetic variance over time. Top bulls from birth year classes with inflated genetic variance will benefit, hampering reliable ranking of bulls. However, none of the methods available today enable countries to validate their national evaluation models for heterogeneity of genetic variance. We describe a new validation method to fill this gap comprising the following steps: estimating within-year genetic variances using Mendelian sampling and its prediction error variance, fitting a weighted linear regression between the estimates and the years under study, identifying possible outliers, and defining a 95% empirical confidence interval for a possible trend in the estimates. We tested the specificity and sensitivity of the proposed validation method with simulated data using a real data structure. Moderate (M) and small (S) size populations were simulated under 3 scenarios: a control with homogeneous variance and 2 scenarios with yearly increases in phenotypic variance of 2 and 10%, respectively. Results showed that the new method was able to estimate genetic variance accurately enough to detect bias in genetic variance. Under the control scenario, the trend in genetic variance was practically zero in setting M. Testing cows with an average birth year class size of more than 43,000 in setting M showed that tolerance values are needed for both the trend and the outlier tests to detect only cases with a practical effect in larger data sets. Regardless of the magnitude (yearly increases in phenotypic variance of 2 or 10%) of the generated trend, it deviated statistically significantly from zero in all data replicates for both cows and bulls in setting M. In setting S with a mean of 27 bulls in a year class, the sampling error and thus the probability of a false-positive result clearly increased. Still, overall estimated genetic
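The core of the validation, a weighted linear regression of within-year variance estimates on year, can be sketched as follows (illustrative names and weights; the published method adds the outlier test and an empirical 95% confidence interval for the trend):

```python
import numpy as np

def variance_trend(years, var_estimates, weights):
    """Weighted least-squares slope of genetic-variance estimates on year.

    weights: e.g. inverse prediction-error variances of the estimates.
    A slope deviating from zero signals heterogeneity of genetic
    variance over time.
    """
    x = np.asarray(years, float)
    y = np.asarray(var_estimates, float)
    w = np.asarray(weights, float)
    X = np.column_stack([np.ones_like(x), x])
    # Solve the weighted normal equations (X'WX) b = X'Wy
    beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
    return beta[1]

years = np.arange(2000, 2010)
w = np.ones(10)
flat = np.full(10, 1.0)                  # homogeneous variance (control)
rising = 1.0 + 0.02 * (years - 2000)     # ~2% yearly increase
slope_flat = variance_trend(years, flat, w)
slope_rising = variance_trend(years, rising, w)
```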
Consistent Partial Least Squares Path Modeling via Regularization.
Jung, Sunho; Park, JaeHong
2018-01-01
Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that regularized PLSc is recommended for use when serious multicollinearity is present.
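The ridge idea itself is simple to illustrate: add a small multiple of the identity to the (possibly near-singular) correlation matrix of the predictors before solving for coefficients. The sketch below is a generic ridge estimate from correlations, not the authors' PLSc algorithm; names and values are illustrative:

```python
import numpy as np

def ridge_coefficients(R_xx, r_xy, lam):
    """Ridge-type coefficient estimate from correlations:
    b = (R_xx + lam*I)^(-1) r_xy.  With lam > 0 the system stays
    well-conditioned even when predictors are nearly collinear."""
    R_xx = np.asarray(R_xx, float)
    r_xy = np.asarray(r_xy, float)
    return np.linalg.solve(R_xx + lam * np.eye(R_xx.shape[0]), r_xy)

# Two nearly collinear latent predictors, equally correlated with the outcome
R = np.array([[1.0, 0.999],
              [0.999, 1.0]])
r = np.array([0.5, 0.5])
b = ridge_coefficients(R, r, lam=0.1)   # symmetric problem: b[0] == b[1]
```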
Linear augmented plane wave method for self-consistent calculations
International Nuclear Information System (INIS)
Takeda, T.; Kuebler, J.
1979-01-01
O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)
Martial arts striking hand peak acceleration, accuracy and consistency.
Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A
2013-01-01
The goal of this paper was to investigate the possible trade-off between peak hand acceleration and the accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum-effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
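The accuracy and consistency measures described above are straightforward to compute; a minimal sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def strike_metrics(points, target):
    """Accuracy and consistency of a set of goal-directed strikes.

    points: (n, 2) strike impact coordinates; target: (2,) aim point.
    Accuracy    = radial distance from the strikes' centroid to the target.
    Consistency = sqrt of the mean squared distance of strikes from
                  their centroid. Lower is better for both.
    """
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    accuracy = np.linalg.norm(centroid - np.asarray(target, dtype=float))
    consistency = np.sqrt(((points - centroid) ** 2).sum(axis=1).mean())
    return accuracy, consistency

# Four strikes scattered around (1, 0), aimed at the origin
pts = [(1, 1), (1, -1), (2, 0), (0, 0)]
acc, con = strike_metrics(pts, (0, 0))
```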
Quantification of intraventricular hemorrhage is consistent using a spherical sensitivity matrix
Tang, Te; Sadleir, Rosalind
2010-04-01
We have developed a robust current pattern for detection of intraventricular hemorrhage (IVH). In this study, the current pattern was applied on two realistic shaped neonatal head models and one head-shaped phantom. We found that a sensitivity matrix calculated from a spherical model gave us satisfactory reconstructions in terms of both image quality and quantification. Incorporating correct geometry information into the forward model improved image quality. However, it did not improve quantification accuracy. The results indicate that using a spherical matrix may be a more practical choice for monitoring IVH volumes in neonates.
Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)
Stadje, M.A.; Pelsser, A.
2014-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from
Thermodynamically consistent data-driven computational mechanics
González, David; Chinesta, Francisco; Cueto, Elías
2018-05-01
In the paradigm of data-intensive science, automated, unsupervised discovery of governing equations for a given physical phenomenon has attracted a lot of attention in several branches of the applied sciences. In this work, we propose a method able to avoid the identification of the constitutive equations of complex systems and rather work in a purely numerical manner by employing experimental data. In sharp contrast to most existing techniques, this method does not rely on the assumption of any particular form for the model (other than some fundamental restrictions placed by classical physics, such as the second law of thermodynamics), nor does it force the algorithm to find, among a predefined set of operators, those whose predictions best fit the available data. Instead, the method is able to identify both the Hamiltonian (conservative) and dissipative parts of the dynamics while satisfying fundamental laws such as energy conservation or positive production of entropy. The proposed method is tested against some examples of discrete as well as continuum mechanics, whose accurate results demonstrate the validity of the proposed approach.
Self-consistent studies of magnetic thin film Ni (001)
International Nuclear Information System (INIS)
Wang, C.S.; Freeman, A.J.
1979-01-01
Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions, since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional (LSDF) formalism is lacking. We discuss our recent extension of the LSDF approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical-basis-set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations
Diagnosing a Strong-Fault Model by Conflict and Consistency
Directory of Open Access Journals (Sweden)
Wenfeng Zhang
2018-03-01
The diagnosis of weak-fault models, in which only the normal behavior of each component is modeled, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. A strong-fault model is difficult to diagnose because of its non-monotonicity. Current diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, since such consistency indicates probably-normal components. This paper addresses the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. First, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). The proposed LTMS then reasons over the CNF to find multiple minimal conflicts and maximal consistencies when a fault is present. The search approaches use the reasoning result to propose the best candidates efficiently until the diagnosis results are obtained. The completeness, coverage, correctness, and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods significantly outperform best-first and conflict-directed A* search methods.
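The conflict-based reasoning described above can be illustrated on a toy circuit. The sketch below uses hypothetical components and stuck-at fault modes, not the paper's spacecraft model or its LTMS implementation; it finds subset-minimal conflicts, i.e. component sets that cannot all be healthy given an observation.

```python
from itertools import combinations, product

COMPONENTS = ["inv1", "inv2", "and1"]

def predict(ok, a, b):
    """Strong-fault model of two inverters feeding an AND gate: each faulty
    component has a *specific* behavior (assumed here: inverters stuck at 0,
    the AND gate stuck at 1), not merely 'not normal'."""
    x = (not a) if ok["inv1"] else False
    y = (not b) if ok["inv2"] else False
    return (x and y) if ok["and1"] else True

def some_model_consistent(healthy, obs):
    """True if ANY health assignment keeping `healthy` all-normal
    reproduces the observation (a, b, out)."""
    a, b, out = obs
    rest = [c for c in COMPONENTS if c not in healthy]
    for bits in product([True, False], repeat=len(rest)):
        ok = {c: True for c in healthy}
        ok.update(dict(zip(rest, bits)))
        if predict(ok, a, b) == out:
            return True
    return False

def minimal_conflicts(obs):
    """A conflict is a component set that cannot all be healthy given obs;
    only subset-minimal conflicts are kept (cf. conflict-directed search)."""
    conflicts = []
    for r in range(1, len(COMPONENTS) + 1):
        for s in combinations(COMPONENTS, r):
            s = frozenset(s)
            if any(c <= s for c in conflicts):
                continue  # a subset is already a conflict, so s is not minimal
            if not some_model_consistent(s, obs):
                conflicts.append(s)
    return conflicts
```

For the observation a=False, b=True, out=True, the only minimal conflict is the AND gate itself, so any diagnosis must declare it faulty; a full LTMS additionally exploits maximal consistencies to exonerate components.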
Diagnosing a Strong-Fault Model by Conflict and Consistency.
Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan
2018-03-29
Kutepov, A L
2015-08-12
Self-consistent solutions of Hedin's equations (HE) for the two-site Hubbard model (HM) have been studied. They have been found for three-point vertices of increasing complexity: Γ = 1 (the GW approximation), Γ1 from first-order perturbation theory, and the exact vertex Γ(E). A comparison is made between the case in which an additional quasiparticle (QP) approximation for the Green's functions is applied during the self-consistent iterative solution of HE and the case in which it is not. The results obtained with the exact vertex bear directly on the open question of which approximation is more advantageous for future implementations, GW + DMFT or QPGW + DMFT. It is shown that in a regime of strong correlations only the originally proposed GW + DMFT scheme is able to provide reliable results. Vertex corrections based on perturbation theory (PT) systematically improve the GW results when full self-consistency is applied. Applying QP self-consistency combined with PT vertex corrections shows problems similar to those that arise when the exact vertex is combined with QP self-consistency. An analysis of Ward identity violation is performed for all approximations studied in this work, and its relation to the overall accuracy of the schemes is discussed.
The numerical multiconfiguration self-consistent field approach for atoms
International Nuclear Information System (INIS)
Stiehler, Johannes
1995-12-01
The dissertation uses the multiconfiguration self-consistent field approach to determine the electronic wave function of N-electron atoms in a static electric field. It presents numerical approaches to describe the wave functions and introduces new methods to solve the numerical Fock equations. Based on results computed with the implemented computer program, the universal applicability, flexibility, and high numerical precision of the approach are demonstrated. RHF results and, for the first time, MCSCF results for the polarizabilities and hyperpolarizabilities of various states of the atoms He to Kr are discussed. In addition, an application to the interpretation of a plasma spectrum of gallium is presented. (orig.)
Consistent Conformal Extensions of the Standard Model
Loebbert, Florian; Plefka, Jan
The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently attracted renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_\text{s} \sim g^2_\text{g}$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge-invariant results require the consistent scaling $\lambda_\text{s} \sim g^4_\text{g}$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.
On the consistent effect histories approach to quantum mechanics
International Nuclear Information System (INIS)
Rudolph, O.
1996-01-01
A formulation of the consistent histories approach to quantum mechanics in terms of generalized observables (POV measures) and effect operators is provided. The usual notion of "history" is generalized to the notion of "effect history". The space of effect histories carries the structure of a D-poset. Recent results of J. D. Maitland Wright imply that every decoherence functional defined for ordinary histories can be uniquely extended to a bi-additive decoherence functional on the space of effect histories. Omnès' logical interpretation is generalized to the present context. The result of this work considerably generalizes and simplifies the earlier formulation of the consistent effect histories approach to quantum mechanics communicated in a previous work of this author. Copyright 1996 American Institute of Physics.
Consistency of Pay-For-Performance Results Across a Geographically Dispersed Command
2010-04-01
Herzberg's two-factor theory addresses employee motivation; an employee's portion of the psychological contract typically includes satisfaction factors. The motivation-hygiene theory separates the factors that can affect employee morale into two distinct categories: one set of factors can promote job satisfaction, while the other can only prevent dissatisfaction.
Do Different Tests of Episodic Memory Produce Consistent Results in Human Adults?
Cheke, Lucy G.; Clayton, Nicola S.
2013-01-01
A number of different philosophical, theoretical, and empirical perspectives on episodic memory have led to the development of very different tests with which to assess it. Although these tests putatively assess the same psychological capacity, they have rarely been directly compared. Here, a sample of undergraduates was tested on three different…
An algebraic method for constructing stable and consistent autoregressive filters
International Nuclear Information System (INIS)
Harlim, John; Hong, Hoon; Robbins, Jacob L.
2015-01-01
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams-Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods; it takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden-Julian Oscillation, a dominant tropical atmospheric wave pattern.
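The stability and consistency notions above can be made concrete on a scalar test problem. The sketch below (an assumed linear test equation dx/dt = λx, not the paper's algebraic construction) builds AR(2) coefficients from the two-step Adams-Bashforth discretization and checks the classical stability condition:

```python
import numpy as np

def ab2_ar_coeffs(lam, dt):
    """AR(2) coefficients from discretizing dx/dt = lam*x with the two-step
    Adams-Bashforth scheme (the order-two consistency constraint):
    x_{n+1} = x_n + dt * (3/2 * lam * x_n - 1/2 * lam * x_{n-1})."""
    a1 = 1.0 + 1.5 * lam * dt
    a2 = -0.5 * lam * dt
    return a1, a2

def is_stable(a1, a2):
    """Classical AR(2) stability: both roots of z^2 - a1*z - a2 = 0
    must lie strictly inside the unit circle."""
    roots = np.roots([1.0, -a1, -a2])
    return bool(np.all(np.abs(roots) < 1.0))
```

For a decaying signal (λ = -1), a small step such as dt = 0.1 yields a stable, consistent AR(2) model, while a large step such as dt = 3 violates stability, illustrating why the admissible discretization interval matters.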
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available, so that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations exist in R and may be used for applications. PMID:21915433
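The core idea, that a consistent nonparametric regression of a 0/1 response estimates the conditional probability, can be sketched with a k-nearest-neighbor average (a hypothetical toy on synthetic data, not the paper's R packages):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary data with known truth: P(Y=1 | X=x) = x, x ~ Uniform(0, 1).
n = 5000
x = rng.uniform(0.0, 1.0, n)
y = (rng.uniform(0.0, 1.0, n) < x).astype(float)

def knn_probability(x0, x, y, k):
    """Nonparametric regression of the 0/1 response: the k-NN average of y
    estimates P(Y=1 | X=x0). Such learners are consistent as n grows with
    k -> infinity and k/n -> 0."""
    idx = np.argsort(np.abs(x - x0))[:k]
    return float(y[idx].mean())

p_hat = knn_probability(0.7, x, y, k=500)  # should be close to the true 0.7
```

The same recipe works with any consistent regression learner, which is the sense in which a random forest run in regression mode becomes a "probability machine".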
Robust Visual Tracking Via Consistent Low-Rank Sparse Learning
Zhang, Tianzhu
2014-06-19
Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.
CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.
Shalizi, Cosma Rohilla; Rinaldo, Alessandro
2013-04-01
The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consist only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.
Conformal consistency relations for single-field inflation
International Nuclear Information System (INIS)
Creminelli, Paolo; Noreña, Jorge; Simonović, Marko
2012-01-01
We generalize the single-field consistency relations to capture not only the leading term in the squeezed limit, going as 1/q^3 (where q is the small wavevector), but also the subleading one, going as 1/q^2. For an (n+1)-point function, this term is fixed in terms of the variation of the n-point function under a special conformal transformation; this parallels the fact that the 1/q^3 term is related to the scale dependence of the n-point function. For the squeezed limit of the 3-point function, this conformal consistency relation implies that there are no terms going as 1/q^2. We verify that the squeezed limit of the 4-point function is related to the conformal variation of the 3-point function, both in the case of canonical slow-roll inflation and in models with a reduced speed of sound. In the second case the conformal consistency conditions capture, at the level of observables, the relation among operators induced by the non-linear realization of Lorentz invariance in the Lagrangian. These results mean that, in any single-field model, primordial correlation functions of ζ are endowed with an SO(4,1) symmetry, with dilations and special conformal transformations non-linearly realized by ζ. We also verify the conformal consistency relations for any n-point function in models with a modulation of the inflaton potential, where the scale dependence is not negligible. Finally, we generalize some of the consistency relations involving tensors and soft internal momenta.
Self-consistent equilibria in the pulsar magnetosphere
International Nuclear Information System (INIS)
Endean, V.G.
1976-01-01
For a 'collisionless' pulsar magnetosphere the self-consistent equilibrium particle distribution functions are functions of the constants of the motion only. Reasons are given for concluding that to a good approximation they will be functions of the rotating-frame Hamiltonian only. This is shown to result in a rigid rotation of the plasma, which therefore becomes trapped inside the velocity-of-light cylinder. The self-consistent field equations are derived, and a method of solving them is illustrated. The axial component of the magnetic field decays to zero at the plasma boundary. In practice, some streaming of particles into the wind zone may occur as a second-order effect. Acceleration of such particles to very high energies is expected when they approach the velocity-of-light cylinder, but they cannot be accelerated to very high energies near the star. (author)
Consistent creep and rupture properties for creep-fatigue evaluation
International Nuclear Information System (INIS)
Schultz, C.C.
1978-01-01
The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide adequate predictions. The viability of using consistent properties (either actual or those of a minimum heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations
Consistent creep and rupture properties for creep-fatigue evaluation
International Nuclear Information System (INIS)
Schultz, C.C.
1979-01-01
The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide reasonable predictions. The viability of using consistent properties (either actual or those of a minimum strength heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations. 12 refs
EVALUATION OF CONSISTENCY AND SETTING TIME OF IRANIAN DENTAL STONES
Directory of Open Access Journals (Sweden)
F GOL BIDI
2000-09-01
Introduction: Dental stones are widely used in dentistry, and the success or failure of many dental treatments depends on the accuracy of these gypsum products. The purpose of this study was the evaluation of Iranian dental stones and a comparison between Iranian and foreign ones. In this investigation, consistency and setting time were compared among Pars Dandan, Almas, and Hinrizit stones; the latter is accepted by the ADA (American Dental Association). Consistency and setting time are two of the five properties required by both ADA specification No. 25 and Iranian Standard Organization specification No. 2569 for the evaluation of dental stones. Methods: The number and preparation of specimens and the test conditions followed ADA specification No. 25, and all measurements were made with a Vicat apparatus. Results: The standard consistency of Almas stone was obtained with 42 ml water per 100 g powder, and the setting time of this stone was 11 ± 0.03 min, which was within the limits of the ADA specification (12 ± 4 min). The standard consistency of Pars Dandan stone was obtained with 31 ml water per 100 g powder, but the setting time of this stone was 5 ± 0.16 min, which was not within the limits of the ADA specification. Discussion: Comparison of the properties of the Iranian and Hinrizit stones suggests two probable problems with the Iranian stones: (1) inhomogeneity of the Iranian stone powder, caused by uncontrolled temperature, pressure, and humidity in the production process; and (2) impurities such as sodium chloride, responsible for the shortening of Pars Dandan's setting time.
Mantell, Joanne E.; Smit, Jennifer A.; Beksinska, Mags; Scorgie, Fiona; Milford, Cecilia; Balch, Erin; Mabude, Zonke; Smith, Emily; Adams-Skinner, Jessica; Exner, Theresa M.; Hoffman, Susie; Stein, Zena A.
2011-01-01
Young men in South Africa can play a critical role in preventing new human immunodeficiency virus (HIV) infections, yet are seldom targeted for HIV prevention. While reported condom use at last sex has increased considerably among young people, consistent condom use remains a challenge. In this study, 74 male higher education students gave their perspectives on male and female condoms in 10 focus group discussions. All believed that condoms should be used when wanting to prevent conception an...
Consistent Partial Least Squares Path Modeling via Regularization
Directory of Open Access Journals (Sweden)
Sunho Jung
2018-02-01
Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research for its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it estimates path coefficients from consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc against its non-regularized counterpart in terms of power and accuracy. The results show that regularized PLSc is recommended for use when serious multicollinearity is present.
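The ridge idea can be sketched on the normal equations built from latent correlations (hypothetical correlation values; a simplification of PLSc, which involves additional correction steps):

```python
import numpy as np

def ridge_path_coefficients(R_xx, r_xy, ridge):
    """Solve the normal equations built from (assumed consistent) latent
    correlations, with a ridge penalty added to tame multicollinearity:
    b = (R_xx + ridge * I)^(-1) r_xy."""
    p = R_xx.shape[0]
    return np.linalg.solve(R_xx + ridge * np.eye(p), r_xy)

# Two nearly collinear latent predictors of one outcome (hypothetical values).
R_xx = np.array([[1.00, 0.95],
                 [0.95, 1.00]])
r_xy = np.array([0.80, 0.78])

b_ols = ridge_path_coefficients(R_xx, r_xy, 0.0)  # unstable under collinearity
b_reg = ridge_path_coefficients(R_xx, r_xy, 0.1)  # shrunken, more stable
```

With a small ridge value the two coefficients are pulled toward each other and the overall norm shrinks, which is the stabilizing effect the simulation study evaluates.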
Context-specific metabolic networks are consistent with experiments.
Directory of Open Access Journals (Sweden)
Scott A Becker
2008-05-01
Reconstructions of cellular metabolism are publicly available for a variety of microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where expression profiling data are available.
The study of consistent properties of gelatinous shampoo with minoxidil
Directory of Open Access Journals (Sweden)
I. V. Gnitko
2016-04-01
The aim of this work is the study of the consistency properties of a gelatinous shampoo with 1% minoxidil for the complex therapy and prevention of alopecia. The shampoo formulation was selected on the basis of complex physical-chemical, biopharmaceutical, and microbiological investigations. Methods and results: It has been established that the consistency properties of the gelatinous 1% minoxidil shampoo and its "mechanical stability" (1.70) characterize the formulation as an exceptionally thixotropic composition capable of recovering after mechanical loads; this also allows the stability of the consistency properties during long storage to be predicted. Conclusion: The coefficients of dynamic flow for the foam detergent gel with minoxidil (Kd1 = 38.9%; Kd2 = 78.06%) quantitatively confirm a sufficient degree of distribution when the composition is spread on the skin of the scalp or during the technological operations of manufacturing. The insignificant difference in "mechanical stability" between the gelatinous 1% minoxidil shampoo and its base indicates the absence of interactions between the active substance and the base.
Consistent and efficient processing of ADCP streamflow measurements
Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan
2016-01-01
The use of acoustic Doppler current profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers, and these differences can result in different discharges being computed from identical data. Consistent computational algorithms, automated filtering, and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer are being developed in a software program that can process ADCP moving-boat discharge measurements independent of the ADCP used to collect the data.
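As one example of the kind of computation being standardized, edge discharge for moving-boat ADCP measurements is commonly estimated with a shape-coefficient formula (a sketch; the coefficient values shown are the ones commonly quoted for triangular and rectangular banks, not necessarily those of any particular manufacturer's software):

```python
def edge_discharge(mean_velocity, edge_distance, edge_depth, shape="triangular"):
    """Edge-discharge estimate for the unmeasured region near a bank:
    Q_edge = C * Vm * L * Dm, where Vm is the mean water velocity in the
    edge ensembles, L the distance to the bank, Dm the depth at the edge,
    and C a bank-shape coefficient."""
    coeff = {"triangular": 0.3535, "rectangular": 0.91}[shape]
    return coeff * mean_velocity * edge_distance * edge_depth
```

Because each manufacturer may apply such formulas (and invalid-data filters) slightly differently, identical raw data can yield different total discharges, which is the motivation for the manufacturer-independent processing described above.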
Designing apps for success developing consistent app design practices
David, Matthew
2014-01-01
In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently
Two-particle self-consistent approach to unconventional superconductivity
Energy Technology Data Exchange (ETDEWEB)
Otsuki, Junya [Department of Physics, Tohoku University, Sendai (Japan); Theoretische Physik III, Zentrum fuer Elektronische Korrelationen und Magnetismus, Universitaet Augsburg (Germany)
2013-07-01
A non-perturbative approach to unconventional superconductivity is developed based on the idea of the two-particle self-consistent (TPSC) theory. An exact sum rule which the momentum-dependent pairing susceptibility satisfies is derived. Effective pairing interactions between quasiparticles are determined so that an approximate susceptibility fulfills this sum rule, in which fluctuations belonging to different symmetries mix at finite momentum. The mixing leads to a suppression of the d_{x^2-y^2} pairing close to half-filling, resulting in a maximum of T_c away from half-filling.
Self-consistent calculation of 208Pb spectrum
International Nuclear Information System (INIS)
Pal'chik, V.V.; Pyatov, N.I.; Fayans, S.A.
1981-01-01
The self-consistent model with exact accounting of the one-particle continuum is applied to calculate all discrete particle-hole natural-parity states in the 208Pb nucleus up to the neutron emission threshold (7.4 MeV). Contributions to the energy-weighted sum rules S(EL) of the first collective levels and the total contributions of all discrete levels are evaluated. The collectivization is manifested most strongly for octupole states. With growing multipolarity L, the contributions of discrete levels are sharply reduced. The results are compared with other models and with the experimental data obtained in (e, e'), (p, p') reactions and other data.
The Consistency Of High Attorney Of Papua In Corruption Investigation
Directory of Open Access Journals (Sweden)
Samsul Tamher
2015-08-01
This study aimed to determine the consistency of the High Attorney of Papua in corruption investigations and in efforts to recover state financial losses. The types of study used in this paper are normative-juridical and empirical-juridical. The results showed that the High Attorney of Papua's corruption investigations are not optimal due to political interference in cases involving local officials, so that the High Attorney's decisions are not in accordance with the rule of law. The High Attorney of Papua pursues the recovery of state financial losses through the State Auction Body under civil and criminal law.
Parton Distributions based on a Maximally Consistent Dataset
Rojo, Juan
2016-04-01
The choice of data that enters a global QCD analysis can have a substantial impact on the resulting parton distributions and their predictions for collider observables. One of the main reasons for this is the possible presence of inconsistencies, either internal within an experiment or external between different experiments. In order to assess the robustness of the global fit, different definitions of a conservative PDF set, that is, a PDF set based on a maximally consistent dataset, have been introduced. However, these approaches are typically affected by theory biases in the selection of the dataset. In this contribution, after a brief overview of recent NNPDF developments, we propose a new, fully objective definition of a conservative PDF set, based on the Bayesian reweighting approach. Using the new NNPDF3.0 framework, we produce various conservative sets, which turn out to be mutually in agreement within the respective PDF uncertainties, as well as with the global fit. We explore some of their implications for LHC phenomenology, also finding good consistency with the global fit result. These results provide a non-trivial validation test of the new NNPDF3.0 fitting methodology, and indicate that possible inconsistencies in the fitted dataset do not substantially affect the global fit PDFs.
Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.
Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P
2015-10-01
Human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with spatially distributed elastic property. Simulation is performed on a 3D lung geometry reconstructed from four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data which collectively provide consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
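The Tikhonov regularization (TR) step used to fuse the CFD-predicted deformation with the DIR displacements can be sketched in its generic linear form (a minimal illustration of the technique, not the paper's lung-specific operator):

```python
import numpy as np

def tikhonov(A, b, lam):
    """Tikhonov-regularized least squares:
    x = argmin ||A x - b||^2 + lam * ||x||^2,
    solved via the damped normal equations (A^T A + lam I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Tiny example: an overdetermined system solved with and without damping.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = A @ np.array([1.0, 2.0])
x0 = tikhonov(A, b, 0.0)  # plain least squares, recovers [1, 2] exactly
x1 = tikhonov(A, b, 1.0)  # damped solution with a smaller norm
```

In the fusion setting, the penalty term keeps the combined displacement field close to a physically plausible solution while absorbing noise in the registration data.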
Self-consistent expansion for the molecular beam epitaxy equation.
Katzav, Eytan
2002-03-01
Motivated by a controversy over the correct results derived from the dynamic renormalization group (DRG) analysis of the nonlinear molecular beam epitaxy (MBE) equation, a self-consistent expansion for the nonlinear MBE theory is considered. The scaling exponents are obtained for spatially correlated noise of the general form D(r − r′, t − t′) = 2D₀ |r − r′|^(2ρ−d) δ(t − t′). I find a lower critical dimension d_c(ρ) = 4 + 2ρ, above which the linear MBE solution appears. Below the lower critical dimension a ρ-dependent strong-coupling solution is found. These results help to resolve the controversy over the correct exponents that describe nonlinear MBE, using a reliable method that proved itself in the past by giving reasonable results for the strong-coupling regime of the Kardar-Parisi-Zhang system (for d > 1), where DRG failed to do so.
Nonlinear and self-consistent treatment of ECRH
Energy Technology Data Exchange (ETDEWEB)
Tsironis, C.; Vlahos, L.
2005-07-01
A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied for the description of the ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and an electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory of wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely cases where nonlinear effects, e.g., particle trapping in the wave field, are dominant in the particle phase-space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)
Multiplicative Consistency for Interval Valued Reciprocal Preference Relations
Wu, Jian; Chiclana, Francisco
2014-01-01
The multiplicative consistency (MC) property of interval additive reciprocal preference relations (IARPRs) is explored, and then the consistency index is quantified by the multiplicative consistency estimated IARPR. The MC property is used to measure the level of consistency of the information provided by the experts and also to propose the consistency index induced ordered weighted averaging (CI-IOWA) operator. The novelty of this operator is that it aggregates individual IARPRs in such ...
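For point-valued (rather than interval-valued) reciprocal preference relations, the multiplicative-consistency condition can be checked directly. The sketch below uses Tanino's multiplicative transitivity and a mean-absolute-violation index of our own choosing as a stand-in for the paper's consistency index; both the index definition and the example weights are illustrative assumptions.

```python
import numpy as np

def mc_index(R):
    """Mean absolute violation of multiplicative consistency (0 = consistent).

    R is an n x n reciprocal preference relation: R[i, j] in (0, 1) and
    R[i, j] + R[j, i] = 1. Tanino's multiplicative transitivity requires
    R[i,j]*R[j,k]*R[k,i] == R[j,i]*R[k,j]*R[i,k] for every triple (i, j, k).
    """
    n = R.shape[0]
    devs = [abs(R[i, j] * R[j, k] * R[k, i] - R[j, i] * R[k, j] * R[i, k])
            for i in range(n) for j in range(n) for k in range(n)]
    return float(np.mean(devs))

# A relation generated from a weight vector w is perfectly consistent:
w = np.array([0.5, 0.3, 0.2])
R = w[:, None] / (w[:, None] + w[None, :])   # R[i, j] = w_i / (w_i + w_j)
```

In the interval-valued setting of the paper, the same kind of check is applied to the interval bounds rather than to point values.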
Self-consistent modeling of electron cyclotron resonance ion sources
International Nuclear Information System (INIS)
Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lecot, C.
2004-01-01
In order to predict the performance of an electron cyclotron resonance ion source (ECRIS), it is necessary to model accurately the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, in either two or three dimensions (to take into account the shape of the plasma at the extraction, which is influenced by the hexapole). However, the characteristics of the plasma are not always well understood. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.
Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering
Sicat, Ronell Barrera
2014-12-31
This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
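A one-dimensional toy illustrates why applying the transfer function to a pdf, rather than to a pre-filtered value, keeps coarse-resolution renderings consistent with fine ones. The 4-voxel block, the threshold transfer function, and the variable names are illustrative assumptions, not the paper's Gaussian-mixture machinery.

```python
import numpy as np

# Toy 1-D "volume": a 4-voxel block to be represented at coarser resolution.
block = np.array([0.1, 0.1, 0.9, 0.9])

# A nonlinear transfer function: opacity step at intensity 0.5.
def tf(v):
    return (np.asarray(v) > 0.5).astype(float)

# Standard pipeline: low-pass filter first, then classify.
naive = float(tf(block.mean()))     # tf(0.5) = 0.0 -- the feature vanishes

# pdf-based pipeline: keep the value distribution, classify, then average.
vals, counts = np.unique(block, return_counts=True)
consistent = float(np.sum(tf(vals) * counts / counts.sum()))
```

The pdf-based result equals the average of the transfer function applied at full resolution, so the coarse level agrees with the fine one; the filter-then-classify result does not.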
Directory of Open Access Journals (Sweden)
TRAVADE F.
1999-04-01
Full Text Available An experiment was carried out in 1996 at the St. Cricq hydroelectric facility (gave d'Ossau) to test two downstream-passage devices intended to keep Atlantic salmon smolts (Salmo salar L.) out of the turbines. The first, located at the dam, is a repulsive acoustic barrier designed to divert smolts toward a spillway gate. The sound field, generated by 16 submerged transducers, spans the 100-600 Hz frequency range with a maximum sound level of 120-130 dB. The second device, located at the powerhouse, is a surface bypass on the bank immediately adjacent to the water-intake trash rack. The efficiency of both devices was evaluated by mark-recapture, and smolt behavior was monitored by radio telemetry. Smolt passage through the dam gate ranged from 20% to 60%; this efficiency is due essentially to the high discharge passing through the gate. Tests with and without the acoustic barrier revealed no significant effect of the sound field. The efficiency of the forebay bypass is about 80%. The narrow water intake (11 m), the close spacing of the trash-rack bars (2.5 cm), the hydraulic conditions in front of the rack, which quickly guide fish toward the bypass entrance, and the configuration of the bypass entrance, which creates a flow with a low velocity gradient, may explain the high efficiency of this device.
A Secure Localization Approach against Wormhole Attacks Using Distance Consistency
Directory of Open Access Journals (Sweden)
Lou Wei
2010-01-01
Full Text Available Wormhole attacks can negatively affect the localization in wireless sensor networks. A typical wormhole attack can be launched by two colluding attackers, one of which sniffs packets at one point in the network and tunnels them through a wired or wireless link to another point, and the other relays them within its vicinity. In this paper, we investigate the impact of the wormhole attack on the localization and propose a novel distance-consistency-based secure localization scheme against wormhole attacks, which includes three phases of wormhole attack detection, valid locators identification and self-localization. The theoretical model is further formulated to analyze the proposed secure localization scheme. The simulation results validate the theoretical results and also demonstrate the effectiveness of our proposed scheme.
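The distance-consistency idea can be illustrated with a toy least-squares localizer: a wormhole-inflated range leaves a large residual between the estimated position and the reported distances. This sketch is not the paper's scheme; the anchor layout, the detection threshold, and the linearized trilateration are our own assumptions.

```python
import numpy as np

def trilaterate(anchors, dists):
    """Linear least-squares position from anchor coordinates and distances."""
    x0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - x0)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

def residual(anchors, dists):
    """Mean distance-consistency residual of the least-squares fix."""
    p = trilaterate(anchors, dists)
    return float(np.mean(np.abs(np.linalg.norm(anchors - p, axis=1) - dists)))

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_p = np.array([3.0, 4.0])
clean = np.linalg.norm(anchors - true_p, axis=1)
attacked = clean.copy()
attacked[3] += 12.0     # a wormhole tunnel inflates one reported distance
```

Thresholding the residual flags the attack; dropping the anchor whose removal most reduces the residual is one simple way to identify valid locators.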
Self-consistent determination of quasiparticle properties in nuclear matter
International Nuclear Information System (INIS)
Oset, E.; Palanques-Mestre, A.
1981-01-01
The self-energy of nuclear matter is calculated by directing attention to the energy- and momentum-dependent pieces which determine the quasiparticle properties. A microscopic approach is followed which starts from the boson exchange picture for the NN interaction; the π and ρ mesons are then shown to play a major role in the nucleon renormalization. The calculation is done self-consistently and the effective mass and pole strength are determined as a function of the nuclear density and momentum. Particular emphasis is put on the non-static character of the interaction and its consequences. Finally a comparison is made with other calculations and with experimental results. The consequences of the nucleon renormalization for pion condensation are also examined, with the result that the critical density is pushed up appreciably. (orig.)
Self-Consistent Dynamical Model of the Broad Line Region
Energy Technology Data Exchange (ETDEWEB)
Czerny, Bozena [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Li, Yan-Rong [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, Beijing (China); Sredzinska, Justyna; Hryniewicz, Krzysztof [Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Panda, Swayam [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Wildy, Conor [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Karas, Vladimir, E-mail: bcz@cft.edu.pl [Astronomical Institute, Czech Academy of Sciences, Prague (Czech Republic)
2017-06-22
We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.
Self-consistent simulation of the CSR effect
International Nuclear Information System (INIS)
Li, R.; Bohn, C.L.; Bisogano, J.J.
1998-01-01
When a microbunch with high charge traverses a curved trajectory, the curvature-induced bunch self-interaction, by way of coherent synchrotron radiation (CSR) and space-charge forces, may cause serious emittance degradation. In this paper, the authors present a self-consistent simulation for the study of the impact of CSR on beam optics. The dynamics of the bunch under the influence of the CSR forces is simulated using macroparticles, where the CSR force in turn depends on the history of bunch dynamics in accordance with causality. The simulation is benchmarked with analytical results obtained for a rigid-line bunch. Here they present the algorithm used in the simulation, along with the simulation results obtained for bending systems in the Jefferson Lab (JLab) free-electron-laser (FEL) lattice
On the consistency of classical and quantum supergravity theories
Energy Technology Data Exchange (ETDEWEB)
Hack, Thomas-Paul [II. Institute for Theoretical Physics, University of Hamburg (Germany); Makedonski, Mathias [Department of Mathematical Sciences, University of Copenhagen (Denmark); Schenkel, Alexander [Department of Stochastics, University of Wuppertal (Germany)
2012-07-01
It is known that pure N=1 supergravity in d=4 spacetime dimensions is consistent at a classical and quantum level, i.e. that in a particular gauge the field equations assume a hyperbolic form - ensuring causal propagation of the degrees of freedom - and that the associated canonical quantum field theory satisfies unitarity. It seems, however, that it is yet unclear whether these properties persist if one considers the more general and realistic case of N=1, d=4 supergravity theories including arbitrary matter fields. We partially clarify the issue by introducing novel hyperbolic gauges for the gravitino field and proving that they commute with the resulting equations of motion. Moreover, we review recent partial results on the unitarity of these general supergravity theories and suggest first steps towards a comprehensive unitarity proof.
Dictionary-based fiber orientation estimation with improved spatial consistency.
Ye, Chuyang; Prince, Jerry L
2018-02-01
Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ1-norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that
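The sparse dictionary fit at a single voxel (without the FO-smoothness and pairwise terms that distinguish FORNI+) can be sketched as a nonnegative ISTA solving min_f 0.5*||Df - s||^2 + lam*||f||_1. The random dictionary, parameter values, and helper name `sparse_fractions` are illustrative assumptions, not the authors' solver.

```python
import numpy as np

def sparse_fractions(D, s, lam=0.01, iters=500):
    """Nonnegative ISTA for min_f 0.5*||D f - s||^2 + lam*||f||_1, f >= 0."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2     # 1 / Lipschitz constant
    f = np.zeros(D.shape[1])
    for _ in range(iters):
        grad = D.T @ (D @ f - s)
        # proximal step: soft-threshold, then project onto f >= 0
        f = np.maximum(f - step * grad - step * lam, 0.0)
    return f

rng = np.random.default_rng(0)
D = rng.normal(size=(30, 10))
D /= np.linalg.norm(D, axis=0)                 # unit-norm dictionary atoms
f_true = np.zeros(10)
f_true[2], f_true[7] = 1.0, 0.5                # two active atoms
s = D @ f_true                                 # noiseless voxel signal
f_hat = sparse_fractions(D, s)
```

In the full method, terms coupling each voxel's FOs to its neighbors would be added to this objective and the variables updated alternately.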
Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus
Directory of Open Access Journals (Sweden)
Laura R. STEIN, Alison M. BELL
2012-02-01
Full Text Available There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to the nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45–52, 2012].
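Consistent individual differences of this kind are commonly quantified as repeatability, the intraclass correlation from a one-way ANOVA across repeated measures. The sketch below is a generic illustration of that statistic, not the authors' analysis, and the simulated fanning scores are invented.

```python
import numpy as np

def repeatability(X):
    """One-way ANOVA repeatability (intraclass correlation).

    X[i, j] = repeated measure j of individual i.
    R = s2_among / (s2_among + s2_within), with s2_among = (MSA - MSW) / k.
    """
    n, k = X.shape
    ms_among = k * np.var(X.mean(axis=1), ddof=1)
    ms_within = np.sum((X - X.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    s2_among = (ms_among - ms_within) / k
    return s2_among / (s2_among + ms_within)

rng = np.random.default_rng(0)
# 5 hypothetical fathers, 4 repeated fanning scores each, stable individual means
scores = np.arange(5)[:, None] + 0.1 * rng.normal(size=(5, 4))
R = repeatability(scores)
```

Values near 1 indicate that most variance lies among individuals, i.e., individually distinctive parenting styles; values near or below 0 indicate no consistent differences.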
Creation of Consistent Burn Wounds: A Rat Model
Directory of Open Access Journals (Sweden)
Elijah Zhengyang Cai
2014-07-01
Full Text Available Background Burn infliction techniques are poorly described in rat models. An accurate study can only be achieved with wounds that are uniform in size and depth. We describe a simple reproducible method for creating consistent burn wounds in rats. Methods Ten male Sprague-Dawley rats were anesthetized and the dorsum shaved. A 100 g cylindrical stainless-steel rod (1 cm diameter) was heated to 100℃ in boiling water. Temperature was monitored using a thermocouple. We performed two consecutive toe-pinch tests on different limbs to assess the depth of sedation. Burn infliction was limited to the loin. The skin was pulled upwards, away from the underlying viscera, creating a flat surface. The rod rested under its own weight for 5, 10, and 20 seconds at three different sites on each rat. Wounds were evaluated for size, morphology and depth. Results Average wound size was 0.9957 cm² (standard deviation [SD] 0.1845; n=30). Wounds created with a duration of 5 seconds were pale, with an indistinct margin of erythema. Wounds of 10 and 20 seconds were well-defined, uniformly brown with a rim of erythema. Average depths of tissue damage were 1.30 mm (SD 0.424), 2.35 mm (SD 0.071), and 2.60 mm (SD 0.283) for durations of 5, 10, and 20 seconds, respectively. Burn duration of 5 seconds resulted in partial-thickness damage. Burn durations of 10 and 20 seconds resulted in full-thickness damage, involving subjacent skeletal muscle. Conclusions This is a simple reproducible method for creating burn wounds consistent in size and depth in a rat burn model.
Self-consistent viscous heating of rapidly compressed turbulence
Campos, Alejandro; Morgan, Brandon
2017-11-01
Given turbulence subjected to infinitely rapid deformations, linear terms representing interactions between the mean flow and the turbulence dictate the evolution of the flow, whereas non-linear terms corresponding to turbulence-turbulence interactions are safely ignored. For rapidly deformed flows where the turbulence Reynolds number is not sufficiently large, viscous effects cannot be neglected and tend to play a prominent role, as shown in the study of Davidovits & Fisch (2016). For such a case, the rapid increase of viscosity in a plasma, as compared to the weaker scaling of viscosity in a fluid, leads to the sudden viscous dissipation of turbulent kinetic energy. As shown in Davidovits & Fisch, increases in temperature caused by the direct compression of the plasma drive sufficiently large values of viscosity. We report on numerical simulations of turbulence where the increase in temperature is the result of both the direct compression (an inviscid mechanism) and the self-consistent viscous transfer of energy from the turbulent scales towards the thermal energy. A comparison of implicit large-eddy simulations against well-resolved direct numerical simulations is included to assess the effect of the numerical and subgrid-scale dissipation on the self-consistent viscous heating. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Feeling Expression Using Avatars and Its Consistency for Subjective Annotation
Ito, Fuyuko; Sasaki, Yasunari; Hiroyasu, Tomoyuki; Miki, Mitsunori
Consumer Generated Media (CGM) is growing rapidly and the amount of content is increasing. However, it is often difficult for users to extract important contents, and the existence of contents recording their experiences can easily be forgotten. As there are no methods or systems to indicate the subjective value of the contents or ways to reuse them, subjective annotation appending subjectivity, such as feelings and intentions, to contents is needed. Representation of subjectivity depends not only on verbal expression, but also on nonverbal expression. Linguistically expressed annotation, typified by collaborative tagging in social bookmarking systems, has come into widespread use, but there is no system of nonverbally expressed annotation on the web. We propose the utilization of controllable avatars as a means of nonverbal expression of subjectivity, and confirmed the consistency of feelings elicited by avatars over time for an individual and in a group. In addition, we compared the expressiveness and ease of subjective annotation between collaborative tagging and controllable avatars. The result indicates that the feelings evoked by avatars are consistent in both cases, and using controllable avatars is easier than collaborative tagging for representing feelings elicited by contents that do not express meaning, such as photos.
Detection and quantification of flow consistency in business process models.
Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara
2018-01-01
Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
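One simple way to quantify flow-direction consistency, in the spirit of the metrics discussed, is the share of sequence-flow edges aligned with the layout's dominant direction. The classification into four directions and the example model below are our own assumptions, not one of the paper's three metrics.

```python
from collections import Counter

def flow_consistency(edges, positions):
    """Share of edges whose direction matches the dominant flow direction.

    edges     : list of (source, target) node pairs
    positions : dict node -> (x, y) layout coordinates
    Each edge is classified as left/right/up/down by its dominant axis;
    1.0 means a fully direction-consistent layout.
    """
    def direction(a, b):
        (x1, y1), (x2, y2) = positions[a], positions[b]
        dx, dy = x2 - x1, y2 - y1
        if abs(dx) >= abs(dy):
            return "right" if dx >= 0 else "left"
        return "down" if dy >= 0 else "up"
    counts = Counter(direction(a, b) for a, b in edges)
    return max(counts.values()) / len(edges)

# A left-to-right model with one back-edge (a loop from b back to a):
edges = [("start", "a"), ("a", "b"), ("b", "end"), ("b", "a")]
pos = {"start": (0, 0), "a": (1, 0), "b": (2, 0), "end": (3, 0)}
score = flow_consistency(edges, pos)   # 3 of 4 edges point right -> 0.75
```

Variants of such a metric differ in how they treat loops, bent edges, and diagonal segments, which is exactly the kind of design choice the paper evaluates empirically.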
Criteria for the generation of spectra consistent time histories
International Nuclear Information System (INIS)
Lin, C.-W.
1977-01-01
Several methods are available for conducting seismic analysis of nuclear power plant systems and components. Among them, the response spectrum technique has been most widely adopted for linear modal analysis. However, for designs which involve structural or material nonlinearities, such as frequency dependent soil properties, the existence of gaps, single tie rods, and friction between supports, where the response has to be computed as a function of time, the time history approach is the only viable method of analysis. Two examples of time history analysis are: 1) soil-structure interaction studies and 2) a coupled reactor coolant system and building analysis, used either to generate the floor response spectra or to compute the nonlinear system time history response. The generation of a suitable time history input for the analysis has been discussed in the literature. Some general guidelines are available to ensure that the time history input will be as conservative as the design response spectra. Very little has been reported on the effect of the dynamic characteristics of the time history input upon the system response. In fact, the only available discussion in this respect concerns the statistical independence of the time history components. In this paper, numerical results for cases using the time history approach are presented. Criteria are also established which may be advantageously used to arrive at spectra-consistent time histories which are conservative and, more importantly, realistic. (Auth.)
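A classic recipe for spectra-consistent time histories, random-phase sinusoids whose amplitudes are iteratively rescaled until the computed single-degree-of-freedom (SDOF) response spectrum matches the target, can be sketched as below. This is a generic illustration with assumed damping (5%), duration, and target values, not the criteria proposed in the paper.

```python
import numpy as np

def sdof_peak(acc, dt, f, zeta=0.05):
    """Peak pseudo-acceleration of a damped SDOF oscillator driven by the
    ground acceleration `acc`, integrated by central differences."""
    wn = 2.0 * np.pi * f
    u_prev = u = 0.0
    peak = 0.0
    for a in acc:
        v = (u - u_prev) / dt
        u_next = 2.0 * u - u_prev + dt**2 * (-a - 2.0 * zeta * wn * v - wn**2 * u)
        peak = max(peak, abs(wn**2 * u))
        u_prev, u = u, u_next
    return peak

def match_spectrum(freqs, target, dt=0.01, dur=10.0, iters=10, seed=0):
    """Scale random-phase sinusoid amplitudes until the computed response
    spectrum approaches the target at each control frequency."""
    t = np.arange(0.0, dur, dt)
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    amps = np.array(target, dtype=float)
    for _ in range(iters):
        acc = sum(A * np.sin(2.0 * np.pi * f * t + p)
                  for A, f, p in zip(amps, freqs, phases))
        sa = np.array([sdof_peak(acc, dt, f) for f in freqs])
        amps *= np.array(target) / sa      # multiplicative spectral correction
    return acc, sa

freqs, target = [1.0, 2.0, 5.0], [1.0, 1.0, 1.0]
acc, sa = match_spectrum(freqs, target)
```

The random phases are the degree of freedom that makes such a history spectrum-consistent but not unique; the paper's concern is precisely which additional criteria make the resulting motion realistic as well as conservative.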
A model for cytoplasmic rheology consistent with magnetic twisting cytometry.
Butler, J P; Kelly, S M
1998-01-01
Magnetic twisting cytometry is gaining wide applicability as a tool for the investigation of the rheological properties of cells and the mechanical properties of receptor-cytoskeletal interactions. Current technology involves the application and release of magnetically induced torques on small magnetic particles bound to or inside cells, with measurements of the resulting angular rotation of the particles. The properties of purely elastic or purely viscous materials can be determined by the angular strain and strain rate, respectively. However, the cytoskeleton and its linkage to cell surface receptors display elastic, viscous, and even plastic deformation, and the simultaneous characterization of these properties using only elastic or viscous models is internally inconsistent. Data interpretation is complicated by the fact that in current technology, the applied torques are not constant in time, but decrease as the particles rotate. This paper describes an internally consistent model consisting of a parallel viscoelastic element in series with a parallel viscoelastic element, and one approach to quantitative parameter evaluation. The unified model reproduces all essential features seen in data obtained from a wide variety of cell populations, and contains the pure elastic, viscoelastic, and viscous cases as subsets.
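The internally consistent model described, read here as two parallel spring-dashpot (Kelvin-Voigt) elements in series, driven by a torque that decreases as the particle rotates, can be integrated directly. The stiffness and viscosity values and the cos(θ) torque law below are illustrative assumptions, not fitted cell parameters.

```python
import numpy as np

def twist_response(T0=1.0, k=(1.0, 0.5), c=(0.2, 2.0), dt=1e-3, t_end=10.0):
    """Total angular strain of two Kelvin-Voigt (spring || dashpot) elements
    in series, each carrying the full torque T(theta) = T0*cos(theta),
    which decreases as the bead rotates toward alignment."""
    k = np.asarray(k)
    c = np.asarray(c)
    th = np.zeros(2)                     # strain of each series element
    out = []
    for _ in range(int(t_end / dt)):
        T = T0 * np.cos(th.sum())        # applied torque falls off with rotation
        th = th + dt * (T - k * th) / c  # dtheta_i/dt = (T - k_i*theta_i)/c_i
        out.append(th.sum())
    return np.array(out)

theta = twist_response()   # creep: fast initial rotation, then slow approach
```

The fast element produces the initial elastic-like jump and the slow element the subsequent creep, reproducing the qualitative mix of elastic and viscous response the abstract describes under a rotation-dependent torque.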
Self-consistent electron transport in collisional plasmas
International Nuclear Information System (INIS)
Mason, R.J.
1982-01-01
A self-consistent scheme has been developed to model electron transport in evolving plasmas of arbitrary classical collisionality. The electrons and ions are treated as either multiple donor-cell fluids, or collisional particles-in-cell. Particle suprathermal electrons scatter off ions, and drag against fluid background thermal electrons. The background electrons undergo ion friction, thermal coupling, and bremsstrahlung. The components move in self-consistent advanced E-fields, obtained by the Implicit Moment Method, which permits Δt ≫ ω_p⁻¹ and Δx ≫ λ_D, offering a 10²-10³-fold speed-up over older explicit techniques. The fluid description for the background plasma components permits the modeling of transport in systems spanning more than a 10⁷-fold change in density, and encompassing contiguous collisional and collisionless regions. Results are presented from application of the scheme to the modeling of CO₂ laser-generated suprathermal electron transport in expanding thin foils, and in multi-foil target configurations
Privacy, Time Consistent Optimal Labour Income Taxation and Education Policy
Konrad, Kai A.
1999-01-01
Incomplete information is a commitment device for time consistency problems. In the context of time consistent labour income taxation privacy reduces welfare losses and increases the effectiveness of public education as a second best policy.
Scapparone, Eugenio
2011-01-01
In this paper selected results obtained by the ALICE experiment at the LHC will be presented. Data collected during the pp runs taken at sqrt(s)=0.9, 2.76 and 7 TeV and Pb-Pb runs at sqrt(s_NN)=2.76 TeV allowed interesting studies of the properties of hadronic and nuclear matter: proton runs made it possible to explore ordinary matter at very high energy and down to very low pt, while Pb-Pb runs provided spectacular events in which the several thousand particles produced in the interaction revealed how a very dense medium behaves, providing a deeper picture of the chemical composition and dynamics of the quark-gluon plasma (QGP).
Generalized contexts and consistent histories in quantum mechanics
International Nuclear Information System (INIS)
Losada, Marcelo; Laura, Roberto
2014-01-01
We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times.
Personality and Situation Predictors of Consistent Eating Patterns
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.
2015-01-01
Introduction A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studi...
Checking Consistency of Pedigree Information is NP-complete
DEFF Research Database (Denmark)
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...
26 CFR 1.338-8 - Asset and stock consistency.
2010-04-01
... that are controlled foreign corporations. (6) Stock consistency. This section limits the application of... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Asset and stock consistency. 1.338-8 Section 1... (CONTINUED) INCOME TAXES Effects on Corporation § 1.338-8 Asset and stock consistency. (a) Introduction—(1...
Optimization of nonthermal fusion power consistent with energy channeling
International Nuclear Information System (INIS)
Snyder, P.B.; Herrmann, M.C.; Fisch, N.J.
1995-02-01
If the energy of charged fusion products can be diverted directly to fuel ions, non-Maxwellian fuel ion distributions and temperature differences between species will result. To determine the importance of these nonthermal effects, the fusion power density is optimized at constant β for nonthermal distributions that are self-consistently maintained by channeling of energy from charged fusion products. For D-T and D-³He reactors, with 75% of charged fusion product power diverted to fuel ions, temperature differences between electrons and ions increase the reactivity by 40-70%, while non-Maxwellian fuel ion distributions and temperature differences between ionic species increase the reactivity by an additional 3-15%.
A photon position sensor consisting of single-electron circuits
International Nuclear Information System (INIS)
Kikombo, Andrew Kilinga; Amemiya, Yoshihito; Tabe, Michiharu
2009-01-01
This paper proposes a solid-state sensor that can detect the position of incident photons with a high spatial resolution. The sensor consists of a two-dimensional array of single-electron oscillators, each coupled to its neighbors through coupling capacitors. An incident photon triggers an excitatory circular wave of electron tunneling in the oscillator array. The wave propagates in all directions to reach the periphery of the array. By measuring the arrival time of the wave at the periphery, we can determine the position of the incident photon. The tunneling wave's generation, propagation, and arrival at the array periphery, and the determination of incident photon positions, are demonstrated with the results of Monte Carlo-based computer simulations.
Self-consistent, relativistic, ferromagnetic band structure of gadolinium
International Nuclear Information System (INIS)
Harmon, B.N.; Schirber, J.; Koelling, D.D.
1977-01-01
An initial self-consistent calculation of the ground state magnetic band structure of gadolinium is described. A linearized APW method was used which included all single particle relativistic effects except spin-orbit coupling. The spin polarized potential was obtained in the muffin-tin form using the local spin density approximation for exchange and correlation. The most striking and unorthodox aspect of the results is the position of the 4f spin-down "bands", which are required to float just on top of the Fermi level in order to obtain convergence. If the 4f states (l = 3 resonance) are removed from the occupied region of the conduction bands the magnetic moment is approximately 0.75 μ_B/atom; however, as the 4f spin-down states are allowed to find their own position they hybridize with the conduction bands at the Fermi level and the moment becomes smaller. Means of improving the calculation are discussed
Multirobot FastSLAM Algorithm Based on Landmark Consistency Correction
Directory of Open Access Journals (Sweden)
Shi-Ming Chen
2014-01-01
Considering the influence of uncertain map information on the multirobot SLAM problem, a multirobot FastSLAM algorithm based on landmark consistency correction is proposed. First, an electromagnetism-like mechanism is introduced into the resampling procedure of single-robot FastSLAM: each sampling particle is treated as a charged electron, and the attraction-repulsion mechanism of an electromagnetic field is used to simulate the interactive forces between particles and thereby improve their distribution. Second, when multiple robots observe the same landmarks, each robot is regarded as a node and a Kalman-Consensus Filter is used to update landmark information, which further improves the accuracy of localization and mapping. Finally, simulation results show that the algorithm is suitable and effective.
Migraine patients consistently show abnormal vestibular bedside tests
Directory of Open Access Journals (Sweden)
Eliana Teixeira Maranhão
2015-01-01
Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and a co-morbidity of around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. Objective: To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses, as measured by 14 bedside tests, are abnormal in migraineurs without vertigo, as compared with controls. Method: Cross-sectional study including sixty individuals: thirty migraineurs (25 women, 19-60 years old) and 30 age- and gender-matched healthy controls. Results: Migraineurs showed a tendency to perform worse in almost all tests, although only the tandem Romberg test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Conclusion: Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.
A New Heteroskedastic Consistent Covariance Matrix Estimator using Deviance Measure
Directory of Open Access Journals (Sweden)
Nuzhat Aftab
2016-06-01
In this article we propose a new heteroskedasticity-consistent covariance matrix estimator, HC6, based on a deviance measure. We have studied the finite sample behavior of the new estimator and compared it with other estimators of this kind, HC1, HC3 and HC4m, which are used in the presence of leverage observations. A simulation study is conducted to examine the effect of various levels of heteroskedasticity on the size and power of the quasi-t test with HC estimators. Results show that the test statistic based on our newly suggested estimator has a better asymptotic approximation and less size distortion than the other estimators for small sample sizes when a high level of heteroskedasticity is present in the data.
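The abstract does not give the HC6 weighting formula, so a sketch of the HCCME family it belongs to may help orient the reader. The following minimal White-type estimator (HC0 and HC3 variants, which the paper uses as comparators) is an illustration under standard OLS assumptions, not the authors' HC6:

```python
import numpy as np

def hc_covariance(X, y, kind="HC3"):
    """White-type heteroskedasticity-consistent covariance of OLS estimates.

    HC0 uses squared residuals; HC3 discounts high-leverage observations.
    The deviance-based HC6 of the paper is not reproduced here.
    """
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta                                   # OLS residuals
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)        # hat-matrix diagonal
    if kind == "HC0":
        omega = e ** 2
    elif kind == "HC3":
        omega = (e / (1.0 - h)) ** 2
    else:
        raise ValueError(f"unknown estimator: {kind}")
    meat = X.T @ (omega[:, None] * X)
    return XtX_inv @ meat @ XtX_inv                    # sandwich form
```

A quasi-t test then divides each coefficient by the square root of the corresponding diagonal entry of this matrix.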
Back to the Future: Consistency-Based Trajectory Tracking
Kurien, James; Nayak, P. Pandurand; Norvig, Peter (Technical Monitor)
2000-01-01
Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.
Statistically Consistent k-mer Methods for Phylogenetic Tree Reconstruction.
Allman, Elizabeth S; Rhodes, John A; Sullivant, Seth
2017-02-01
Frequencies of k-mers in sequences are sometimes used as a basis for inferring phylogenetic trees without first obtaining a multiple sequence alignment. We show that a standard approach of using the squared Euclidean distance between k-mer vectors to approximate a tree metric can be statistically inconsistent. To remedy this, we derive model-based distance corrections for orthologous sequences without gaps, which lead to consistent tree inference. The identifiability of model parameters from k-mer frequencies is also studied. Finally, we report simulations showing that the corrected distance outperforms many other k-mer methods, even when sequences are generated with an insertion and deletion process. These results have implications for multiple sequence alignment as well since k-mer methods are usually the first step in constructing a guide tree for such algorithms.
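The squared Euclidean k-mer comparison that the abstract identifies as potentially inconsistent can be sketched as follows. This is a minimal illustration of the uncorrected distance, not the authors' model-based correction:

```python
from collections import Counter

def kmer_vector(seq, k):
    """Count k-mer frequencies of a sequence as a sparse Counter."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def squared_euclidean(u, v):
    """Squared Euclidean distance between two sparse k-mer count vectors."""
    keys = set(u) | set(v)
    return sum((u.get(x, 0) - v.get(x, 0)) ** 2 for x in keys)

d = squared_euclidean(kmer_vector("ACGTACGT", 2), kmer_vector("ACGTTTTT", 2))
```

The paper's contribution is to replace this raw distance with a model-based correction so that the resulting tree inference is statistically consistent.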
A self-consistent nuclear energy supply system
International Nuclear Information System (INIS)
Fujii-e, Y.; Morita, T.; Kawakami, H.; Arie, K.; Suzuki, M.; Iida, M.; Yamazaki, H.
1992-01-01
A self-consistent nuclear energy supply system (SCNESS) based on a fast reactor is investigated. SCNESS is proposed as a future stable energy supplier with no harmful influence on humans or the environment, the ultimate goal of nuclear energy development. SCNESS should be inherently safe, able to breed fissionable material, and able to transmute long-lived radioactive nuclides (i.e., minor actinides and long-lived fission products). The relationship between these characteristics and the spatial assignment of excess neutrons (ν-1) to each characteristic is analyzed. The analysis shows that excess neutrons play an intrinsic role in realizing SCNESS. The reactor concept of SCNESS is investigated by considering the utilization of excess neutrons. Results show that a small axially double-layered annular core with metal fuel is a promising candidate for SCNESS. SCNESS is concluded to be feasible. (author). 4 refs., 9 figs
Domain Adaptation for Pedestrian Detection Based on Prediction Consistency
Directory of Open Access Journals (Sweden)
Yu Li-ping
2014-01-01
Pedestrian detection is an active area of research in computer vision. It remains a quite challenging problem in many applications where various factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source domain samples with scarce target domain samples to create a scene-specific pedestrian detector that performs as well as if rich target domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, derived from the consistency of predictions with the source classifications, to selectively choose the source domain samples showing positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially with insufficient labeled data in the target scene.
Consistent partnership formation: application to a sexually transmitted disease model.
Artzrouni, Marc; Deuchert, Eva
2012-02-01
We apply a consistent sexual partnership formation model which hinges on the assumption that one gender's choices drive the process (male- or female-dominant model). The other gender's behavior is imputed. The model is fitted to UK sexual behavior data and applied to a simple incidence model of HSV-2. With a male-dominant model (which assumes accurate male reports on numbers of partners) the modeled incidences of HSV-2 are 77% higher for men and 50% higher for women than with a female-dominant model (which assumes accurate female reports). Although highly stylized, our simple incidence model sheds light on the inconsistent results one can obtain with misreported data on sexual activity and age preferences. Copyright © 2011 Elsevier Inc. All rights reserved.
A New Bias Corrected Version of Heteroscedasticity Consistent Covariance Estimator
Directory of Open Access Journals (Sweden)
Munir Ahmed
2016-06-01
In the presence of heteroscedasticity, different available flavours of the heteroscedasticity consistent covariance matrix estimator (HCCME) are used. However, the available literature shows that these estimators can be considerably biased in small samples. Cribari-Neto et al. (2000) introduced a bias adjustment mechanism and gave a modified White estimator that becomes almost bias-free even in small samples. Extending these results, Cribari-Neto and Galvão (2003) presented a similar bias adjustment mechanism that can be applied to a wide class of HCCMEs. In the present article, we follow the same mechanism as proposed by Cribari-Neto and Galvão to give a bias-corrected version of the HCCME, but we use an adaptive HCCME rather than the conventional one. A Monte Carlo study is used to evaluate the performance of our proposed estimators.
Example-Based Image Colorization Using Locality Consistent Sparse Representation.
Bo Li; Fuchen Zhao; Zhuo Su; Xiangguo Liang; Yu-Kun Lai; Rosin, Paul L
2017-11-01
Image colorization aims to produce a natural looking color image from a given gray-scale image, which remains a challenging problem. In this paper, we propose a novel example-based image colorization method exploiting a new locality consistent sparse representation. Given a single reference color image, our method automatically colorizes the target gray-scale image by sparse pursuit. For efficiency and robustness, our method operates at the superpixel level. We extract low-level intensity features, mid-level texture features, and high-level semantic features for each superpixel, which are then concatenated to form its descriptor. The collection of feature vectors for all the superpixels from the reference image composes the dictionary. We formulate colorization of target superpixels as a dictionary-based sparse reconstruction problem. Inspired by the observation that superpixels with similar spatial location and/or feature representation are likely to match spatially close regions from the reference image, we further introduce a locality promoting regularization term into the energy formulation, which substantially improves the matching consistency and subsequent colorization results. Target superpixels are colorized based on the chrominance information from the dominant reference superpixels. Finally, to further improve coherence while preserving sharpness, we develop a new edge-preserving filter for chrominance channels with the guidance from the target gray-scale image. To the best of our knowledge, this is the first work on sparse pursuit image colorization from single reference images. Experimental results demonstrate that our colorization method outperforms the state-of-the-art methods, both visually and quantitatively using a user study.
Consistency in performance evaluation reports and medical records.
Lu, Mingshan; Ma, Ching-to Albert
2002-12-01
In the health care market, managed care has become the latest innovation for the delivery of services. For efficient implementation, the managed care organization relies on accurate information, so clinicians are often asked to report on patients before referrals are approved, treatments authorized, or insurance claims processed. What are clinicians' responses to solicitation for information by managed care organizations? The existing health literature has already pointed out the importance of provider gaming, sincere reporting, nudging, and dodging the rules. We assess the consistency of clinicians' reports on clients across administrative data and clinical records. For about 1,000 alcohol abuse treatment episodes, we compare clinicians' reports across two data sets. The first one, the Maine Addiction Treatment System (MATS), was an administrative data set; the state government used it for program performance monitoring and evaluation. The second was a set of medical record abstracts taken directly from the clinical records of treatment episodes. A clinician's reporting practice exhibits an inconsistency if the information reported in MATS differs from the information reported in the medical record in a statistically significant way. We look for evidence of inconsistencies in five categories: admission alcohol use frequency, discharge alcohol use frequency, termination status, admission employment status, and discharge employment status. Chi-square tests, Kappa statistics, and sensitivity and specificity tests are used for hypothesis testing. Multiple imputation methods are employed to address the problem of missing values in the record abstract data set. For admission and discharge alcohol use frequency measures, we find, respectively, strong and supporting evidence of inconsistencies. We find equally strong evidence of consistency in reports of admission and discharge employment status, and mixed evidence on report consistency for termination status. Patterns of
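One of the agreement statistics the abstract mentions, Cohen's kappa, can be sketched minimally for two paired categorical report series. The variable names below are illustrative, not taken from the study, and the sketch assumes complete (non-missing) pairs:

```python
def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two paired
    categorical series (e.g. administrative vs. chart-abstract reports)."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    cats = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed agreement
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)                                   # assumes pe < 1
```

A kappa near zero would indicate that the administrative data and the medical record agree no more than chance, i.e. an inconsistent reporting practice.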
Hunger enhances consistent economic choices in non-human primates.
Yamada, Hiroshi
2017-05-24
Hunger and thirst are fundamental biological processes that drive consumption behavior in humans and non-human animals. While the existing literature in neuroscience suggests that these satiety states change how consumable rewards are represented in the brain, it remains unclear as to how they change animal choice behavior and the underlying economic preferences. Here, I used combined techniques from experimental economics, psychology, and neuroscience to measure food preferences of marmoset monkeys (Callithrix jacchus), a recently developed primate model for neuroscience. Hunger states of animals were manipulated by scheduling feeding intervals, resulting in three different conditions: sated, non-sated, and hungry. During these hunger states, animals performed pairwise choices of food items, which included all possible pairwise combinations of five different food items except for same-food pairs. Results showed that hunger enhanced economic rationality, evident as a decrease of transitivity violations (item A was preferred to item B, and B to C, but C was preferred to A). Further analysis demonstrated that hungry monkeys chose more-preferred items over less-preferred items in a more deterministic manner, while the individual food preferences appeared to remain stable across hunger states. These results suggest that hunger enhances consistent choice behavior and shifts animals towards efficient outcome maximization.
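Transitivity violations of the kind described (A preferred to B, B to C, but C to A) can be counted directly from pairwise choice data. The sketch below is a hedged illustration of that count, not the authors' analysis code:

```python
from itertools import combinations

def transitivity_violations(pref):
    """Count cyclic triples in pairwise preferences.

    pref[(x, y)] is True if item x was chosen over item y;
    each unordered pair is assumed to appear in exactly one orientation.
    """
    items = sorted({i for pair in pref for i in pair})
    violations = 0
    for a, b, c in combinations(items, 3):
        # a triple is intransitive iff its preferences form a cycle,
        # in either of the two possible cyclic orientations
        for x, y, z in ((a, b, c), (a, c, b)):
            if pref.get((x, y)) and pref.get((y, z)) and pref.get((z, x)):
                violations += 1
    return violations
```

Under this count, the abstract's finding is that the number of such cyclic triples decreases as the animals become hungrier.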
Kerchner, Charles T.
2008-01-01
Reporting on a talk the author gave some months ago, the headline in "La Opinion," Los Angeles' premier Spanish language newspaper, declared the city's school system "en crisis permanente." No one wrote in to disagree. Indeed, at the end of "Learning from L.A.: Institutional Change in American Public Education" (Harvard Education Press) the author…
Bayesian detection of causal rare variants under posterior consistency.
Liang, Faming
2013-07-26
Identification of causal rare variants that are associated with complex traits poses a central challenge on genome-wide association studies. However, most current research focuses only on testing the global association whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, have tried to address this problem, it is unclear whether the causal rare variants can be consistently identified by them in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are there any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures the causal rare variants to be consistently identified in the small-n-large-P situation by imposing some appropriate prior distributions on the model and model specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than the existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.
Consistency of FMEA used in the validation of analytical procedures.
Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M
2011-02-20
In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define their own ranking scales for the probability of severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and we identified the failure modes above the 90th percentile of RPN values as failure modes needing urgent corrective action; failure modes falling between the 75th and 90th percentile of RPN values were identified as failure modes needing necessary corrective action, respectively. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action respectively, with two being commonly identified. Of the failure modes needing necessary corrective actions, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
Bayesian detection of causal rare variants under posterior consistency.
Directory of Open Access Journals (Sweden)
Faming Liang
Identification of causal rare variants that are associated with complex traits poses a central challenge on genome-wide association studies. However, most current research focuses only on testing the global association whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, have tried to address this problem, it is unclear whether the causal rare variants can be consistently identified by them in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are there any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures the causal rare variants to be consistently identified in the small-n-large-P situation by imposing some appropriate prior distributions on the model and model specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than the existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.
Quench simulation of SMES consisting of some superconducting coils
International Nuclear Information System (INIS)
Noguchi, S.; Oga, Y.; Igarashi, H.
2011-01-01
When an SMES consists of many element coils, a quench in one element coil may cause a chain of quenches. To avoid such a chain, the energy stored in the quenched element coil has to be discharged quickly; the cause of the chain of quenches is the short time constant of the decreasing current of the quenched coil. In recent years, many HTS superconducting magnetic energy storage (HTS-SMES) systems have been investigated and designed. They usually consist of several superconducting element coils because of the excessively high energy to be stored. If one of them quenches, the energy stored in that element coil has to be dispersed immediately to protect the HTS-SMES system. Otherwise, the current of the other element coils, which have not quenched, increases, since the magnetic coupling between the quenched element coil and the others is very strong. This increase in current may quench the other element coils: if the energy dispersion of the quenched element coil fails, the other superconducting element coils will quench in series. Therefore, it is necessary to investigate the behavior of the HTS-SMES after one or more element coils quench. To protect against a chain of quenches, it is also important to investigate the time constant of the coils. We have developed a simulation code to investigate the behavior of the HTS-SMES. The quench simulation indicates that a chain of quenches can be caused by a quench of one element coil.
Self-consistent equilibria in cylindrical reversed-field pinch
International Nuclear Information System (INIS)
Lo Surdo, C.; Paccagnella, R.; Guo, S.
1995-03-01
The object of this work is to study the self-consistent magnetofluidstatic equilibria of a 2-region (plasma + gas) reversed-field pinch (RFP) in the cylindrical approximation (namely, with vanishing inverse aspect ratio). Differently from what happens in a tokamak, in an RFP a significant part of the plasma current is driven by a dynamo electric field (DEF), in its turn mainly due to plasma turbulence. A reasonable mathematical model of the above self-consistent equilibria is therefore worked out under the following main assumptions: a) to the lowest order, and according to a standard ansatz, the turbulent DEF, say ε_t, is expressed as a homogeneous transform of the magnetic field B of degree 1, ε_t = α(B), with α a given 2nd-rank tensor, homogeneous of degree 0 in B and generally depending on the plasma state; b) ε_t does not explicitly appear in the plasma energy balance, as if it were produced by a Maxwell demon able to extract the corresponding Joule power from the plasma. In particular, it is shown that, if both α and the resistivity tensor η are isotropic and constant, the magnetic field is force-free with abnormality equal to αη₀/η in the limit of vanishing β; that is, the well-known J. B. Taylor result is recovered, under these particular conditions, starting from ideas quite different from the usual ones (minimization of the total magnetic energy under constrained total helicity). Finally, the general problem is solved numerically under circular (besides cylindrical) symmetry, for simplicity neglecting the existence of the gas region (i.e., assuming the plasma is in direct contact with the external wall)
Bayesian detection of causal rare variants under posterior consistency.
Liang, Faming; Xiong, Momiao
2013-01-01
Identification of causal rare variants that are associated with complex traits poses a central challenge on genome-wide association studies. However, most current research focuses only on testing the global association whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, have tried to address this problem, it is unclear whether the causal rare variants can be consistently identified by them in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are there any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures the causal rare variants to be consistently identified in the small-n-large-P situation by imposing some appropriate prior distributions on the model and model specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than the existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.
Modeling self-consistent multi-class dynamic traffic flow
Cho, Hsun-Jung; Lo, Shih-Ching
2002-09-01
In this study, we present a systematic, self-consistent multiclass multilane traffic model derived from the vehicular Boltzmann equation and the traffic dispersion model. The multilane domain is considered as a two-dimensional space, and the interaction among vehicles in the domain is described by a dispersion model. The reason we consider a multilane domain as a two-dimensional space is that the driving behavior of road users may not be restricted by lanes, especially for motorcyclists. The dispersion model, which is a nonlinear Poisson equation, is derived from car-following theory and the equilibrium assumption. Under the assumption that all kinds of users share the finite road section, the density is distributed on the road by the dispersion model. In addition, the dynamic evolution of the traffic flow is determined by the systematic gas-kinetic model derived from the Boltzmann equation. Multiplying the Boltzmann equation by the zeroth-, first-, and second-order moment functions, integrating both sides of the equation, and using the chain rule, we can derive the continuity, motion, and variance equations, respectively. However, the second-order moment function, the square of the individual velocity, employed by previous researchers does not have a physical meaning in traffic flow. Although the second-order expansion yields the velocity variance equation, additional terms may be generated. The velocity variance equation we propose is instead derived by multiplying the Boltzmann equation by the individual velocity variance; it modifies the previous model and yields a new gas-kinetic traffic flow model. By coupling the gas-kinetic model and the dispersion model, a self-consistent system is presented.
Consistent Regulation of Infrastructure Businesses: Some Economic Issues
Flavio M. Menezes
2008-01-01
This paper examines some important economic aspects associated with the notion that consistency in the regulation of infrastructure businesses is a desirable feature. It makes two important points. First, it is not easy to measure consistency. In particular, one cannot simply point to different regulatory parameters as evidence of inconsistent regulatory policy. Second, even if one does observe consistency emerging from decisions made by different regulators, it does not necessarily mean that...
Improving risk assessment by defining consistent and reliable system scenarios
Directory of Open Access Journals (Sweden)
B. Mazzorana
2009-02-01
During the entire procedure of risk assessment for hydrologic hazards, the selection of consistent and reliable scenarios, constructed in a strictly systematic way, is fundamental for the quality and reproducibility of the results. However, subjective assumptions about relevant impact variables, such as sediment transport intensity on the system loading side and weak-point response mechanisms, repeatedly bias the results and consequently affect transparency and required quality standards. Furthermore, the response of mitigation measures to extreme event loadings represents another key variable in hazard assessment, as does integral risk management including intervention planning. Formative Scenario Analysis, as a supplement to conventional risk assessment methods, is a technique for constructing well-defined sets of assumptions to gain insight into a specific case and the potential system behaviour. The applicability of the Formative Scenario Analysis technique is presented through two case studies, carried out (1) to analyse sediment transport dynamics in a torrent section equipped with control measures, and (2) to identify hazards induced by woody debris transport at hydraulic weak points. It is argued that during scenario planning in general, and with respect to integral risk management in particular, Formative Scenario Analysis allows the development of reliable and reproducible scenarios in order to design more specifically an application framework for the sustainable assessment of natural hazard impacts. The overall aim is to optimise the hazard mapping and zoning procedure by methodologically integrating quantitative and qualitative knowledge.
Consistent Steering System using SCTP for Bluetooth Scatternet Sensor Network
Dhaya, R.; Sadasivam, V.; Kanthavel, R.
2012-12-01
Wireless communication is the best way to convey information from source to destination with flexibility and mobility, and Bluetooth is the wireless technology suitable for short distances. On the other hand, a wireless sensor network (WSN) consists of spatially distributed autonomous sensors that cooperatively monitor physical or environmental conditions, such as temperature, sound, vibration, pressure, motion or pollutants. Using the Bluetooth piconet wireless technique in sensor nodes creates limitations in network depth and placement. The introduction of the Scatternet solves these network restrictions, but with a lack of reliability in data transmission. When the depth of the network increases, routing becomes more difficult. No authors have so far focused on the reliability factors of Scatternet sensor network routing. This paper illustrates the proposed system architecture and routing mechanism to increase reliability. Another objective is to use a reliable transport protocol that employs the multi-homing concept and supports multiple streams to prevent head-of-line blocking. The results show that the Scatternet sensor network has lower packet loss than the existing system, even in a congestive environment, making it suitable for surveillance applications.
Single-field consistency relations of large scale structure
International Nuclear Information System (INIS)
Creminelli, Paolo; Noreña, Jorge; Simonović, Marko; Vernizzi, Filippo
2013-01-01
We derive consistency relations for the late universe (CDM and ΛCDM): relations between an n-point function of the density contrast δ and an (n+1)-point function in the limit in which one of the (n+1) momenta becomes much smaller than the others. These are based on the observation that a long mode, in single-field models of inflation, reduces to a diffeomorphism from its freezing during inflation all the way to the late universe, even when the long mode is inside the horizon (but outside the sound horizon). These results are derived in Newtonian gauge, at first and second order in the small momentum q of the long mode, and they are valid non-perturbatively in the short-scale δ. In the non-relativistic limit our results match those of [1]. These relations are a consequence of diffeomorphism invariance; they are not satisfied in the presence of extra degrees of freedom during inflation or a violation of the Equivalence Principle (extra forces) in the late universe.
Consistent microscopic and phenomenological analysis of the composite particle optical potential
International Nuclear Information System (INIS)
Mukhopadhyay, Sheela; Srivastava, D.K.; Ganguly, N.K.
1976-01-01
A microscopic calculation of the composite particle optical potential has been done using a realistic nucleon-helion interaction and folding it with the density distribution of the targets. The second-order effects were simulated by introducing a scaling factor, which was searched on to reproduce the experimental scattering results. The composite particle optical potential was also derived from the nucleon-nucleus optical potential, with the second-order term explicitly treated as a parameter. Elastic scattering of 20 MeV ³H on targets ranging from ⁴⁰Ca to ²⁰⁸Pb has also been analysed using a phenomenological optical model. The agreement of these results with the above calculations verified the consistency of the microscopic theory. However, the equivalent sharp radius calculated with the n-helion interaction was observed to be smaller than the phenomenological value. This was attributed to the absence of saturation effects in the density-independent interaction used. Saturation has been introduced by a density-dependent term of the form (1 − cζ^(2/3)), where ζ is the compound density of the target-helion system. (author)
A diagnostic test for apraxia in stroke patients: internal consistency and diagnostic value.
Heugten, C.M. van; Dekker, J.; Deelman, B.G.; Stehmann-Saris, F.C.; Kinebanian, A.
1999-01-01
The internal consistency and diagnostic value of a test for apraxia in stroke patients are presented. Results indicate that the items of the test form a strong and consistent scale: Cronbach's alpha as well as the results of a Mokken scale analysis indicate good reliability and good
Personality consistency in dogs: a meta-analysis.
Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
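A pooled estimate like the r = 0.43 reported above is the kind of quantity produced by inverse-variance pooling of correlations on Fisher's z scale. A minimal fixed-effect sketch (illustrative numbers, not the study's data):

```python
from math import atanh, tanh

def pool_correlations(studies):
    """Fixed-effect meta-analysis of correlation coefficients.
    `studies` is a list of (r, n) pairs; each r is transformed to
    Fisher's z = atanh(r), weighted by its inverse variance n - 3,
    and the weighted mean z is transformed back with tanh."""
    num = sum((n - 3) * atanh(r) for r, n in studies)
    den = sum(n - 3 for r, n in studies)
    return tanh(num / den)

# Hypothetical test-retest correlations from three dog-personality
# studies with sample sizes 40, 120, and 60.
pooled = pool_correlations([(0.30, 40), (0.50, 120), (0.45, 60)])
```

The pooled value lands between the individual estimates, pulled toward the larger studies; a full meta-analysis like the one above would add a random-effects variance component and moderator tests.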
Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems
Directory of Open Access Journals (Sweden)
Goutsias John
2010-11-01
Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, the development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results: We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Conclusions: Our approach provides an attractive statistical methodology for
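The thermodynamic constraint at issue can be illustrated independently of the authors' Bayesian machinery: in a closed system, rate constants derived from a single free-energy landscape automatically satisfy the Wegscheider condition, i.e., the product of equilibrium constants around any reaction cycle equals one. A minimal sketch with hypothetical free energies:

```python
from math import exp

RT = 2.479  # kJ/mol at 298 K

# Hypothetical standard free energies (kJ/mol) of three species
# in a closed cycle A -> B -> C -> A.
G = {"A": 0.0, "B": -5.0, "C": 3.0}

def equilibrium_constant(x, y):
    """K_xy = k_forward / k_backward = exp(-(G_y - G_x) / RT).
    Deriving the ratio of rate constants from one free-energy
    landscape builds detailed balance in by construction."""
    return exp(-(G[y] - G[x]) / RT)

cycle = [("A", "B"), ("B", "C"), ("C", "A")]
product = 1.0
for x, y in cycle:
    product *= equilibrium_constant(x, y)
# Wegscheider condition: the product equals 1 around any closed cycle.
```

Rate constants fitted independently for each reaction will generally violate this identity, which is exactly the kind of thermodynamically infeasible estimate the paper's prior is designed to exclude.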
Measuring consistency of web page design and its effects on performance and satisfaction.
Ozok, A A; Salvendy, G
2000-04-01
This study examines the methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of the world-wide web (WWW) user. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study has tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects of which the features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency, physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contribute to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.
Energy Technology Data Exchange (ETDEWEB)
Osterwald, C.R.; Emery, K. [National Renewable Energy Lab., Golden, CO (United States); Anevsky, S. [All-Union Research Inst. for Optophysical Measurements, Moscow (Russian Federation)] and others
1996-05-01
This paper presents the results of an international intercomparison of photovoltaic (PV) performance measurements and calibrations. The intercomparison, which was organized and operated by a group of experts representing national laboratories from across the globe (i.e., the authors of this paper), was accomplished by circulating two sample sets. One set consisted of twenty silicon reference cells that would, hopefully, form the basis of an international PV reference scale. A qualification procedure applied to the calibration results gave average calibration numbers with an overall standard deviation of less than 2% for the entire set. The second set was assembled from a wide range of newer technologies that present unique problems for PV measurements. As might be expected, these results showed much larger differences among laboratories. Methods were then identified that should be used to measure such devices, along with problems to avoid.
Student Consistency and Implications for Feedback in Online Assessment Systems
Madhyastha, Tara M.; Tanimoto, Steven
2009-01-01
Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…
26 CFR 301.6224(c)-3 - Consistent settlements.
2010-04-01
... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Consistent settlements. 301.6224(c)-3 Section... settlements. (a) In general. If the Internal Revenue Service enters into a settlement agreement with any..., settlement terms consistent with those contained in the settlement agreement entered into. (b) Requirements...
A Preliminary Study toward Consistent Soil Moisture from AMSR2
Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.
2015-01-01
A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has provided Earth scientists with a consistent and continuous global
Consistency and Inconsistency in PhD Thesis Examination
Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy
2008-01-01
This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…
Delimiting Coefficient α from Internal Consistency and Unidimensionality
Sijtsma, Klaas
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, & Love (2015), in which they relate reliability, represented by coefficient α, to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that the concepts of internal consistency and…
Policy consistency and the achievement of Nigeria's foreign policy ...
African Journals Online (AJOL)
This study is an attempt to investigate the policy consistency of Nigeria's foreign policy and to understand the basis for this consistency, and also to see whether peacekeeping/peace-enforcement is a key instrument in the achievement of Nigeria's foreign policy goals. The objective of the study was to examine whether the ...
Decentralized Consistency Checking in Cross-organizational Workflows
Wombacher, Andreas
Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which
Consistency of a system of equations: What does that mean?
Still, Georg J.; Kern, Walter; Koelewijn, Jaap; Bomhoff, M.J.
2010-01-01
The concept of (structural) consistency, also called structural solvability, is an important basic tool for analyzing the structure of systems of equations. Our aim is to provide a sound and practically relevant meaning to this concept. The implications of consistency are expressed in terms of
Consistency of hand preference: predictions to intelligence and school achievement.
Kee, D W; Gottfried, A; Bathurst, K
1991-05-01
Gottfried and Bathurst (1983) reported that hand preference consistency measured over time during infancy and early childhood predicts intellectual precocity for females, but not for males. In the present study longitudinal assessments of children previously classified by Gottfried and Bathurst as consistent or nonconsistent in cross-time hand preference were conducted during middle childhood (ages 5 to 9). Findings show that (a) early measurement of hand preference consistency for females predicts school-age intellectual precocity, (b) the locus of the difference between consistent vs. nonconsistent females is in verbal intelligence, and (c) the precocity of the consistent females was also revealed on tests of school achievement, particularly tests of reading and mathematics.
Personality and Situation Predictors of Consistent Eating Patterns.
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K
2015-01-01
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Thermodynamically consistent coarse graining of biocatalysts beyond Michaelis–Menten
Wachtel, Artur; Rao, Riccardo; Esposito, Massimiliano
2018-04-01
Starting from the detailed catalytic mechanism of a biocatalyst we provide a coarse-graining procedure which, by construction, is thermodynamically consistent. This procedure provides stoichiometries, reaction fluxes (rate laws), and reaction forces (Gibbs energies of reaction) for the coarse-grained level. It can treat active transporters and molecular machines, and thus extends the applicability of ideas that originated in enzyme kinetics. Our results lay the foundations for systematic studies of the thermodynamics of large-scale biochemical reaction networks. Moreover, we identify the conditions under which a relation between one-way fluxes and forces holds at the coarse-grained level as it holds at the detailed level. In doing so, we clarify the speculations and broad claims made in the literature about such a general flux–force relation. As a further consequence we show that, in contrast to common belief, the second law of thermodynamics does not require the currents and the forces of biochemical reaction networks to be always aligned.
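The flux–force relation examined in this paper has, at the detailed level of an elementary reaction, the standard local form (our notation, a textbook statement rather than the paper's general coarse-grained result):

```latex
% One-way fluxes J^+ (forward) and J^- (backward) of an elementary
% reaction relate to its Gibbs energy of reaction:
\frac{J^{+}}{J^{-}} = \exp\!\left(-\frac{\Delta_r G}{RT}\right)
% so the net flux J = J^+ - J^- always has the sign of -\Delta_r G.
% The paper identifies the conditions under which this relation
% survives coarse graining of the catalytic mechanism.
```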
Consistency check of photon beam physical data after recommissioning process
International Nuclear Information System (INIS)
Kadman, B; Chawapun, N; Ua-apisitwong, S; Asakit, T; Chumpu, N; Rueansri, J
2016-01-01
In radiotherapy, the medical linear accelerator (linac) is the key system used for radiation treatment delivery. Although recommissioning after a major modification of the machine is recommended by AAPM TG-53, it might not be practical in radiotherapy centers with heavy workloads. The main purpose of this study was to compare photon beam physical data between the initial commissioning and the recommissioning of a 6 MV Elekta Precise linac. The parameters compared were the percentage depth dose (PDD) and beam profiles. The clinical commissioning test cases, following IAEA-TECDOC-1583, were planned on the REF 91230 IMRT Dose Verification Phantom with Philips' Pinnacle treatment planning system. The Delta4PT was used for dose distribution verification with a 90% passing criterion for the gamma index (3%/3 mm). Our results revealed that the PDDs and beam profiles agreed within the tolerance limits recommended by TRS-430. Most of the point doses and dose distribution verifications passed the acceptance criteria. This study showed the consistency of photon beam physical data after the recommissioning process. There was good agreement between the initial commissioning and the recommissioning within the tolerance limits, demonstrating that a full recommissioning might not be required. However, for complex treatment planning geometries, the initial data should be applied with great caution. (paper)
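The 3%/3 mm gamma criterion used above can be sketched for a one-dimensional dose profile. This is a simplified global-gamma implementation on matching discrete grids, not the Delta4 vendor algorithm:

```python
def gamma_pass_rate(ref, evl, spacing=1.0, dd=0.03, dta=3.0):
    """Global 1-D gamma analysis on matching uniform grids.
    ref, evl: dose samples; spacing: grid step in mm;
    dd: dose-difference criterion as a fraction of max ref dose;
    dta: distance-to-agreement criterion in mm.
    Returns the fraction of reference points with gamma <= 1."""
    dmax = max(ref)
    passed = 0
    for i, dr in enumerate(ref):
        gamma2 = min(
            ((j - i) * spacing / dta) ** 2 + ((de - dr) / (dd * dmax)) ** 2
            for j, de in enumerate(evl)
        )
        passed += gamma2 <= 1.0
    return passed / len(ref)

# A toy wedge-like profile; an evaluation shifted by one grid point
# (1 mm) stays within a 3%/3 mm criterion almost everywhere.
profile = [10, 20, 40, 70, 90, 100, 90, 70, 40, 20]
shifted = profile[1:] + [10]
identical_rate = gamma_pass_rate(profile, profile)
shifted_rate = gamma_pass_rate(profile, shifted)
```

In this toy case only the edge point fails (the shifted profile has no matching neighbour there), which is one reason clinical protocols accept a pass rate such as the 90% used above rather than demanding 100%.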
Individual consistency and flexibility in human social information use.
Toelch, Ulf; Bruce, Matthew J; Newson, Lesley; Richerson, Peter J; Reader, Simon M
2014-02-07
Copying others appears to be a cost-effective way of obtaining adaptive information, particularly when flexibly employed. However, adult humans differ considerably in their propensity to use information from others, even when this 'social information' is beneficial, raising the possibility that stable individual differences constrain flexibility in social information use. We used two dissimilar decision-making computer games to investigate whether individuals flexibly adjusted their use of social information to current conditions or whether they valued social information similarly in both games. Participants also completed established personality questionnaires. We found that participants demonstrated considerable flexibility, adjusting social information use to current conditions. In particular, individuals employed a 'copy-when-uncertain' social learning strategy, supporting a core, but untested, assumption of influential theoretical models of cultural transmission. Moreover, participants adjusted the amount invested in their decision based on the perceived reliability of personally gathered information combined with the available social information. However, despite this strategic flexibility, participants also exhibited consistent individual differences in their propensities to use and value social information. Moreover, individuals who favoured social information self-reported as more collectivist than others. We discuss the implications of our results for social information use and cultural transmission.
MAP estimators and their consistency in Bayesian nonparametric inverse problems
Dashti, M.
2013-09-01
We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.
Globfit: Consistently fitting primitives by discovering global relations
Li, Yangyan; Wu, Xiaokun; Chrysathou, Yiorgos; Sharf, Andrei Sharf; Cohen-Or, Daniel; Mitra, Niloy J.
2011-01-01
Given a noisy and incomplete point set, we introduce a method that simultaneously recovers a set of locally fitted primitives along with their global mutual relations. We operate under the assumption that the data corresponds to a man-made engineering object consisting of basic primitives, possibly repeated and globally aligned under common relations. We introduce an algorithm to directly couple the local and global aspects of the problem. The local fit of the model is determined by how well the inferred model agrees with the observed data, while the global relations are iteratively learned and enforced through a constrained optimization. Starting with a set of initial RANSAC-based locally fitted primitives, relations across the primitives such as orientation, placement, and equality are progressively learned and conformed to. In each stage, a set of feasible relations is extracted among the candidate relations and then aligned to, while best fitting the input data. The global coupling corrects the primitives obtained in the local RANSAC stage and brings them into precise global alignment. We test the robustness of our algorithm on a range of synthesized and scanned data, with varying amounts of noise, outliers, and non-uniform sampling, and validate the results against ground truth, where available. © 2011 ACM.
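The two-stage idea above (independent local fits, then a discovered global relation conformed to) can be illustrated on a toy example. The shared-slope refit below is a minimal stand-in for the paper's constrained optimization, with ordinary least squares standing in for RANSAC; all data and thresholds are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Local stage: two noisy point sets sampled from nearly parallel lines,
# each fitted independently (a stand-in for per-primitive RANSAC fits).
x = np.linspace(0.0, 1.0, 50)
y1 = 2.00 * x + 1.0 + rng.normal(0.0, 0.01, x.size)
y2 = 2.02 * x - 0.5 + rng.normal(0.0, 0.01, x.size)
s1, _ = np.polyfit(x, y1, 1)
s2, _ = np.polyfit(x, y2, 1)

# Global stage: the slopes nearly agree, so extract a parallelism relation
# and refit both primitives jointly with a single shared slope while still
# best fitting the input data (a one-relation constrained optimization).
assert abs(s1 - s2) < 0.1
xc = x - x.mean()
s = (xc @ (y1 - y1.mean()) + xc @ (y2 - y2.mean())) / (2.0 * (xc @ xc))
b1 = y1.mean() - s * x.mean()
b2 = y2.mean() - s * x.mean()
print(round(s, 2), round(b1, 2), round(b2, 2))
```

The joint refit corrects both local slopes toward a common value, which is the one-dimensional analogue of bringing locally fitted primitives into precise global alignment.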
MAP estimators and their consistency in Bayesian nonparametric inverse problems
International Nuclear Information System (INIS)
Dashti, M; Law, K J H; Stuart, A M; Voss, J
2013-01-01
We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ 0 . We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μ y . Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager–Machlup functional defined on the Cameron–Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier–Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. (paper)
A self-consistent upward leader propagation model
International Nuclear Information System (INIS)
Becerra, Marley; Cooray, Vernon
2006-01-01
Knowledge of the initiation and propagation of an upward-moving connecting leader in the presence of a downward-moving lightning stepped leader is essential for determining the lateral attraction distance of a lightning flash to any grounded structure. Even though different models that simulate this phenomenon are available in the literature, they do not take into account the latest developments in the physics of leader discharges. The leader model proposed here simulates the advancement of positive upward leaders by appealing to the presently understood physics of that process. The model properly simulates the continuous upward progression of positive connecting leaders from their inception to the final connection with the downward stepped leader (the final jump). Thus, the main physical properties of upward leaders, namely the charge per unit length, the injected current, the channel gradient and the leader velocity, are obtained self-consistently. The results are compared with an altitude-triggered lightning experiment, and there is good agreement between the model predictions and both the measured leader current and the experimentally inferred spatial and temporal location of the final jump. It is also found that the usual assumption of constant charge per unit length, based on laboratory experiments, is not valid for lightning upward connecting leaders.
Self-Consistent Study of Conjugated Aromatic Molecular Transistors
International Nuclear Information System (INIS)
Jing, Wang; Yun-Ye, Liang; Hao, Chen; Peng, Wang; Note, R.; Mizuseki, H.; Kawazoe, Y.
2010-01-01
We study the current through conjugated aromatic molecular transistors modulated by a transverse field. The self-consistent calculation is realized with density functional theory through the standard quantum chemistry software Gaussian03 and the non-equilibrium Green's function formalism. The calculated I-V curves controlled by the transverse field present the characteristics of different organic molecular transistors, whose transverse field effect is improved by substitutions of nitrogen or fluorine atoms. On the other hand, asymmetry of the molecular configuration with respect to the axis connecting the two sulfur atoms favors realizing the transverse field modulation. Suitably designed conjugated aromatic molecular transistors possess different I-V characteristics, some of which are similar to those of metal-oxide-semiconductor field-effect transistors (MOSFETs). Some of the calculated molecular devices may work as elements in graphene electronics. Our results present the richness and flexibility of molecular transistors, which describe the colorful prospect of next-generation devices. (condensed matter: electronic structure, electrical, magnetic, and optical properties)
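The field-modulated I-V behaviour described above can be caricatured with a simple Landauer-type calculation. The Lorentzian transmission function and level position below are invented placeholders, not the paper's DFT/NEGF results; the sketch only shows how shifting a level with a gate-like field modulates the current:

```python
import numpy as np

e = 1.602176634e-19   # elementary charge (C)
h = 6.62607015e-34    # Planck constant (J s)
kT = 0.025            # room-temperature thermal energy (eV)

def fermi(E, mu):
    return 1.0 / (1.0 + np.exp((E - mu) / kT))

def current(V, gate_shift=0.0):
    """Landauer current through one hypothetical molecular level with a
    Lorentzian transmission; the transverse (gate) field shifts the level."""
    E, dE = np.linspace(-2.0, 2.0, 4001, retstep=True)   # energy grid (eV)
    T = 0.04 / ((E - (0.4 + gate_shift)) ** 2 + 0.04)    # transmission T(E)
    window = fermi(E, V / 2.0) - fermi(E, -V / 2.0)      # bias window
    return (2.0 * e / h) * np.sum(T * window) * dE * e   # in amperes

# Shifting the level out of the bias window suppresses the current,
# i.e. the transverse field modulates the I-V curve.
print(current(1.0, 0.0) > current(1.0, 0.3))
```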
Consistent estimation of Gibbs energy using component contributions.
Directory of Open Access Journals (Sweden)
Elad Noor
Standard Gibbs energies of reactions are increasingly being used in metabolic modeling for applying thermodynamic constraints on reaction rates, metabolite concentrations and kinetic parameters. The increasing scope and diversity of metabolic models has led scientists to look for genome-scale solutions that can estimate the standard Gibbs energy of all the reactions in metabolism. Group contribution methods greatly increase coverage, albeit at the price of decreased precision. We present here a way to combine the estimations of group contribution with the more accurate reactant contributions by decomposing each reaction into two parts and applying one of the methods on each of them. This method gives priority to the reactant contributions over group contributions while guaranteeing that all estimations will be consistent, i.e. will not violate the first law of thermodynamics. We show that there is a significant increase in the accuracy of our estimations compared to standard group contribution. Specifically, our cross-validation results show an 80% reduction in the median absolute residual for reactions that can be derived by reactant contributions only. We provide the full framework and source code for deriving estimates of standard reaction Gibbs energy, as well as confidence intervals, and believe this will facilitate the wide use of thermodynamic data for a better understanding of metabolism.
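The priority rule described above, reactant contributions where available and group contributions otherwise, with additive per-compound energies guaranteeing first-law consistency, can be sketched in a few lines. All energies, compounds, and group decompositions below are invented for illustration:

```python
# Hypothetical formation energies (kJ/mol) from reactant contributions (RC),
# and fitted group energies for compounds RC cannot cover.
rc_energy = {"A": -120.0, "B": -95.0}          # accurately measured compounds
group_energy = {"g1": -30.0, "g2": -12.5}      # group contributions
groups = {"C": {"g1": 2, "g2": 1}}             # group decomposition of compound C

def formation_energy(compound):
    # Prefer the accurate reactant contribution; fall back to groups.
    if compound in rc_energy:
        return rc_energy[compound]
    return sum(n * group_energy[g] for g, n in groups[compound].items())

def reaction_energy(stoich):
    # A single additive energy per compound keeps every estimate consistent:
    # reaction energies summed around any cycle cancel (first law).
    return sum(n * formation_energy(c) for c, n in stoich.items())

# A + B -> C
print(reaction_energy({"A": -1, "B": -1, "C": 1}))  # -72.5 - (-215.0) = 142.5
```

Because every reaction energy is a difference of per-compound energies, no combination of estimated reactions can violate energy conservation, which is the consistency property the abstract emphasizes.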
G-Consistent Subsets and Reduced Dynamical Quantum Maps
Ceballos, Russell R.
A quantum system which evolves in time while interacting with an external environment is said to be an open quantum system (OQS), and the influence of the environment on the unperturbed unitary evolution of the system generally leads to non-unitary dynamics. This kind of open-system dynamical evolution has typically been modeled by a Standard Prescription (SP) which assumes that the state of the OQS is initially uncorrelated with the environment state. It is shown here that when a minimal set of physically motivated assumptions is adopted, not only do there exist constraints on the reduced dynamics of an OQS such that this SP does not always accurately describe the possible initial correlations existing between the OQS and the environment, but such initial correlations, and even entanglement, can be witnessed when a particular class of reduced state transformations, termed purity extractions, is observed. Furthermore, as part of a more fundamental investigation to better understand the minimal set of assumptions required to formulate well-defined reduced dynamical quantum maps, it is demonstrated that there exists a one-to-one correspondence between the set of initial reduced states and the set of admissible initial system-environment composite states when G-consistency is enforced. Given the discussions surrounding the requirement of complete positivity and the reliance on the SP, the results presented here may well be found valuable for determining the basic properties of reduced dynamical maps, and when restrictions on the OQS dynamics naturally emerge.
Large scale Bayesian nuclear data evaluation with consistent model defects
International Nuclear Information System (INIS)
Schnabel, G
2015-01-01
The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, the neglect of this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows one to explicitly estimate the magnitude of the model deficiency. Both features are missing in available evaluation methods so far. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the
Evaluating the hydrological consistency of satellite based water cycle components
Lopez Valencia, Oliver Miguel
2016-06-15
Advances in multi-satellite based observations of the earth system have provided the capacity to retrieve information across a wide-range of land surface hydrological components and provided an opportunity to characterize terrestrial processes from a completely new perspective. Given the spatial advantage that space-based observations offer, several regional-to-global scale products have been developed, offering insights into the multi-scale behaviour and variability of hydrological states and fluxes. However, one of the key challenges in the use of satellite-based products is characterizing the degree to which they provide realistic and representative estimates of the underlying retrieval: that is, how accurate are the hydrological components derived from satellite observations? The challenge is intrinsically linked to issues of scale, since the availability of high-quality in-situ data is limited, and even where it does exist, is generally not commensurate to the resolution of the satellite observation. Basin-scale studies have shown considerable variability in achieving water budget closure with any degree of accuracy using satellite estimates of the water cycle. In order to assess the suitability of this type of approach for evaluating hydrological observations, it makes sense to first test it over environments with restricted hydrological inputs, before applying it to more hydrological complex basins. Here we explore the concept of hydrological consistency, i.e. the physical considerations that the water budget impose on the hydrologic fluxes and states to be temporally and spatially linked, to evaluate the reproduction of a set of large-scale evaporation (E) products by using a combination of satellite rainfall (P) and Gravity Recovery and Climate Experiment (GRACE) observations of storage change, focusing on arid and semi-arid environments, where the hydrological flows can be more realistically described. Our results indicate no persistent hydrological
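The water-budget closure idea underlying the hydrological-consistency check can be sketched in a few lines; the monthly numbers below are invented placeholders, not actual satellite retrievals:

```python
import numpy as np

# Hypothetical monthly basin averages (mm/month) for an arid basin.
P  = np.array([10.0, 5.0, 2.0, 0.0])    # satellite rainfall
E  = np.array([8.0, 6.0, 3.0, 1.0])     # evaporation product under evaluation
dS = np.array([1.0, -2.0, -1.5, -0.8])  # GRACE storage change

# Hydrological consistency check: in arid basins where runoff is negligible,
# the budget P - E - dS should close to within observational error, so a
# persistent residual points at bias in one of the component products.
residual = P - E - dS
print(np.round(residual, 1))
```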
Facial Mimicry and Emotion Consistency: Influences of Memory and Context.
Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P
2015-01-01
This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.
Consistent realization of Celestial and Terrestrial Reference Frames
Kwak, Younghee; Bloßfeld, Mathis; Schmid, Ralf; Angermann, Detlef; Gerstl, Michael; Seitz, Manuela
2018-03-01
The Celestial Reference System (CRS) is currently realized only by Very Long Baseline Interferometry (VLBI) because it is the space geodetic technique that enables observations in that frame. In contrast, the Terrestrial Reference System (TRS) is realized by means of the combination of four space geodetic techniques: Global Navigation Satellite System (GNSS), VLBI, Satellite Laser Ranging (SLR), and Doppler Orbitography and Radiopositioning Integrated by Satellite. The Earth orientation parameters (EOP) are the link between the two types of systems, CRS and TRS. The EOP series of the International Earth Rotation and Reference Systems Service were combined from specifically selected series from various analysis centers. Other EOP series were generated by a simultaneous estimation together with the TRF while the CRF was fixed. Those computation approaches entail inherent inconsistencies between TRF, EOP, and CRF, also because the input data sets are different. A combined normal equation (NEQ) system, which consists of all the parameters, i.e., TRF, EOP, and CRF, would overcome such an inconsistency. In this paper, we simultaneously estimate TRF, EOP, and CRF from an inter-technique combined NEQ using the latest GNSS, VLBI, and SLR data (2005-2015). The results show that the selection of local ties is most critical to the TRF. The combination of pole coordinates is beneficial for the CRF, whereas the combination of ΔUT1 results in clear rotations of the estimated CRF. However, the standard deviations of the EOP and the CRF improve with the inter-technique combination, which indicates the benefits of a common estimation of all parameters. It became evident that the common determination of TRF, EOP, and CRF systematically influences future ICRF computations at the level of several μas. Moreover, the CRF is influenced by up to 50 μas if the station coordinates and EOP are dominated by the satellite techniques.
Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.
Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H
2016-01-01
To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data were collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included the external supports of parental closeness and a relationship with a caring nonparental adult, and the internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.
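The moderation analysis described above, a risk-by-protective-factor interaction term in a multivariable logistic regression, can be sketched on simulated data; the variables, effect sizes, and sample size are invented, and the simulation builds in no true interaction, mirroring the "largely independent influences" finding:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
risk = rng.normal(size=n)   # risk indicator (e.g. depression symptoms, standardised)
prot = rng.normal(size=n)   # protective factor (e.g. life satisfaction, standardised)

# Simulated outcome: risk lowers and protection raises the odds of
# consistent use, with no true interaction between the two.
p = 1.0 / (1.0 + np.exp(-(0.3 - 0.5 * risk + 0.4 * prot)))
y = rng.binomial(1, p)

# Moderation test: fit a logistic model including the risk x protection term
# by Newton-Raphson; a near-zero interaction coefficient means no moderation.
X = np.column_stack([np.ones(n), risk, prot, risk * prot])
beta = np.zeros(4)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    W = mu * (1.0 - mu)
    beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - mu))
print(beta.round(2))
```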
The Consistent Preferences Approach to Deductive Reasoning in Games
Asheim, Geir B
2006-01-01
"The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif
‘And God gave Solomon wisdom’: Proficiency in ornithomancy
Directory of Open Access Journals (Sweden)
Abraham O. Shemesh
2018-04-01
The biblical text accords a great deal of attention to King Solomon’s personal abilities and governmental power. Solomon is described as a judge, poet, builder and the wisest of all people in the Ancient Near East and Egypt. The current study discusses the interpretation of the midrashim that show how Solomon’s wisdom was manifested in his considerable knowledge of ornithomancy, that is, divination using birds, a practice considered an important wisdom in the ancient world because of its practical applications, particularly in the military sphere. Solomon’s portrayal as a magician seems intended to emphasise his abilities and his impressive character. Moreover, it may have had the purpose of disproving the conception of Solomon as inferior to his surroundings in this respect, and the idea that he or his kingdom could be controlled by nations that command this type of wisdom.
Cretaceous choristoderan reptiles gave birth to live young
Ji, Qiang; Wu, Xiao-Chun; Cheng, Yen-Nien
2010-04-01
Viviparity (giving birth to live young) in fossil reptiles has been known only in a few marine groups: ichthyosaurs, pachypleurosaurs, and mosasaurs. Here, we report a pregnant specimen of the Early Cretaceous Hyphalosaurus baitaigouensis, a species of Choristodera, a diapsid group known from unequivocal fossil remains from the Middle Jurassic to the early Miocene (about 165 to 20 million years ago). This specimen provides the first evidence of viviparity in choristoderan reptiles and is also the sole record of viviparity in fossil reptiles which lived in freshwater ecosystems. This exquisitely preserved specimen contains up to 18 embryos arranged in pairs. Size comparison with small free-living individuals and the straight posture of the posterior-most pair suggest that those embryos were at term and had probably reached parturition. The posterior-most embryo on the left side has the head positioned toward the rear, contrary to normal position, suggesting a complication that may have contributed to the mother’s death. Viviparity would certainly have freed species of Hyphalosaurus from the need to return to land to deposit eggs; taking this advantage, they would have avoided intense competition with contemporaneous terrestrial carnivores such as dinosaurs.
AWElectric : that gave me goosebumps, did you feel it too?
Neidlinger, K.; Truong, K.P.; Telfair, C.; Feijs, L.M.G.; Dertien, E.; Evers, V.
2017-01-01
Awe is a powerful, visceral sensation described as a sudden chill or shudder accompanied by goosebumps. People feel awe in the face of extraordinary experiences: The sublimity of nature, the beauty of art and music, the adrenaline rush of fear. Awe is healthy, both physically and mentally. It can be
International Nuclear Information System (INIS)
Lino, A.T.; Takahashi, E.K.; Leite, J.R.; Ferraz, A.C.
1988-01-01
The band structure of metallic sodium is calculated, using for the first time the self-consistent field variational cellular method. In order to implement the self-consistency in the variational cellular theory, the crystal electronic charge density was calculated within the muffin-tin approximation. The comparison between our results and those derived from other calculations leads to the conclusion that the proposed self-consistent version of the variational cellular method is fast and accurate. (author) [pt
On the consistent histories approach to quantum mechanics
International Nuclear Information System (INIS)
Dowker, F.; Kent, A.
1996-01-01
We review the consistent histories formulations of quantum mechanics developed by Griffiths, Omnes, Gell-Mann, and Hartle, and we describe the classification of consistent sets. We illustrate some general features of consistent sets with a few lemmas and examples. We also consider various interpretations of the formalism, and we examine the new problems which arise in reconstructing the past and predicting the future. It is shown that Omnes' characterization of true statements (statements that can be deduced unconditionally in his interpretation) is incorrect. We examine critically Gell-Mann and Hartle's interpretation of the formalism, in particular their discussions of communication, prediction, and retrodiction, and we conclude that their explanation of the apparent persistence of quasiclassicality relies on assumptions about an as-yet-unknown theory of experience. Our overall conclusion is that the consistent histories approach illustrates the need to supplement quantum mechanics with some selection principle in order to produce a fundamental theory capable of unconditional predictions.
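The consistency condition at the heart of this formalism, the vanishing of the off-diagonal terms of the decoherence functional D(a, b) = Tr[C_b ρ C_a†], can be checked numerically for a toy set of two-time qubit histories; the state, unitary, and projectors below are illustrative choices, not drawn from the reviewed papers:

```python
import numpy as np

# Two-time qubit histories: project onto |0> or |1> at t1 and t2, with a
# rotation U in between; class operators C_a = P[a2] U P[a1].
rho = 0.5 * np.eye(2)                           # maximally mixed initial state
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]  # projectors onto |0>, |1>
th = np.pi / 4.0
U = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]])

def D(a, b):
    """Decoherence functional D(a, b) = Tr[C_b rho C_a^dagger]."""
    Ca = P[a[1]] @ U @ P[a[0]]
    Cb = P[b[1]] @ U @ P[b[0]]
    return np.trace(Cb @ rho @ Ca.conj().T)

histories = [(i, j) for i in range(2) for j in range(2)]
# Consistency requires all off-diagonal (interference) terms to vanish.
off = max(abs(D(a, b)) for a in histories for b in histories if a != b)
print(off)  # 0.0 -> this history set is consistent
```

With a pure superposition initial state in place of the mixed ρ, the first-time interference terms would generically be nonzero and the same set of histories would fail the consistency condition.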
Consistency of Trend Break Point Estimator with Underspecified Break Number
Directory of Open Access Journals (Sweden)
Jingjing Yang
2017-01-01
This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
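The inconsistency discussed above can be reproduced in a small simulation: fit a single-break trend model to data containing two similar, same-sign trend shifts and observe where the break estimate lands. All magnitudes, dates, and sample sizes below are invented:

```python
import numpy as np

def fit_one_break(y):
    """Least-squares single trend-break estimator: grid over break dates."""
    T = len(y)
    t = np.arange(T, dtype=float)
    best, best_ssr = None, np.inf
    for tb in range(2, T - 2):
        # Regressors: intercept, trend, and trend shift max(t - tb, 0).
        X = np.column_stack([np.ones(T), t, np.maximum(t - tb, 0.0)])
        beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
        ssr = np.sum((y - X @ beta) ** 2)
        if ssr < best_ssr:
            best, best_ssr = tb, ssr
    return best

rng = np.random.default_rng(0)
T = 400
t = np.arange(T, dtype=float)
# Two positive trend shifts of similar magnitude at T/3 and 2T/3,
# the configuration for which the inconsistency is worst.
y = 0.05 * t + 0.05 * np.maximum(t - T // 3, 0.0) + 0.05 * np.maximum(t - 2 * T // 3, 0.0)
y += rng.normal(0.0, 1.0, T)

tb_hat = fit_one_break(y)
print(tb_hat / T)  # tends to land between the true break fractions 1/3 and 2/3
```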
Liking for Evaluators: Consistency and Self-Esteem Theories
Regan, Judith Weiner
1976-01-01
Consistency and self-esteem theories make contrasting predictions about the relationship between a person's self-evaluation and his liking for an evaluator. Laboratory experiments confirmed predictions derived from these theories. (Editor/RK)
Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering
Sicat, Ronell Barrera; Kruger, Jens; Moller, Torsten; Hadwiger, Markus
2014-01-01
This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined
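The core idea, storing a per-voxel pdf of the covered neighbourhood instead of a single averaged value, can be sketched with histograms over a shared value binning; the data and bin count below are illustrative:

```python
import numpy as np

def to_hist(values, bins):
    """Normalised histogram: a sparse pdf of a voxel neighbourhood."""
    h, _ = np.histogram(values, bins=bins)
    return h / h.sum()

bins = np.linspace(0.0, 1.0, 9)           # 8 shared value bins
block = np.array([0.1, 0.12, 0.9, 0.88])  # a 2x2 fine-voxel neighbourhood

coarse_pdf = to_hist(block, bins)         # mixture pdf of the neighbourhood
mean_value = block.mean()                 # what plain averaging would store

# The pdf keeps the bimodal structure (two material classes) that a
# single averaged value destroys, which is what makes multi-resolution
# rendering consistent across levels.
print(coarse_pdf.round(2), mean_value)
```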
Structures, profile consistency, and transport scaling in electrostatic convection
DEFF Research Database (Denmark)
Bian, N.H.; Garcia, O.E.
2005-01-01
Two mechanisms at the origin of profile consistency in models of electrostatic turbulence in magnetized plasmas are considered. One involves turbulent diffusion in collisionless plasmas and the subsequent turbulent equipartition of Lagrangian invariants. By the very nature of its definition...
15 CFR 930.36 - Consistency determinations for proposed activities.
2010-01-01
... necessity of issuing separate consistency determinations for each incremental action controlled by the major... plans), and that affect any coastal use or resource of more than one State. Many States share common...
Decentralized Consistent Network Updates in SDN with ez-Segway
Nguyen, Thanh Dang; Chiesa, Marco; Canini, Marco
2017-01-01
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes
The utility of theory of planned behavior in predicting consistent ...
African Journals Online (AJOL)
admin
disease. Objective: To examine the utility of theory of planned behavior in predicting consistent condom use intention of HIV .... (24-25), making subjective norms as better predictors of intention ..... Organizational Behavior and Human Decision.
Island of Stability for Consistent Deformations of Einstein's Gravity
DEFF Research Database (Denmark)
Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan
2012-01-01
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...
Self-consistent normal ordering of gauge field theories
International Nuclear Information System (INIS)
Ruehl, W.
1987-01-01
Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs
Consistency of the least weighted squares under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Vol. 2011, No. 47 (2011), pp. 179-206. ISSN 0023-5954. Grant - others: GA UK(CZ) GA402/09/055. Institutional research plan: CEZ:AV0Z10750506. Keywords: Regression; Consistency; The least weighted squares; Heteroscedasticity. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.454, year: 2011. http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf
Self-consistency corrections in effective-interaction calculations
International Nuclear Information System (INIS)
Starkand, Y.; Kirson, M.W.
1975-01-01
Large-matrix extended-shell-model calculations are used to compute self-consistency corrections to the effective interaction and to the linked-cluster effective interaction. The corrections are found to be numerically significant and to affect the rate of convergence of the corresponding perturbation series. The influence of various partial corrections is tested. It is concluded that self-consistency is an important effect in determining the effective interaction and improving the rate of convergence. (author)
Parquet equations for numerical self-consistent-field theory
International Nuclear Information System (INIS)
Bickers, N.E.
1991-01-01
In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs
Consistent Estimation of Pricing Kernels from Noisy Price Data
Vladislav Kargin
2003-01-01
If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.
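As a sketch of the constrained least squares idea (a toy setup with made-up payoffs and state probabilities, not the paper's estimator), the non-negativity of the pricing kernel can be imposed directly via non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
S, N = 5, 40                   # states, assets (hypothetical sizes)
probs = np.full(S, 1.0 / S)    # state probabilities (assumed known)
payoffs = rng.uniform(0.5, 1.5, size=(N, S))
m_true = np.array([1.4, 1.2, 1.0, 0.8, 0.6])  # illustrative pricing kernel

# Observed noisy prices p_i = E[m * x_i] + noise = sum_s pi_s x_i(s) m(s) + eps
A = payoffs * probs            # A[i, s] = pi_s * x_i(s)
p = A @ m_true + rng.normal(0, 0.01, N)

# Constrained least squares: minimize ||A m - p|| subject to m >= 0
m_hat, resid = nnls(A, p)
print(np.round(m_hat, 2))
```

The non-negativity constraint is what makes the inverse problem well-posed in the sense the abstract describes; without it, small price noise can produce wildly oscillating kernel estimates.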
Measuring consistency of autobiographical memory recall in depression.
LENUS (Irish Health Repository)
Semkovska, Maria
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.
Measuring consistency of autobiographical memory recall in depression.
Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Are prescription drug insurance choices consistent with expected utility theory?
Bundorf, M Kate; Mata, Rui; Schoenbaum, Michael; Bhattacharya, Jay
2013-09-01
To determine the extent to which people make choices inconsistent with expected utility theory when choosing among prescription drug insurance plans and whether tabular or graphical presentation format influences the consistency of their choices. Members of an Internet-enabled panel chose between two Medicare prescription drug plans. The "low variance" plan required higher out-of-pocket payments for the drugs respondents usually took but lower out-of-pocket payments for the drugs they might need if they developed a new health condition than the "high variance" plan. The probability of a change in health varied within subjects and the presentation format (text vs. graphical) and the affective salience of the clinical condition (abstract vs. risk related to specific clinical condition) varied between subjects. Respondents were classified based on whether they consistently chose either the low or high variance plan. Logistic regression models were estimated to examine the relationship between decision outcomes and task characteristics. The majority of respondents consistently chose either the low or high variance plan, consistent with expected utility theory. Half of respondents consistently chose the low variance plan. Respondents were less likely to make discrepant choices when information was presented in graphical format. Many people, although not all, make choices consistent with expected utility theory when they have information on differences among plans in the variance of out-of-pocket spending. Medicare beneficiaries would benefit from information on the extent to which prescription drug plans provide risk protection. PsycINFO Database Record (c) 2013 APA, all rights reserved.
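The trade-off the respondents faced can be made concrete with a toy expected-utility calculation (all dollar amounts, probabilities, and the CRRA utility choice are hypothetical illustrations, not the study's data):

```python
# Expected-utility comparison of a "low variance" vs "high variance"
# drug plan under CRRA utility of residual wealth.
def crra(w, gamma=3.0):
    """Constant relative risk aversion utility."""
    return w ** (1 - gamma) / (1 - gamma)

wealth = 10_000.0
p_new_condition = 0.2
# out-of-pocket costs: (usual drugs, drugs if a new condition develops)
low_var = (1200.0, 1500.0)    # higher routine cost, capped downside
high_var = (800.0, 4000.0)    # cheaper routinely, expensive if unlucky

def expected_utility(plan, p):
    usual, bad = plan
    return (1 - p) * crra(wealth - usual) + p * crra(wealth - bad)

eu_low = expected_utility(low_var, p_new_condition)
eu_high = expected_utility(high_var, p_new_condition)
better = "low" if eu_low > eu_high else "high"
print(better)  # → low
```

With these numbers a risk-averse expected-utility maximizer prefers the low variance plan; respondents who consistently pick one plan as the health-change probability varies are behaving as this calculation predicts.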
Consistent Code Qualification Process and Application to WWER-1000 NPP
International Nuclear Information System (INIS)
Berthon, A.; Petruzzi, A.; Giannotti, W.; D'Auria, F.; Reventos, F.
2006-01-01
Calculation analyses by application of system codes are performed to evaluate NPP or facility behavior during a postulated transient, or to evaluate code capability. A calculation analysis is a process that involves the code itself, the data of the reference plant, the data about the transient, the nodalization, and the user. All these elements affect one another and affect the results. A major issue in the use of mathematical models is the model's capability to reproduce plant or facility behavior under steady-state and transient conditions. These aspects constitute two main checks that must be satisfied during the qualification process: the first is related to the realization of a scheme of the reference plant; the second to the capability to reproduce the transient behavior. The aim of this paper is to describe the UMAE (Uncertainty Method based on Accuracy Extrapolation) methodology developed at the University of Pisa for qualifying a nodalization and analysing the calculated results, and to perform the uncertainty evaluation of the system code by the CIAU code (Code with the capability of Internal Assessment of Uncertainty). The activity consists of the re-analysis of Experiment BL-44 (SBLOCA) performed in the LOBI facility and the analysis of a Kv-scaling calculation of the WWER-1000 NPP nodalization taking as reference the test BL-44. Relap5/Mod3.3 has been used as the thermal-hydraulic system code, and the standard procedure adopted at the University of Pisa has been applied to show the capability of the code to predict the significant aspects of the transient and to obtain a qualified nodalization of the WWER-1000 through a systematic qualitative and quantitative accuracy evaluation. The qualitative accuracy evaluation is based on the selection of Relevant Thermal-hydraulic Aspects (RTAs) and is a prerequisite to the application of the Fast Fourier Transform Based Method (FFTBM) which quantifies
Consistency of Vegetation Index Seasonality Across the Amazon Rainforest
Maeda, Eduardo Eiji; Moura, Yhasmin Mendes; Wagner, Fabien; Hilker, Thomas; Lyapustin, Alexei I.; Wang, Yujie; Chave, Jerome; Mottus, Matti; Aragao, Luiz E.O.C.; Shimabukuro, Yosio
2016-01-01
Vegetation indices (VIs) calculated from remotely sensed reflectance are widely used tools for characterizing the extent and status of vegetated areas. Recently, however, their capability to monitor the Amazon forest phenology has been intensely scrutinized. In this study, we analyze the consistency of VIs seasonal patterns obtained from two MODIS products: the Collection 5 BRDF product (MCD43) and the Multi-Angle Implementation of Atmospheric Correction algorithm (MAIAC). The spatio-temporal patterns of the VIs were also compared with field measured leaf litterfall, gross ecosystem productivity and active microwave data. Our results show that significant seasonal patterns are observed in all VIs after the removal of view-illumination effects and cloud contamination. However, we demonstrate inconsistencies in the characteristics of seasonal patterns between different VIs and MODIS products. We demonstrate that differences in the original reflectance band values form a major source of discrepancy between MODIS VI products. The MAIAC atmospheric correction algorithm significantly reduces noise signals in the red and blue bands. Another important source of discrepancy is caused by differences in the availability of clear-sky data, as the MAIAC product allows increased availability of valid pixels in the equatorial Amazon. Finally, differences in VIs seasonal patterns were also caused by MODIS collection 5 calibration degradation. The correlation of remote sensing and field data also varied spatially, leading to different temporal offsets between VIs, active microwave and field measured data. We conclude that recent improvements in the MAIAC product have led to changes in the characteristics of spatio-temporal patterns of VIs seasonality across the Amazon forest, when compared to the MCD43 product. Nevertheless, despite improved quality and reduced uncertainties in the MAIAC product, a robust biophysical interpretation of VIs seasonality is still missing.
Accuracy and Consistency of Respiratory Gating in Abdominal Cancer Patients
International Nuclear Information System (INIS)
Ge, Jiajia; Santanam, Lakshmi; Yang, Deshan; Parikh, Parag J.
2013-01-01
Purpose: To evaluate respiratory gating accuracy and intrafractional consistency for abdominal cancer patients treated with respiratory gated treatment on a regular linear accelerator system. Methods and Materials: Twelve abdominal patients implanted with fiducials were treated with amplitude-based respiratory-gated radiation therapy. On the basis of daily orthogonal fluoroscopy, the operator readjusted the couch position and gating window such that the fiducial was within a setup margin (fiducial-planning target volume [f-PTV]) when RPM indicated “beam-ON.” Fifty-five pre- and post-treatment fluoroscopic movie pairs with synchronized respiratory gating signal were recorded. Fiducial motion traces were extracted from the fluoroscopic movies using a template matching algorithm and correlated with f-PTV by registering the digitally reconstructed radiographs with the fluoroscopic movies. Treatment was determined to be “accurate” if 50% of the fiducial area stayed within f-PTV while beam-ON. For movie pairs that lost gating accuracy, a MATLAB program was used to assess whether the gating window was optimized, the external-internal correlation (EIC) changed, or the patient moved between movies. A series of safety margins from 0.5 mm to 3 mm was added to f-PTV for reassessing gating accuracy. Results: A decrease in gating accuracy was observed in 44% of movie pairs from daily fluoroscopic movies of 12 abdominal patients. Three main causes for inaccurate gating were identified as change of global EIC over time (∼43%), suboptimal gating setup (∼37%), and imperfect EIC within movie (∼13%). Conclusions: Inconsistent respiratory gating accuracy may occur within 1 treatment session even with a daily adjusted gating window. To improve or maintain gating accuracy during treatment, we suggest using at least a 2.5-mm safety margin to account for gating and setup uncertainties.
Consistency of vegetation index seasonality across the Amazon rainforest
Maeda, Eduardo Eiji; Moura, Yhasmin Mendes; Wagner, Fabien; Hilker, Thomas; Lyapustin, Alexei I.; Wang, Yujie; Chave, Jérôme; Mõttus, Matti; Aragão, Luiz E. O. C.; Shimabukuro, Yosio
2016-10-01
Vegetation indices (VIs) calculated from remotely sensed reflectance are widely used tools for characterizing the extent and status of vegetated areas. Recently, however, their capability to monitor the Amazon forest phenology has been intensely scrutinized. In this study, we analyze the consistency of VIs seasonal patterns obtained from two MODIS products: the Collection 5 BRDF product (MCD43) and the Multi-Angle Implementation of Atmospheric Correction algorithm (MAIAC). The spatio-temporal patterns of the VIs were also compared with field measured leaf litterfall, gross ecosystem productivity and active microwave data. Our results show that significant seasonal patterns are observed in all VIs after the removal of view-illumination effects and cloud contamination. However, we demonstrate inconsistencies in the characteristics of seasonal patterns between different VIs and MODIS products. We demonstrate that differences in the original reflectance band values form a major source of discrepancy between MODIS VI products. The MAIAC atmospheric correction algorithm significantly reduces noise signals in the red and blue bands. Another important source of discrepancy is caused by differences in the availability of clear-sky data, as the MAIAC product allows increased availability of valid pixels in the equatorial Amazon. Finally, differences in VIs seasonal patterns were also caused by MODIS collection 5 calibration degradation. The correlation of remote sensing and field data also varied spatially, leading to different temporal offsets between VIs, active microwave and field measured data. We conclude that recent improvements in the MAIAC product have led to changes in the characteristics of spatio-temporal patterns of VIs seasonality across the Amazon forest, when compared to the MCD43 product. Nevertheless, despite improved quality and reduced uncertainties in the MAIAC product, a robust biophysical interpretation of VIs seasonality is still missing.
Miner, Claire Usher; Osborne, W. Larry; Jaeger, Richard M.
1997-01-01
Uses regression analysis on career development measures to examine whether career maturity indicators are predictive of interest consistency, differentiation, and score elevation. Results indicate that interest consistency and score elevation were weakly predicted by the measure; no relationship existed between the attitudinal and cognitive…
A self-consistent semiclassical sum rule approach to the average properties of giant resonances
International Nuclear Information System (INIS)
Li Guoqiang; Xu Gongou
1990-01-01
The average energies of isovector giant resonances and the widths of isoscalar giant resonances are evaluated with the help of a self-consistent semiclassical sum rule approach. The comparison of the present results with the experimental ones justifies the self-consistent semiclassical sum rule approach to the average properties of giant resonances.
Self-consistent Modeling of Elastic Anisotropy in Shale
Kanitpanyacharoen, W.; Wenk, H.; Matthies, S.; Vasin, R.
2012-12-01
Elastic anisotropy in clay-rich sedimentary rocks has increasingly received attention because of its significance for prospecting of petroleum deposits, as well as seals in the context of nuclear waste and CO2 sequestration. The orientation of component minerals and pores/fractures is a critical factor that influences elastic anisotropy. In this study, we investigate lattice and shape preferred orientation (LPO and SPO) of three shales from the North Sea in UK, the Qusaiba Formation in Saudi Arabia, and the Officer Basin in Australia (referred to as N1, Qu3, and L1905, respectively) to calculate elastic properties and compare them with experimental results. Synchrotron hard X-ray diffraction and microtomography experiments were performed to quantify LPO, weight proportions, and three-dimensional SPO of constituent minerals and pores. Our preliminary results show that the degree of LPO and total amount of clays are highest in Qu3 (3.3-6.5 m.r.d and 74vol%), moderately high in N1 (2.4-5.6 m.r.d. and 70vol%), and lowest in L1905 (2.3-2.5 m.r.d. and 42vol%). In addition, porosity in Qu3 is as low as 2% while it is up to 6% in L1905 and 8% in N1, respectively. Based on this information and single crystal elastic properties of mineral components, we apply a self-consistent averaging method to calculate macroscopic elastic properties and corresponding seismic velocities for different shales. The elastic model is then compared with measured acoustic velocities on the same samples. The P-wave velocities measured from Qu3 (4.1-5.3 km/s, 26.3%Ani.) are faster than those obtained from L1905 (3.9-4.7 km/s, 18.6%Ani.) and N1 (3.6-4.3 km/s, 17.7%Ani.). By making adjustments for pore structure (aspect ratio) and single crystal elastic properties of clay minerals, a good agreement between our calculation and the ultrasonic measurements is obtained.
A Study on the Consistency of Discretization Equation in Unsteady Heat Transfer Calculations
Directory of Open Access Journals (Sweden)
Wenhua Zhang
2013-01-01
Full Text Available The previous studies on the consistency of discretization equations mainly focused on the finite difference method, but several consistency problems remain far from solved in actual numerical computation. For instance, a consistency problem arises in cases where the boundary variables are solved explicitly while the variables away from the boundary are solved implicitly. And when the coefficients of the discretization equation in a nonlinear case are functions of the variables, calculating the coefficients explicitly and the variables implicitly can also give rise to consistency problems. The present paper therefore investigates the consistency problems involved in the explicit treatment of the second and third boundary conditions and in that of a thermal conductivity that is a function of temperature. The numerical results indicate that the consistency problem deserves more attention and should not be neglected in practical computation.
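A minimal 1-D sketch of the mixed treatment the paper studies (all material parameters and grid sizes below are made up): the interior nodes are advanced implicitly, while the convective (third-kind) boundary at x=0 is evaluated from the previous time level, i.e. lagged:

```python
import numpy as np

# 1D transient conduction: implicit interior, but the convective
# (third-kind) boundary at x=0 is updated explicitly from the old field.
N, L = 21, 1.0
dx = L / (N - 1)
alpha, k, h, T_inf = 1e-3, 1.0, 50.0, 100.0   # hypothetical properties
dt = 2.0
r = alpha * dt / dx**2

T = np.zeros(N)        # initial temperature field
T[-1] = 0.0            # fixed (first-kind) right boundary

# Implicit (backward Euler) matrix for the N-2 interior nodes
A = np.zeros((N - 2, N - 2))
np.fill_diagonal(A, 1 + 2 * r)
idx = np.arange(N - 3)
A[idx, idx + 1] = -r
A[idx + 1, idx] = -r

for step in range(200):
    # Explicit (lagged) boundary: discrete energy balance using old T[1];
    # this lag is the kind of treatment whose consistency the paper questions.
    T0_new = (k / dx * T[1] + h * T_inf) / (k / dx + h)
    b = T[1:-1].copy()
    b[0] += r * T0_new
    b[-1] += r * T[-1]
    T[1:-1] = np.linalg.solve(A, b)
    T[0] = T0_new

print(round(T[0], 2))
```

Because the boundary value lags one time level behind the implicitly advanced interior, the scheme's truncation error at the boundary differs from the interior's, which is precisely the sort of mismatch the paper flags as a consistency concern at large time steps.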
Selected Results from the ATLAS Experiment on its 25th Anniversary
Djama, Fares; The ATLAS collaboration
2018-01-01
The Lomonosov Conference and the ATLAS Collaboration celebrated their 25th anniversaries within a few weeks of each other. This gave us the opportunity to present a brief history of ATLAS and to discuss some of its most important results.
A consistent response spectrum analysis including the resonance range
International Nuclear Information System (INIS)
Schmitz, D.; Simmchen, A.
1983-01-01
The report provides a complete, consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA) which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because this method is today preferentially applied as a major tool for the earthquake proof of components in nuclear power plants. (orig./HP)
GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY
International Nuclear Information System (INIS)
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.
2013-01-01
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
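The kind of cross-snapshot consistency check described above can be caricatured in a few lines: predict each halo's next position ballistically from its current position and velocity, then flag links whose actual position disagrees beyond a tolerance (units, tolerances, and the error model are invented for illustration, not taken from the paper's algorithm):

```python
import numpy as np

def consistent_progenitors(pos0, vel0, pos1, dt, tol):
    """Flag halo links whose snapshot-1 position is consistent with
    ballistic extrapolation from snapshot 0 (x + v*dt), within tol."""
    pred = pos0 + vel0 * dt
    err = np.linalg.norm(pos1 - pred, axis=1)
    return err < tol

rng = np.random.default_rng(2)
n = 100
pos0 = rng.uniform(0, 100, (n, 3))       # toy positions
vel0 = rng.normal(0, 0.5, (n, 3))        # toy displacements per step
pos1 = pos0 + vel0 * 1.0 + rng.normal(0, 0.05, (n, 3))
pos1[:5] += 10.0                          # 5 spurious/mislinked halos
ok = consistent_progenitors(pos0, vel0, pos1, dt=1.0, tol=1.0)
print(ok.sum())
```

In the same spirit as the paper's method, inconsistent links signal either a spurious halo detection or a broken progenitor assignment; here exactly the five perturbed halos fail the check.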
SCALCE: boosting sequence compression algorithms using locally consistent encoding.
Hach, Faraz; Numanagic, Ibrahim; Alkan, Can; Sahinalp, S Cenk
2012-12-01
The high throughput sequencing (HTS) platforms generate unprecedented amounts of data that introduce challenges for the computational infrastructure. Data management, storage and analysis have become major logistical obstacles for those adopting the new platforms. The requirement for large investment for this purpose almost signalled the end of the Sequence Read Archive hosted at the National Center for Biotechnology Information (NCBI), which holds most of the sequence data generated worldwide. Currently, most HTS data are compressed through general purpose algorithms such as gzip. These algorithms are not designed for compressing data generated by the HTS platforms; for example, they do not take advantage of the specific nature of genomic sequence data, that is, limited alphabet size and high similarity among reads. Fast and efficient compression algorithms designed specifically for HTS data should be able to address some of the issues in data management, storage and communication. Such algorithms would also help with analysis provided they offer additional capabilities such as random access to any read and indexing for efficient sequence similarity search. Here we present SCALCE, a 'boosting' scheme based on the Locally Consistent Parsing technique, which reorganizes the reads in a way that results in a higher compression speed and compression rate, independent of the compression algorithm in use and without using a reference genome. Our tests indicate that SCALCE can improve the compression rate achieved through gzip by a factor of 4.19 when the goal is to compress the reads alone. In fact, on SCALCE reordered reads, gzip running time can improve by a factor of 15.06 on a standard PC with a single core and 6 GB memory. Interestingly, even the running time of SCALCE + gzip improves on that of gzip alone by a factor of 2.09. When compared with the recently published BEETL, which aims to sort the (inverted) reads in lexicographic order for improving bzip2, SCALCE + gzip
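The 'boosting' idea, reordering reads so that similar ones sit near each other before a generic compressor runs, can be caricatured as follows (the min-k-mer bucketing here is a toy stand-in for Locally Consistent Parsing, and the simulated reads are invented):

```python
import random
import zlib
from collections import defaultdict

def reorder_reads(reads, k=8):
    """Toy stand-in for SCALCE's reordering: bucket each read by its
    lexicographically smallest k-mer so that reads sharing a 'core'
    substring end up adjacent, then concatenate the sorted buckets."""
    buckets = defaultdict(list)
    for r in reads:
        core = min(r[i:i + k] for i in range(len(r) - k + 1))
        buckets[core].append(r)
    ordered = []
    for core in sorted(buckets):
        ordered.extend(sorted(buckets[core]))
    return ordered

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(20000))
# Simulated overlapping 50 bp reads, shuffled as a sequencer would emit them
reads = [genome[i:i + 50] for i in range(0, len(genome) - 50, 5)]
random.shuffle(reads)

raw = len(zlib.compress("".join(reads).encode(), 9))
boosted = len(zlib.compress("".join(reorder_reads(reads)).encode(), 9))
print(raw, boosted)  # reordering typically shrinks the second size
```

The reordering is lossless as a set permutation (read identity is preserved, only adjacency changes), which is why it can be bolted onto any downstream compressor, as the abstract emphasizes.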
Context-dependent individual behavioral consistency in Daphnia
DEFF Research Database (Denmark)
Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe
2017-01-01
The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals ..., whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming ... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots, but instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time ...
Consistent forcing scheme in the cascaded lattice Boltzmann method
Fei, Linlin; Luo, Kai Hong
2017-11-01
In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
Self-consistent approximations beyond the CPA: Part II
International Nuclear Information System (INIS)
Kaplan, T.; Gray, L.J.
1982-01-01
This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented space formalism for a binary alloy is sketched, and the notation to be used derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function, and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S_T exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Expansion to short-range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described
A consistent time frame for Chaucer's Canterbury Pilgrimage
Kummerer, K. R.
2001-08-01
A consistent time frame for the pilgrimage that Geoffrey Chaucer describes in The Canterbury Tales can be established if the seven celestial assertions related to the journey mentioned in the text can be reconciled with each other and the date of April 18 that is also mentioned. Past attempts to establish such a consistency for all seven celestial assertions have not been successful. The analysis herein, however, indicates that in The Canterbury Tales Chaucer accurately describes the celestial conditions he observed in the April sky above the London/Canterbury region of England in the latter half of the fourteenth century. All seven celestial assertions are in agreement with each other and consistent with the April 18 date. The actual words of Chaucer indicate that the Canterbury journey began during the 'seson' he defines in the General Prologue and ends under the light of the full Moon on the night of April 18, 1391.
An approach to a self-consistent nuclear energy system
International Nuclear Information System (INIS)
Fujii-e, Yoichi; Arie, Kazuo; Endo, Hiroshi
1992-01-01
A nuclear energy system should provide a stable supply of energy without endangering the environment or humans. If there is fear about exhausting world energy resources, accumulating radionuclides, and nuclear reactor safety, tension is created in human society. Nuclear energy systems of the future should be able to eliminate fear from people's minds. In other words, the whole system, including the nuclear fuel cycle, should be self-consistent. This is the ultimate goal of nuclear energy. If it can be realized, public acceptance of nuclear energy will increase significantly. In a self-consistent nuclear energy system, misunderstandings between experts on nuclear energy and the public should be minimized. The way to achieve this goal is to explain using simple logic. This paper proposes specific targets for self-consistent nuclear energy systems and shows that the fast breeder reactor (FBR) lies on the route to attaining the final goal
Consistent forcing scheme in the cascaded lattice Boltzmann method.
Fei, Linlin; Luo, Kai Hong
2017-11-01
In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM reduces to an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme are demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
Consistency and Reconciliation Model In Regional Development Planning
Directory of Open Access Journals (Sweden)
Dina Suryawati
2016-10-01
The aim of this study was to identify the problems in, and determine a conceptual model of, regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and with inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves a technocratic system, that is, both top-down and bottom-up participation. The two must be balanced, neither overlapping nor dominating the other. Keywords: regional, development, planning, consistency, reconciliation.
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.
Self-consistent modelling of resonant tunnelling structures
DEFF Research Database (Denmark)
Fiig, T.; Jauho, A.P.
1992-01-01
We report a comprehensive study of the effects of self-consistency on the I-V characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated ... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes.
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
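The resampling scheme described above (drawing whole clusters with replacement so that within-cluster dependence is preserved) can be sketched as follows. This is a minimal illustration, not the paper's GEE machinery: a plain pooled OLS slope stands in for the GEE estimate, and all data and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated clustered data: 50 clusters of 5 observations each,
# y = 2*x + cluster effect + noise (a stand-in for clustered/longitudinal data).
n_clusters, m = 50, 5
cluster_eff = rng.normal(0.0, 1.0, n_clusters)
x = rng.normal(size=(n_clusters, m))
y = 2.0 * x + cluster_eff[:, None] + rng.normal(0.0, 0.5, (n_clusters, m))

def slope(xs, ys):
    """Pooled OLS slope of y on x (stand-in for the regression estimate)."""
    xf, yf = xs.ravel(), ys.ravel()
    xc, yc = xf - xf.mean(), yf - yf.mean()
    return (xc @ yc) / (xc @ xc)

beta_hat = slope(x, y)

# Cluster bootstrap: resample whole clusters, not individual rows, so the
# dependence within each cluster is preserved in every bootstrap replicate.
boot = []
for _ in range(500):
    idx = rng.integers(0, n_clusters, n_clusters)
    boot.append(slope(x[idx], y[idx]))
ci = np.percentile(boot, [2.5, 97.5])
print(beta_hat, ci)
```

The percentile interval from the cluster-level replicates approximates the sampling distribution of the slope while respecting the cluster structure.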
Consistency in the description of diffusion in compacted bentonite
International Nuclear Information System (INIS)
Lehikoinen, J.; Muurinen, A.
2009-01-01
A macro-level diffusion model, which aims to provide a unifying framework for explaining the experimentally observed co-ion exclusion and the highly controversial counter-ion surface diffusion in a consistent fashion, is presented. It is explained in detail why a term accounting for the non-zero mobility of the counter-ion surface excess is required in the mathematical form of the macroscopic diffusion flux. The prerequisites for the consistency of the model and the problems associated with the interpretation of diffusion in such complex pore geometries as in compacted smectite clays are discussed. (author)
The consistency service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2011-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
The Consistency Service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2010-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
Consistency among integral measurements of aggregate decay heat power
Energy Technology Data Exchange (ETDEWEB)
Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)]
1998-03-01
Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. This method is then applied to examine consistency among measured decay heat powers of ²³²Th, ²³³U, ²³⁵U, ²³⁸U and ²³⁹Pu at YAYOI. The consistency among the measured values is found to be satisfactory for the β component and fairly good for the γ component, except for cooling times longer than 4000 s. (author)
Standard Model Vacuum Stability and Weyl Consistency Conditions
DEFF Research Database (Denmark)
Antipin, Oleg; Gillioz, Marc; Krog, Jens
2013-01-01
At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions.
STP: A mathematically and physically consistent library of steam properties
International Nuclear Information System (INIS)
Aguilar, F.; Hutter, A.C.; Tuttle, P.G.
1982-01-01
A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package
Weyl consistency conditions in non-relativistic quantum field theory
Energy Technology Data Exchange (ETDEWEB)
Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California,San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)
2016-12-05
Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z is discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.
A Van Atta reflector consisting of half-wave dipoles
DEFF Research Database (Denmark)
Appel-Hansen, Jørgen
1966-01-01
The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...
A self-consistent theory of the magnetic polaron
International Nuclear Information System (INIS)
Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.
1984-10-01
A finite temperature self-consistent theory of magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on the novel approach of the thermodynamic two-time Green function methods. This approach consists in the introduction of the ''irreducible'' Green functions (IGF) and derivation of the exact Dyson equation and exact self-energy operator. It is shown that IGF method gives a unified and natural approach for a calculation of the magnetic polaron states by taking explicitly into account the damping effects and finite lifetime. (author)
Evidence for Consistency of the Glycation Gap in Diabetes
Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.
2011-01-01
OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...
Diagnostic language consistency among multicultural English-speaking nurses.
Wieck, K L
1996-01-01
Cultural differences among nurses may influence the choice of terminology applicable to use of a nursing diagnostic statement. This study explored whether defining characteristics are consistently applied by culturally varied nurses in an English language setting. Two diagnoses, pain and high risk for altered skin integrity, were studied within six cultures: African, Asian, Filipino, East Indian, African-American, and Anglo-American nurses. Overall, there was consistency between the cultural groups. Analysis of variance for the pain scale demonstrated differences among cultures on two characteristics of pain, restlessness and grimace. The only difference on the high risk for altered skin integrity scale was found on the descriptor, supple skin.
Towards a robust and consistent middle Eocene astronomical timescale
Boulila, Slah; Vahlenkamp, Maximilian; De Vleeschouwer, David; Laskar, Jacques; Yamamoto, Yuhji; Pälike, Heiko; Kirtland Turner, Sandra; Sexton, Philip F.; Westerhold, Thomas; Röhl, Ursula
2018-03-01
Until now, the middle Eocene has remained a poorly constrained interval in efforts to produce an astrochronological timescale for the entire Cenozoic. This has given rise to a so-called "Eocene astronomical timescale gap" (Vandenberghe et al., 2012). A high-resolution astrochronological calibration for this interval has proven difficult to realize, mainly because carbonate-rich deep-marine sequences of this age are scarce. In this paper, we present records from middle Eocene carbonate-rich sequences from the North Atlantic Southeast Newfoundland Ridge (IODP Exp. 342, Sites U1408 and U1410), whose cyclical sedimentary patterns allow for an orbital calibration of the geologic timescale between ∼38 and ∼48 Ma. These carbonate-rich cyclic sediments at Sites U1408 and U1410 were deposited as drift deposits and exhibit prominent lithological alternations (couplets) between greenish nannofossil-rich clay and white nannofossil ooze. The principal lithological couplet is driven by the obliquity of Earth's axial tilt, and the intensity of their expression is modulated by a cyclicity of about 173 kyr. This cyclicity corresponds to the interference of secular frequencies s3 and s6 (related to the precession of nodes of the Earth and Saturn, respectively). This 173-kyr obliquity amplitude modulation cycle is exceptionally well recorded in the XRF (X-ray fluorescence)-derived Ca/Fe ratio. In this work, we first demonstrate the stability of the (s3-s6) cycles using the latest astronomical solutions. Results show that this orbital component is stable back to at least 50 Ma, and can thus serve as a powerful geochronometer in the mid-Eocene portion of the Cenozoic timescale. We then exploit this potential by calibrating the geochronology of the recovered middle Eocene timescale between magnetic polarity Chrons C18n.1n and C21n. Comparison with previous timescales shows similarities, but also notable differences in durations of certain magnetic polarity chrons. We
Elastic constants of the hard disc system in the self-consistent free volume approximation
International Nuclear Information System (INIS)
Wojciechowski, K.W.
1990-09-01
Elastic moduli of the two dimensional hard disc crystal are determined exactly within the Kirkwood self-consistent free volume approximation and compared with the Monte Carlo simulation results. (author). 22 refs, 1 fig., 1 tab
Behavioral Consistency of C and Verilog Programs Using Bounded Model Checking
National Research Council Canada - National Science Library
Clarke, Edmund; Kroening, Daniel; Yorav, Karen
2003-01-01
We present an algorithm that checks behavioral consistency between an ANSI-C program and a circuit given in Verilog using Bounded Model Checking. We describe experimental results on various reactive ...
Affective, Cognitive, and Behavioral Consistency of Chinese-Malay Interracial Attitudes
Rabushka, Alvin
1970-01-01
This study, part of an overall study of political and social integration in Malaya carried out in 1966-67, has resulted in findings that vary from consistency theory research in developed countries. (DB)
The importance of physician knowledge of autism spectrum disorder: results of a parent survey
Directory of Open Access Journals (Sweden)
Scarpa Angela
2007-11-01
Background: Early diagnosis and referral to treatment prior to age 3-5 years improves the prognosis of children with Autism Spectrum Disorder (ASD). However, ASD is often not diagnosed until age 3-4 years, and medical providers may lack training to offer caregivers evidence-based treatment recommendations. This study tested the hypotheses that (1) children with ASD would be diagnosed between ages 3-4 years (replicating prior work), (2) caregivers would receive little information beyond the diagnosis from their medical providers, and (3) caregivers would turn to other sources, outside of their local health care professionals, to learn more about ASD. Methods: 146 ASD caregivers responded to an online survey that consisted of questions about demographics, the diagnostic process, sources of information/support, and the need and availability of local services for ASDs. Hypotheses were tested using descriptives, regression analyses, analyses of variance, and chi-squared tests. Results: The average age of diagnosis was 4 years, 10 months and the mode was 3 years. While approximately 40% of professionals gave additional information about ASD after diagnosis and 15-34% gave advice on medical/educational programs, only 6% referred to an autism specialist and 18% gave no further information. The diagnosis of Autism was made at earlier ages than Asperger's Disorder or PDD-NOS. Developmental pediatricians (relative to psychiatrists/primary care physicians, neurologists, and psychologists) were associated with the lowest age of diagnosis and were most likely to distribute additional information. Caregivers most often reported turning to the media (i.e., internet, books, videos, conferences) and other parents to learn more about ASD. Conclusion: The average age of ASD diagnosis (4 years, 10 months) was later than optimal if children are to receive the most benefit from early intervention. Most professionals gave caregivers further information about ASDs, especially
The least weighted squares II. Consistency and asymptotic normality
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2002-01-01
Vol. 9, No. 16 (2002), pp. 1-28. ISSN 1212-074X. R&D Projects: GA AV ČR KSK1019101. Grant - others: GA UK(CR) 255/2000/A EK/FSV. Institutional research plan: CEZ:AV0Z1075907. Keywords: robust regression * consistency * asymptotic normality. Subject RIV: BA - General Mathematics
Consistency relation for the Lorentz invariant single-field inflation
International Nuclear Information System (INIS)
Huang, Qing-Guo
2010-01-01
In this paper we compute the sizes of the equilateral and orthogonal shape bispectrum for general Lorentz invariant single-field inflation. The stability of the field theory implies a non-negative square of the sound speed, which leads to a consistency relation between the sizes of the orthogonal and equilateral shape bispectrum, namely f_NL^orth ≤ -0.054 f_NL^equil. In particular, for single-field Dirac-Born-Infeld (DBI) inflation, the consistency relation becomes f_NL^orth = 0.070 f_NL^equil ≤ 0. These consistency relations are also valid in the mixed scenario where the quantum fluctuations of some other light scalar fields contribute a part of the total curvature perturbation on super-horizon scales and may generate a local-form bispectrum. A distinguishing prediction of the mixed scenario is τ_NL^loc > ((6/5) f_NL^loc)². Comparing these consistency relations to WMAP 7yr data, there is still a big room for Lorentz invariant inflation, but DBI inflation has been disfavored at more than 68% CL.
Short-Cut Estimators of Criterion-Referenced Test Consistency.
Brown, James Dean
1990-01-01
Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the phi(lambda) dependability approach, and (3) the domain score…
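The threshold loss agreement approach mentioned above rests on the observed agreement proportion and the kappa coefficient for repeated master/nonmaster classifications. A minimal sketch with hypothetical classifications from two administrations of a criterion-referenced test:

```python
# Hypothetical master/nonmaster classifications of 10 examinees on two
# administrations of a criterion-referenced test (1 = master, 0 = nonmaster).
first  = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
second = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]

n = len(first)
# Observed agreement p_o: proportion classified the same way both times.
p_o = sum(a == b for a, b in zip(first, second)) / n
# Chance agreement p_c from the marginal master/nonmaster proportions.
p1, q1 = sum(first) / n, 1 - sum(first) / n
p2, q2 = sum(second) / n, 1 - sum(second) / n
p_c = p1 * p2 + q1 * q2
# Kappa: agreement corrected for chance.
kappa = (p_o - p_c) / (1 - p_c)
print(p_o, kappa)
```

Here p_o = 0.8, while kappa discounts the agreement expected by chance alone.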
SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.
MORSE, STANLEY J.; GERGEN, KENNETH J.
To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) Self-Consistency Scale and half the Coopersmith Self-Esteem Inventory, were administered to 78 undergraduate men who had answered an ad for work…
Delimiting coefficient alpha from internal consistency and unidimensionality
Sijtsma, K.
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, & Love (2015), in which they relate reliability, represented by coefficient α, to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and
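For readers unfamiliar with the coefficient under discussion, a minimal computation of coefficient α from an item-score matrix using the standard k/(k-1) formula; the data below are hypothetical:

```python
import numpy as np

# Hypothetical item-score matrix: 6 persons x 4 items.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
], dtype=float)

k = scores.shape[1]
item_vars = scores.var(axis=0, ddof=1)        # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total score
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(alpha)
```

For these strongly correlated items α comes out high; the point of the article is that this number is a lower bound to reliability, not reliability itself.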
Challenges of Predictability and Consistency in the First ...
African Journals Online (AJOL)
This article aims to investigate some features of Endemann's (1911) Wörterbuch der Sotho-Sprache (Dictionary of the Sotho language) with the focus on challenges of predictability and consistency in the lemmatization approach, the access alphabet, cross references and article treatments. The dictionary has hitherto ...
The Impact of Orthographic Consistency on German Spoken Word Identification
Beyermann, Sandra; Penke, Martina
2014-01-01
An auditory lexical decision experiment was conducted to find out whether sound-to-spelling consistency has an impact on German spoken word processing, and whether such an impact is different at different stages of reading development. Four groups of readers (school children in the second, third and fifth grades, and university students)…
Final Report Fermionic Symmetries and Self consistent Shell Model
International Nuclear Information System (INIS)
Zamick, Larry
2008-01-01
In this final report in the field of theoretical nuclear physics we note important accomplishments. We were confronted with 'anomalous' magnetic moments by the experimentalists and were able to explain them. We found unexpected partial dynamical symmetries, completely unknown before, and were able to explain them to a large extent. The importance of a self-consistent shell model was emphasized.
Using the Perceptron Algorithm to Find Consistent Hypotheses
Anthony, M.; Shawe-Taylor, J.
1993-01-01
The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
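The algorithm referenced above can be sketched directly: a perceptron update loop that cycles over the sample until the weight vector is consistent with every labeled example of a linearly separable boolean function (here AND, with a bias folded in as a constant input; labels in {-1, +1}):

```python
import numpy as np

# Sample of a linearly separable boolean function (x1 AND x2),
# with a constant third input acting as the bias term.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])

# Perceptron updates: repeat until every example is classified correctly,
# i.e. until the hypothesis w is consistent with the sample.
w = np.zeros(3)
while True:
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:   # misclassified (or on the boundary)
            w += yi * xi
            mistakes += 1
    if mistakes == 0:
        break
print(w, np.sign(X @ w))
```

By the perceptron convergence theorem the loop terminates whenever the sample is linearly separable, which is exactly the setting of the paper; the paper's point is that the number of updates need not be polynomially bounded in general.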
Consistent seasonal snow cover depth and duration variability over ...
Indian Academy of Sciences (India)
Decline in consistent seasonal snow cover depth and duration and a changing snow cover build-up pattern over the WH in recent decades indicate that the WH has undergone considerable climate change and that winter weather patterns are changing in the WH. 1. Introduction. Mountainous regions around the globe are storehouses.
Is There a Future for Education Consistent with Agenda 21?
Smyth, John
1999-01-01
Discusses recent experiences in developing and implementing strategies for education consistent with the concept of sustainable development at two different levels: (1) the international level characterized by Agenda 21 along with the efforts of the United Nations Commission on Sustainable Development to foster its progress; and (2) the national…
Consistent dynamical and statistical description of fission and comparison
Energy Technology Data Exchange (ETDEWEB)
Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)
1996-06-01
The research survey of consistent dynamical and statistical description of fission is briefly introduced. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).
Consistency of the Self-Schema in Depression.
Ross, Michael J.; Mueller, John H.
Depressed individuals may filter or distort environmental information in direct relationship to their self perceptions. To investigate the degree of uncertainty about oneself and others, as measured by consistent/inconsistent responses, 72 college students (32 depressed and 40 nondepressed) rated selected adjectives from the Derry and Kuiper…
Composition consisting of a dendrimer and an active substance
1995-01-01
The invention relates to a composition consisting of a dendrimer provided with blocking agents and an active substance occluded in the dendrimer. According to the invention a blocking agent is a compound which is sterically of sufficient size, which readily enters into a chemical bond with the
A consistent analysis for the quark condensate in QCD
International Nuclear Information System (INIS)
Huang Zheng; Huang Tao
1988-08-01
The dynamical symmetry breaking in QCD is analysed based on the vacuum condensates. A self-consistent equation for the quark condensate ⟨q̄q⟩ is derived. A nontrivial solution for ⟨q̄q⟩ ≠ 0 is given in terms of the QCD scale parameter Λ
The consistency assessment of topological relations in cartographic generalization
Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu
2006-10-01
The field of research in generalization assessment has been less studied than the generalization process itself, yet it is very important to keep topological relations consistent in order to meet generalization quality requirements. This paper proposes a methodology to assess the quality of a generalized map from the consistency of topological relations. Taking roads (including railways) and residential areas as examples, some issues about topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. Statistical information about inconsistent topological relations can be obtained by comparing two matrices: one for the topological relations in the generalized map, the other a theoretical matrix for the topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper demonstrates the feasibility of the method through an example of evaluating the local topological relations between simple roads and a residential area.
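The matrix comparison described above (relations in the generalized map versus relations that should be preserved) can be sketched as follows. The relation coding and both matrices are hypothetical; the paper's fuzzy evaluation model is not reproduced here, only the raw mismatch count:

```python
import numpy as np

# Hypothetical 4x4 matrices of coded topological relations between four map
# objects (0 = disjoint, 1 = touch, 2 = overlap): the relations that should be
# maintained after generalization vs. those observed in the generalized map.
expected = np.array([[0, 1, 0, 2],
                     [1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [2, 0, 1, 0]])
observed = np.array([[0, 1, 0, 2],
                     [1, 0, 2, 0],
                     [0, 2, 0, 1],
                     [2, 0, 1, 0]])

# Count each unordered object pair once (strict upper triangle).
mask = np.triu(np.ones_like(expected, dtype=bool), k=1)
inconsistent = (expected != observed) & mask
rate = inconsistent.sum() / mask.sum()
print(inconsistent.sum(), rate)
```

One pair of objects changed from "touch" to "overlap" during generalization, giving an inconsistency rate of 1/6 over the six object pairs.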
Numerical consistency check between two approaches to radiative ...
Indian Academy of Sciences (India)
approaches for a consistency check on numerical accuracy, and find out the stability ... ln(M_R/1 GeV) to the top-quark mass scale t_0 (= ln(m_t/1 GeV)), where t_0 ≤ t ≤ t_R, we ... It is in general to tone down the solar mixing angle through further fine...
Consistency or Discrepancy? Rethinking Schools from Organizational Hypocrisy to Integrity
Kiliçoglu, Gökhan
2017-01-01
Consistency in statements, decisions and practices is highly important both for organization members and for the image of an organization. Organizations, and especially their administrators, are expected to "walk the talk", in other words, to try to practise what they preach. However, in the process of gaining legitimacy and adapting…
Consistency Check for the Bin Packing Constraint Revisited
Dupuis, Julien; Schaus, Pierre; Deville, Yves
The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.
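The problem as stated admits simple greedy upper bounds. A minimal first-fit-decreasing sketch for illustration (this is a standard heuristic, not the constraint-consistency check studied in the paper):

```python
def first_fit_decreasing(items, capacity):
    """Greedy upper bound on the minimum number of bins: place each item,
    largest first, into the first bin where it fits within the capacity."""
    bins = []
    for size in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])  # no existing bin fits: open a new one
    return bins

bins = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
print(len(bins), bins)  # → 2 [[8, 2], [4, 4, 1, 1]]
```

A filtering algorithm for the bin packing constraint can use such a heuristic as an upper bound and a lower bound (e.g. total size divided by capacity, rounded up) to prune inconsistent assignments.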
Matrix analysis for associated consistency in cooperative game theory
Xu, G.; Driessen, Theo; Sun, H.; Sun, H.
Hamiache's recent axiomatization of the well-known Shapley value for TU games states that the Shapley value is the unique solution verifying the following three axioms: the inessential game property, continuity and associated consistency. Driessen extended Hamiache's axiomatization to the enlarged class of efficient, symmetric, and linear values.
Matrix analysis for associated consistency in cooperative game theory
Xu Genjiu, G.; Driessen, Theo; Sun, H.; Sun, H.
Hamiache axiomatized the Shapley value as the unique solution verifying the inessential game property, continuity and associated consistency. Driessen extended Hamiache’s axiomatization to the enlarged class of efficient, symmetric, and linear values. In this paper, we introduce the notion of row
Consistent measurements comparing the drift features of noble gas mixtures
Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y
1999-01-01
We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.
Structural covariance networks across healthy young adults and their consistency.
Guo, Xiaojuan; Wang, Yan; Guo, Taomei; Chen, Kewei; Zhang, Jiacai; Li, Ke; Jin, Zhen; Yao, Li
2015-08-01
To investigate structural covariance networks (SCNs) as measured by regional gray matter volumes with structural magnetic resonance imaging (MRI) from healthy young adults, and to examine their consistency and stability. Two independent cohorts were included in this study: Group 1 (82 healthy subjects aged 18-28 years) and Group 2 (109 healthy subjects aged 20-28 years). Structural MRI data were acquired at 3.0T and 1.5T using a magnetization prepared rapid-acquisition gradient echo sequence for these two groups, respectively. We applied independent component analysis (ICA) to construct SCNs and further applied the spatial overlap ratio and correlation coefficient to evaluate the spatial consistency of the SCNs between these two datasets. Seven and six independent components were identified for Group 1 and Group 2, respectively. Moreover, six SCNs including the posterior default mode network, the visual and auditory networks consistently existed across the two datasets. The overlap ratios and correlation coefficients of the visual network reached maxima of 72% and 0.71. This study demonstrates the existence of consistent SCNs corresponding to general functional networks. These structural covariance findings may provide insight into the underlying organizational principles of brain anatomy. © 2014 Wiley Periodicals, Inc.
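The two consistency measures used above can be illustrated on toy data. Assuming the overlap ratio is Dice-style, 2|A∩B|/(|A|+|B|), which is one common choice (the paper does not specify the formula here), with hypothetical thresholded component maps:

```python
import numpy as np

# Hypothetical thresholded component maps from two cohorts
# (1 = voxel assigned to the network, 0 = not).
map1 = np.array([1, 1, 1, 0, 0, 1, 0, 1])
map2 = np.array([1, 1, 0, 0, 1, 1, 0, 1])

# Spatial overlap ratio (Dice-style): 2|A ∩ B| / (|A| + |B|).
inter = np.logical_and(map1, map2).sum()
overlap = 2 * inter / (map1.sum() + map2.sum())

# Spatial correlation coefficient between the two maps.
r = np.corrcoef(map1, map2)[0, 1]
print(overlap, r)
```

On these toy maps the overlap ratio is 0.8 while the voxelwise correlation is lower, which is why the two measures are reported together.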
Consistency in behavior of the CEO regarding corporate social responsibility
Elving, W.J.L.; Kartal, D.
2012-01-01
Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was
Self-consistent description of the isospin mixing
International Nuclear Information System (INIS)
Gabrakov, S.I.; Pyatov, N.I.; Baznat, M.I.; Salamov, D.I.
1978-03-01
The properties of collective 0^+ states built of unlike particle-hole excitations in spherical nuclei have been investigated in a self-consistent microscopic approach. These states arise when the broken isospin symmetry of the nuclear shell-model Hamiltonian is restored. The numerical calculations were performed with Woods-Saxon wave functions.
Potential application of the consistency approach for vaccine potency testing.
Arciniega, J; Sirota, L A
2012-01-01
The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such a reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, i.e. the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination-challenge test in mice that gives a quantitative, although highly variable, estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows whether the vaccine meets the specification, has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
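For the normally distributed case mentioned above, the classical capability indices Cp and Cpk can be sketched as follows. This is a minimal illustration; the function name and the specification limits in the example are hypothetical, not taken from the paper.

```python
import statistics

def capability_indices(potencies, lsl, usl):
    """Classical Cp and Cpk, valid only under the normality assumption.

    lsl/usl are the lower/upper specification limits. Cp measures the
    spread of the process relative to the specification width; Cpk also
    penalises a process mean that drifts toward either limit.
    """
    mu = statistics.mean(potencies)
    sigma = statistics.stdev(potencies)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk
```

For a pass/fail single-dilution test the output is binomial rather than normal, which is exactly why the abstract notes that these indices cannot be applied directly.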
Energy Technology Data Exchange (ETDEWEB)
Balakrishna, C; Sarma, B S [Defence Research and Development Laboratory, Hyderabad (India)
1989-02-01
A formulation for axisymmetric shell analysis under asymmetric load, based on a Fourier series representation and using a field-consistent 3-noded curved axisymmetric shell element, is presented. Different field-inconsistent/consistent interpolations for an element based on shear-flexible theory have been studied for thick and thin shells under asymmetric loads. Various examples covering axisymmetric as well as asymmetric loading cases have been analyzed, and the numerical results show good agreement with the available results in the case of thin shells. 12 refs.
Performance and consistency of indicator groups in two biodiversity hotspots.
Directory of Open Access Journals (Sweden)
Joaquim Trindade-Filho
Full Text Available In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent those indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
Performance and consistency of indicator groups in two biodiversity hotspots.
Trindade-Filho, Joaquim; Loyola, Rafael Dias
2011-01-01
In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent those indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species, and were also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
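The "best sets of sites able to maximize representation" step described above is a maximum-coverage problem, commonly attacked with a greedy complementarity heuristic. A minimal sketch follows; the data and names are hypothetical, and real analyses use dedicated systematic-conservation-planning software rather than this toy loop.

```python
def greedy_site_selection(site_species, targets):
    """Pick sites until every target species is represented at least once.

    site_species: dict mapping site id -> set of species present there.
    Greedy complementarity: repeatedly add the site that covers the most
    still-unrepresented target species (a standard heuristic for this
    NP-hard set-cover-style problem).
    """
    selected, covered = [], set()
    targets = set(targets)
    while covered < targets:
        best, gain = None, 0
        for site, species in site_species.items():
            if site in selected:
                continue
            g = len((species & targets) - covered)
            if g > gain:
                best, gain = site, g
        if best is None:  # remaining targets occur at no unselected site
            break
        selected.append(best)
        covered |= site_species[best] & targets
    return selected, covered
```

Running the selection once per indicator group, then scoring how many *target* species the chosen sites happen to contain, mirrors the effectiveness test the abstract describes.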
Castanheira, Maria Filipa; Martínez Páramo, Sonia; Figueiredo, F; Cerqueira, Marco; Millot, Sandie; Oliveira, Catarina C V; Martins, Catarina I M; Conceição, Luís E C
2016-10-01
Individual differences in behaviour and physiological responses to stress are associated with evolutionary adaptive variation and are thus raw material for evolution. In farmed animals, interest in consistent trait associations, i.e. coping styles, has increased dramatically over the last years. However, one limitation of the available knowledge regarding temporal consistency is that it always refers to short-term consistency (usually a few weeks). The present study used the escape response during a net restraining test, previously shown to be indicative of coping styles in seabream, to investigate long-term consistency of coping styles both over time and across different life-history stages. Results showed both short-term (14 days) and long-term (8 months) consistency of the escape response. However, we did not find consistency in the same behaviour after sexual maturation, when the restraining test was repeated 16, 22 and 23 months after the first test. In conclusion, this study showed consistent behavioural traits in seabream as juveniles, and a loss of these behavioural traits as adults. These results therefore underline that adding a life-history approach to data interpretation is an essential step towards a better understanding of coping styles. Furthermore, aquaculture rearing strategies may need to be tuned differently for early developmental stages and for adults to accommodate different coping strategies and improve the welfare of farmed fish.
Self-consistent ECCD calculations with bootstrap current
International Nuclear Information System (INIS)
Decker, J.; Bers, A.; Ram, A. K; Peysson, Y.
2003-01-01
To achieve high-performance, steady-state operation in tokamaks, it is increasingly important to find the appropriate means for modifying and sustaining the pressure and magnetic shear profiles in the plasma. In such advanced scenarios, especially in the vicinity of internal transport barriers, RF-induced currents have to be calculated self-consistently with the bootstrap current, thus taking into account possible synergistic effects resulting from the momentum-space distortion of the electron distribution function f_e. Since RF waves can cause the distribution of electrons to become non-Maxwellian, the associated changes in parallel diffusion of momentum between trapped and passing particles can be expected to modify the bootstrap current fraction; conversely, the bootstrap current distribution function can enhance the current driven by RF waves. For this purpose, a new, fast and fully implicit solver has recently been developed to carry out computations including new and detailed evaluations of the interactions between bootstrap current (BC) and electron cyclotron current drive (ECCD). Moreover, Ohkawa current drive (OKCD) appears to be an efficient method for driving current when the fraction of trapped particles is large. OKCD in the presence of BC is also investigated. Here, results are illustrated around projected tokamak parameters in high-performance scenarios of Alcator C-Mod. It is shown that by increasing n_∥, the EC wave penetration into the bulk of the electron distribution is greater, and since the resonance extends up to high p_∥ values, this situation is the usual ECCD based on the Fisch-Boozer mechanism concerning passing particles. However, because of the close vicinity of the trapped boundary at r/a = 0.7, this process is counterbalanced by the Ohkawa effect, possibly leading to a negative net current. Therefore, by injecting the EC wave in the opposite toroidal direction (reversed n_∥), the current driven by OKCD may be 70% larger than that of ECCD, with a choice of EC
High consistency cellulase treatment of hardwood prehydrolysis kraft based dissolving pulp.
Wang, Qiang; Liu, Shanshan; Yang, Guihua; Chen, Jiachuan; Ni, Yonghao
2015-01-01
For enzymatic treatment of dissolving pulp, there is a need to improve the process to facilitate its commercialization. For this purpose, high-consistency cellulase treatment was conducted based on the hypothesis that a high cellulose concentration would favor the interactions of cellulase and cellulose, thus improving the cellulase efficiency while decreasing the water usage. The results showed that, compared with a low consistency of 3%, a high consistency of 20% led to a 24% increase in the cellulase adsorption ratio. As a result, after 24 h at 20% consistency, the viscosity decrease and the Fock reactivity increase were enhanced from 510 mL/g and 70.3% (at 3% consistency) to 471 mL/g and 77.6%, respectively. The results on other properties, such as alpha-cellulose content, alkali solubility and molecular weight distribution, also supported the conclusion that high-consistency cellulase treatment was more effective than a low-consistency process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dispersion Differences and Consistency of Artificial Periodic Structures.
Cheng, Zhi-Bao; Lin, Wen-Kai; Shi, Zhi-Fei
2017-10-01
Dispersion differences and consistency of artificial periodic structures, including phononic crystals, elastic metamaterials, as well as periodic structures composed of phononic crystals and elastic metamaterials, are investigated in this paper. By developing a K(ω) method, complex dispersion relations and group/phase velocity curves of both the single-mechanism periodic structures and the mixed-mechanism periodic structures are first calculated, from which the dispersion differences of artificial periodic structures are discussed. Then, based on a unified formulation, the dispersion consistency of artificial periodic structures is investigated. Through a comprehensive comparison study, the correctness of the unified formulation is verified. Mathematical derivations of the unified formulation for the different artificial periodic structures are presented. Furthermore, the physical meanings of the unified formulation are discussed in the energy-state space.
Rotating D0-branes and consistent truncations of supergravity
International Nuclear Information System (INIS)
Anabalón, Andrés; Ortiz, Thomas; Samtleben, Henning
2013-01-01
The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1)^4 truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S^8. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS_2×M_8 geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.
Substituting fields within the action: Consistency issues and some applications
International Nuclear Information System (INIS)
Pons, Josep M.
2010-01-01
In field theory, as well as in mechanics, the substitution of some fields in terms of other fields at the level of the action raises an issue of consistency with respect to the equations of motion. We discuss this issue and give an expression which neatly displays the difference between doing the substitution at the level of the Lagrangian and at the level of the equations of motion. The two operations do not commute in general. A very relevant exception is the case of auxiliary variables, which are discussed in detail together with some of their relevant applications. We discuss the conditions for the preservation of symmetries (Noether as well as non-Noether) under the reduction of degrees of freedom provided by the mechanism of substitution. We also examine how gauge-fixing procedures fit in our framework and give simple examples of the issue of consistency in this case.
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Directory of Open Access Journals (Sweden)
Hong Li
2012-01-01
Full Text Available The turbulence generator is a key component of medium consistency (MC) centrifugal pulp pumps, whose functions are to fluidize the medium consistency pulp and to separate gas from the liquid. The structural sizes of the generator affect the hydraulic performance; the radius and the blade laying angle are two important ones. Starting from research on the internal flow and shearing characteristics of MC pulp, a simple mathematical model of the flow section of the shearing chamber is built, and the formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a shape similar to the turbulence generator, and CFD simulation is applied to study the flow fields obtained with different blade laying angles. The recommended blade laying angle of the turbulence generator is then determined to be between 60° and 75°.
Consistency Across Standards or Standards in a New Business Model
Russo, Dane M.
2010-01-01
Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the need for a balance between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include: NASA Procedural Requirements 8705.2B identifies human-rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality, and appendices of government and non-governmental human factors standards.
Quantitative verification of ab initio self-consistent laser theory.
Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E
2008-10-13
We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly varying envelope approximation. The theory is of infinite order in the non-linear hole-burning interaction; the widely used third-order approximation is shown to fail badly.
Lagrangian space consistency relation for large scale structure
International Nuclear Information System (INIS)
Horn, Bart; Hui, Lam; Xiao, Xiao
2015-01-01
Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space
Rotating D0-branes and consistent truncations of supergravity
Energy Technology Data Exchange (ETDEWEB)
Anabalón, Andrés [Departamento de Ciencias, Facultad de Artes Liberales, Facultad de Ingeniería y Ciencias, Universidad Adolfo Ibáñez, Av. Padre Hurtado 750, Viña del Mar (Chile); Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France); Ortiz, Thomas; Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France)
2013-12-18
The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1)^4 truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S^8. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS_2×M_8 geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.
Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)
Pelsser, A.; Stadje, M.A.
2011-01-01
We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from
Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)
Pelsser, A.; Stadje, M.A.
2012-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from
Consistency of eye movements in MOT using horizontally flipped trials
Czech Academy of Sciences Publication Activity Database
Děchtěrenko, F.; Lukavský, Jiří
2013-01-01
Vol. 42, Suppl (2013), p. 42. ISSN 0301-0066. [36th European Conference on Visual Perception, 25.08.2013-29.08.2013, Bremen] R&D Projects: GA ČR GA13-28709S Institutional support: RVO:68081740 Keywords: eye movements * symmetry * consistency Subject RIV: AN - Psychology http://www.ecvp.uni-bremen.de/~ecvpprog/abstract164.html
Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency
Sammie Tarenskeen; Mirjam Broersma; Bart Geurts
2015-01-01
The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a bet...
Spectrally Consistent Satellite Image Fusion with Improved Image Priors
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.
2006-01-01
Here an improvement to our previous framework for satellite image fusion is presented: a framework based purely on the sensor physics and on prior assumptions about the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.
Monetary Poverty, Material Deprivation and Consistent Poverty in Portugal
Carlos Farinha Rodrigues; Isabel Andrade
2012-01-01
In this paper we use the Portuguese component of the European Union Statistics on Income and Living Conditions (EU-SILC) to develop a measure of consistent poverty in Portugal. It is widely agreed that being poor does not simply mean not having enough monetary resources. It also reflects a lack of access to the resources required to enjoy a minimum standard of living and participation in the society one belongs to. The coexistence of material deprivation and monetary poverty leads ...
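The "consistent poverty" construct described above, the overlap of monetary poverty and material deprivation, can be sketched in a few lines. This is a minimal illustration under the assumption that each survey record reduces to two boolean flags; the function name and record format are hypothetical, not from the paper.

```python
def consistent_poverty_rate(records):
    """Share of individuals who are both income-poor and materially deprived.

    records: iterable of (income_poor, deprived) boolean pairs.
    'Consistent poverty' is taken here as the intersection of the two
    measures, following the general approach the abstract describes.
    """
    records = list(records)
    both = sum(1 for poor, deprived in records if poor and deprived)
    return both / len(records)
```

In a real EU-SILC analysis the two flags would themselves come from an income threshold (e.g. 60% of median equivalised income) and a deprivation-item count, each with its own definitional choices.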
Consistency requirements on Δ contributions to the NN potential
International Nuclear Information System (INIS)
Rinat, A.S.
1982-04-01
We discuss theories leading to intermediate-state NΔ and ΔΔ contributions to V_NN. We focus on the customary addition of L'_ΔNπ to L'_πNN in a conventional field theory and argue that overcounting of contributions to t_πN and V_NN will be the rule. We then discuss the cloudy bag model, where a similar interaction naturally arises and which leads to a consistent theory. (author)
Quark mean field theory and consistency with nuclear matter
International Nuclear Information System (INIS)
Dey, J.; Tomio, L.; Dey, M.; Frederico, T.
1989-01-01
The 1/N_c expansion in QCD (with N_c the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, M_N, m_σ, m_ω, are found to scale with density. The equations are solved self-consistently. (author)
Self-consistent T-matrix theory of superconductivity
Czech Academy of Sciences Publication Activity Database
Šopík, B.; Lipavský, Pavel; Männel, M.; Morawetz, K.; Matlock, P.
2011-01-01
Roč. 84, č. 9 (2011), 094529/1-094529/13 ISSN 1098-0121 R&D Projects: GA ČR GAP204/10/0212; GA ČR(CZ) GAP204/11/0015 Institutional research plan: CEZ:AV0Z10100521 Keywords : superconductivity * T-matrix * superconducting gap * restricted self-consistency Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 3.691, year: 2011
Overspecification of color, pattern, and size: salience, absoluteness, and consistency
Tarenskeen, S.L.; Broersma, M.; Geurts, B.
2015-01-01
The rates of overspecification of color, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Color and pattern are absolute and salient attributes, whereas size is relative and less salient. Additionally, a tendency toward consistent responses is assessed. Using a within-participants design, we find similar rates of color and pattern overspecification, which are both higher than the rate of size overspecification. Usi...
The consistent histories interpretation of quantum fields in curved spacetime
International Nuclear Information System (INIS)
Blencowe, M.
1991-01-01
As an initial attempt to address some of the foundational problems of quantum mechanics, the author formulates the consistent histories interpretation of quantum field theory on a globally hyperbolic curved spacetime. He then constructs quasiclassical histories for a free, massive scalar field. In the final part, he points out the shortcomings of the theory and conjectures that one must take into account the fact that gravity is quantized in order to overcome them.
Modeling a Consistent Behavior of PLC-Sensors
Directory of Open Access Journals (Sweden)
E. V. Kuzmin
2014-01-01
Full Text Available The article extends the cycle of papers dedicated to programming and verification of PLC programs by LTL specification. This approach enables correctness analysis of PLC programs by the model-checking method. The model-checking method requires a finite model of the PLC program. For successful verification of the required properties it is important to take into consideration that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact demands particular care in the construction of the PLC-program model. In this paper we propose to describe the consistent behavior of sensors by three groups of LTL formulas. They affect the program model, bringing it closer to the actual behavior of the PLC program. The idea of the LTL requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC-program model, this approach to modeling consistent sensor behavior allows one to focus on modeling precisely these reactions, without extending the program model by additional structures that realize realistic sensor behavior. The consistent behavior of sensors is taken into account only at the stage of checking the conformity of the program model to the required properties, i.e. the property satisfaction proof for the constructed model occurs under the condition that the model contains only those executions of the program that comply with the consistent behavior of the sensors.
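As a hypothetical illustration of such LTL sensor-consistency constraints (the sensor names and formulas below are invented for exposition, not taken from the article), consider a tank with a lower and an upper level sensor whose readings must stay physically consistent:

```latex
% (1) An impossible combination never occurs: the upper level sensor
%     cannot read "on" while the lower one reads "off".
\mathbf{G}\,\neg(\mathit{upper} \wedge \neg\mathit{lower})
% (2) Readings change gradually: the upper sensor may switch on only
%     from a state in which the lower sensor is already on.
\mathbf{G}\,\bigl((\neg\mathit{upper} \wedge \mathbf{X}\,\mathit{upper})
    \rightarrow \mathit{lower}\bigr)
```

Formulas of this kind, conjoined with the program model, restrict the model checker to executions in which the environment behaves like a real control object.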
Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency
Directory of Open Access Journals (Sweden)
Sammie eTarenskeen
2015-11-01
Full Text Available The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a between-participants design, however, we find similar rates of pattern and size overspecification, which are both lower than the rate of colour overspecification. This indicates that although many speakers are more likely to include colour than pattern (probably because colour is more salient), they may also treat pattern like colour due to a tendency towards consistency. We find no increase in size overspecification when the salience of size is increased, suggesting that speakers are more likely to include absolute than relative attributes. However, we do find an increase in size overspecification when mentioning the attributes is triggered, which again shows that speakers tend to refer in a consistent manner, and that there are circumstances in which even size overspecification is frequently produced.
Temporal and contextual consistency of leadership in homing pigeon flocks.
Directory of Open Access Journals (Sweden)
Carlos D Santos
Full Text Available Organized flight of homing pigeons (Columba livia) was previously shown to rely on simple leadership rules between flock mates, yet the stability of this social structuring over time and across different contexts remains unclear. We quantified the repeatability of leadership-based flock structures within a flight and across multiple flights conducted with the same animals. We compared two contexts of flock composition: flocks of birds of the same age and flight experience, and flocks of birds of different ages and flight experience. All flocks displayed consistent leadership-based structures over time, showing that individuals have stable roles in the navigational decisions of the flock. However, flocks of balanced age and flight experience exhibited reduced leadership stability, indicating that these factors promote flock structuring. Our study empirically demonstrates that leadership and followership are consistent behaviours in homing pigeon flocks, but such consistency is affected by the heterogeneity of individual flight experiences and/or age. Similar evidence from other species suggests leadership is an important mechanism for coordinated motion in small groups of animals with strong social bonds.
Consistency checks in beam emission modeling for neutral beam injectors
International Nuclear Information System (INIS)
Punyapu, Bharathi; Vattipalle, Prahlad; Sharma, Sanjeev Kumar; Baruah, Ujjwal Kumar; Crowley, Brendan
2015-01-01
In positive-ion neutral beam systems, the beam parameters such as ion species fractions, power fractions and beam divergence are routinely measured using the Doppler-shifted beam emission spectrum. The accuracy with which these parameters are estimated depends on the accuracy of the atomic modeling involved in these estimations. In this work, an effective procedure to check the consistency of the beam emission modeling in neutral beam injectors is proposed. As a first consistency check, at constant beam voltage and current, the intensity of the beam emission spectrum is measured while varying the pressure in the neutralizer. Then, the scaling with pressure of the measured intensities of the un-shifted (target) and Doppler-shifted (projectile) components of the beam emission spectrum is studied. If the un-shifted component scales with pressure, then the intensity of this component is used as a second consistency check on the beam emission modeling. As a further check, the modeled beam fractions and the emission cross sections of projectile and target are used to predict the intensity of the un-shifted component, which is then compared with the measured target intensity. The agreement between the predicted and measured target intensities provides a measure of the discrepancy in the beam emission modeling. To test this methodology, a systematic analysis of Doppler shift spectroscopy data obtained on the JET neutral beam test stand was carried out
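The first consistency check above, whether a line's intensity scales linearly with neutralizer pressure, can be sketched as a simple least-squares fit followed by a residual test. This is an illustrative stand-in only; the tolerance and function names are assumptions, not part of the paper's procedure.

```python
def scales_linearly(pressures, intensities, rel_tol=0.05):
    """Fit a least-squares line and report whether every point lies
    within rel_tol (fractional) of the fitted line.

    A crude proxy for the pressure-scaling check: if the intensity of a
    spectral component tracks the neutralizer pressure linearly, the
    residuals of the fit should all be small.
    """
    n = len(pressures)
    mx = sum(pressures) / n
    my = sum(intensities) / n
    sxx = sum((x - mx) ** 2 for x in pressures)
    sxy = sum((x - mx) * (y - my) for x, y in zip(pressures, intensities))
    slope = sxy / sxx
    intercept = my - slope * mx
    return all(
        abs(y - (slope * x + intercept)) <= rel_tol * max(abs(y), 1e-12)
        for x, y in zip(pressures, intensities)
    )
```

In practice one would also propagate the measurement uncertainties and use a chi-square goodness-of-fit criterion rather than a fixed fractional tolerance.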
A dynamical mechanism for large volumes with consistent couplings
Energy Technology Data Exchange (ETDEWEB)
Abel, Steven [IPPP, Durham University,Durham, DH1 3LE (United Kingdom)
2016-11-14
A mechanism for addressing the “decompactification problem” is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N=2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.
Consistency of variables in PCS and JASTRO great area database
International Nuclear Information System (INIS)
Nishino, Tomohiro; Teshima, Teruki; Abe, Mitsuyuki
1998-01-01
To examine whether the Patterns of Care Study (PCS) reflects the data for the major areas in Japan, the consistency of variables in the PCS and in the major area database of the Japanese Society for Therapeutic Radiology and Oncology (JASTRO) was compared. Patients with esophageal or uterine cervical cancer were sampled from the PCS and JASTRO databases. From the JASTRO database, 147 patients with esophageal cancer and 95 patients with uterine cervical cancer were selected according to the eligibility criteria for the PCS. From the PCS, 455 esophageal and 432 uterine cervical cancer patients were surveyed. Six items for esophageal cancer and five items for uterine cervical cancer were selected for a comparative analysis of the PCS and JASTRO databases. Esophageal cancer: age (p=.0777), combination of radiation and surgery (p=.2136), and energy of the external beam (p=.6400) were consistent between PCS and JASTRO. However, the dose of the external beam for the non-surgery group showed inconsistency (p=.0467). Uterine cervical cancer: age (p=.6301) and clinical stage (p=.8555) were consistent between the two sets of data. However, the energy of the external beam (p<.0001), dose rate of brachytherapy (p<.0001), and brachytherapy utilization by clinical stage (p<.0001) showed inconsistencies. It appears possible that the JASTRO major area database could not account for all patients' backgrounds and factors, and that both surveys might have an imbalance in the stratification of institutions, including differences in equipment and staffing patterns. (author)
Self-assessment: Strategy for higher standards, consistency, and performance
International Nuclear Information System (INIS)
Ide, W.E.
1996-01-01
In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions was weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities. In addition, their morale improved. Crew teamwork improved, as did the quality of shift briefs. There was a decrease in the noise level and the administrative functions in the control room. The use of self-assessment helped to anchor and define higher and more consistent standards. The proof of Palo Verde's success was evident when an Institute of Nuclear Power Operations finding was turned into a strength within one year.
Wide baseline stereo matching based on double topological relationship consistency
Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang
2009-07-01
Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching based on a novel scheme called double topological relationship consistency (DCTR). The combined double topological configuration includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, overcoming many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown in which the two cameras are located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. With this method, correspondences with high precision can be obtained on wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.
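The abstract leans on RANSAC for recovering epipolar geometry. The sketch below is a generic RANSAC skeleton, using 2D line fitting as a stand-in model; it is not the paper's DCTR algorithm, and all data are synthetic.

```python
import random

def ransac_line(points, iters=200, tol=0.1, seed=0):
    """Generic RANSAC: fit y = a*x + b, keeping the largest consensus of
    inliers. A stand-in for how epipolar geometry is robustly estimated
    from putative feature matches; here the 'model' is just a 2D line."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)  # minimal sample
        if x1 == x2:
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Synthetic data: points on y = 2x + 1 plus two gross outliers (mismatches)
pts = [(x / 10, 2 * (x / 10) + 1) for x in range(10)] + [(0.5, 9.0), (0.2, -5.0)]
model, inliers = ransac_line(pts)
print(model, len(inliers))
```

The hypothesize-and-verify loop is what makes RANSAC robust to mismatches: gross outliers never dominate, because models drawn from them gather few inliers.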
Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control
Directory of Open Access Journals (Sweden)
Y.A. Ahmed
2015-09-01
In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. The consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution output. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After obtaining acceptable success rates, the trained networks are implemented in the free-running experiment system to judge the networks' real-time response for the Esso Osaka 3-m model ship. The networks' behaviour during these experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance for the final alignment of the ship with the actual pier is also discussed.
Consistency in color parameters of a commonly used shade guide.
Tashkandi, Esam
2010-01-01
The use of shade guides to assess the color of natural teeth subjectively remains one of the most common means of dental shade assessment. Any variation in the color parameters of different shade guides may have significant clinical implications, particularly since communication between the clinic and the dental laboratory is based on the shade guide designation. The purpose of this study was to investigate the consistency of the L∗a∗b∗ color parameters of a sample of a commonly used shade guide. The color parameters of a total of 100 VITAPAN Classical Vacuum shade guides (VITA Zahnfabrik, Bad Säckingen, Germany) were measured using an X-Rite ColorEye 7000A Spectrophotometer (Grand Rapids, Michigan, USA). Each shade guide consists of 16 tabs with different designations. Each shade tab was measured five times and the average values were calculated. The ΔE between the average L∗a∗b∗ value for each shade tab and the average of the 100 shade tabs of the same designation was calculated. Using Student's t-test, no significant differences were found within the measured sample. There is a high level of consistency in the color parameters of the measured VITAPAN Classical Vacuum shade guide sample.
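The ΔE comparison described above can be sketched as follows. The CIE76 formula is standard, but the L∗a∗b∗ readings here are hypothetical, not values from the study.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference between two L*a*b* triplets."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical averaged readings for one shade tab of a given designation
tab_mean = (78.2, 1.4, 16.9)    # one tab, mean of its 5 measurements
batch_mean = (78.0, 1.5, 17.2)  # mean over all 100 tabs of that designation

de = delta_e_ab(tab_mean, batch_mean)
print(f"dE*ab = {de:.2f}")      # values well below ~1 are barely perceptible
```

A small ΔE between a tab and the batch mean, as in this invented example, is what "high consistency" amounts to numerically.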
Consistent Kaluza-Klein truncations via exceptional field theory
Energy Technology Data Exchange (ETDEWEB)
Hohm, Olaf [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States); Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS,École Normale Supérieure de Lyon, 46, allée d’Italie, F-69364 Lyon cedex 07 (France)
2015-01-26
We present the generalized Scherk-Schwarz reduction ansatz for the full supersymmetric exceptional field theory in terms of group-valued twist matrices subject to consistency equations. With this ansatz the field equations precisely reduce to those of lower-dimensional gauged supergravity parametrized by an embedding tensor. We explicitly construct a family of twist matrices as solutions of the consistency equations. They induce gauged supergravities with gauge groups SO(p,q) and CSO(p,q,r). Geometrically, they describe compactifications on internal spaces given by spheres and (warped) hyperboloids H^{p,q}, thus extending the applicability of generalized Scherk-Schwarz reductions beyond homogeneous spaces. Together with the dictionary that relates exceptional field theory to D=11 and IIB supergravity, respectively, the construction defines an entire new family of consistent truncations of the original theories. These include not only compactifications on spheres of different dimensions (such as AdS_5×S^5), but also various hyperboloid compactifications giving rise to a higher-dimensional embedding of supergravities with non-compact and non-semisimple gauge groups.
Marginal Consistency: Upper-Bounding Partition Functions over Commutative Semirings.
Werner, Tomás
2015-07-01
Many inference tasks in pattern recognition and artificial intelligence lead to partition functions in which addition and multiplication are abstract binary operations forming a commutative semiring. By generalizing max-sum diffusion (one of the convergent message passing algorithms for approximate MAP inference in graphical models), we propose an iterative algorithm to upper bound such partition functions over commutative semirings. The iteration of the algorithm is remarkably simple: change any two factors of the partition function such that their product remains the same and their overlapping marginals become equal. In many commutative semirings, repeating this iteration for different pairs of factors converges to a fixed point in which the overlapping marginals of every pair of factors coincide. We call this state marginal consistency. During the iterations, an upper bound on the partition function monotonically decreases. This abstract algorithm unifies several existing algorithms, including max-sum diffusion and basic constraint propagation (or local consistency) algorithms in constraint programming. We further construct a hierarchy of marginal consistencies of increasingly higher levels and show that any such level can be enforced by adding identity factors of higher arity (order). Finally, we discuss instances of the framework for several semirings, including the distributive lattice and the max-sum and sum-product semirings.
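A minimal sum-product instance of the iteration described above, for just two factors sharing one variable. This is an illustrative simplification: the paper's algorithm works over general commutative semirings and arbitrary factor sets, and the data here are random.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two factors sharing variable y: f(x, y) and g(y, z), sum-product semiring.
f = rng.uniform(0.5, 2.0, size=(3, 4))
g = rng.uniform(0.5, 2.0, size=(4, 5))
Z0 = np.einsum('xy,yz->', f, g)  # partition function Z = sum_{x,y,z} f*g

# Iteration: rescale both factors over the shared variable y so that their
# pointwise product f(x,y)*g(y,z) is unchanged (s(y) cancels) while their
# overlapping marginals over y become equal.
for _ in range(50):
    mf = f.sum(axis=0)           # marginal of f over y
    mg = g.sum(axis=1)           # marginal of g over y
    s = np.sqrt(mg / mf)
    f *= s[None, :]
    g /= s[:, None]

assert np.allclose(f.sum(axis=0), g.sum(axis=1))   # marginal consistency
assert np.isclose(np.einsum('xy,yz->', f, g), Z0)  # partition function intact
print("marginals coincide; Z unchanged")
```

With only two factors the geometric-mean update equalizes the marginals in a single step; with many overlapping factors, sweeping over pairs drives the system toward the marginal-consistency fixed point.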
Consistency relation in power law G-inflation
International Nuclear Information System (INIS)
Unnikrishnan, Sanil; Shankaranarayanan, S.
2014-01-01
In the standard inflationary scenario based on a minimally coupled scalar field, canonical or non-canonical, the subluminal propagation speed of scalar perturbations ensures the following consistency relation: r ≤ −8n_T, where r is the tensor-to-scalar ratio and n_T is the spectral index of tensor perturbations. However, it has recently been demonstrated that this consistency relation can be violated in Galilean inflation models even in the absence of superluminal propagation of scalar perturbations. It is therefore interesting to investigate whether the subluminal propagation of scalar field perturbations imposes any bound on the ratio r/|n_T| in G-inflation models. In this paper, we derive the consistency relation for a class of G-inflation models that lead to power law inflation. Within this class of models, it turns out that one can have r > −8n_T or r ≤ −8n_T depending on the model parameters. However, the subluminal propagation speed of scalar field perturbations, as required by causality, restricts r ≤ −(32/3)n_T.
On Consistency Test Method of Expert Opinion in Ecological Security Assessment.
Gong, Zaiwu; Wang, Lihong
2017-09-04
Ecological security assessment is of great value for proactive human security management and safety early warning. In the comprehensive evaluation of regional ecological security with the participation of experts, each expert's individual judgment level and ability, together with the consistency of the experts' overall opinion, have a very important influence on the evaluation result. This paper studies consistency and consensus measures based on the multiplicative and additive consistency properties of fuzzy preference relations (FPRs). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure computed as the distance between the original individual judgment and the optimal individual estimate, along with a consensus measure computed as the distance between the original collective judgment and the optimal collective estimate. Finally, we present a case study on the ecological security of five cities. The results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgments and the consensus degree of collective judgments.
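A toy version of the additive-consistency distance might look as follows. The row-mean weight estimate is a crude stand-in for the paper's optimization models, and the FPR values are invented.

```python
import numpy as np

# Hypothetical fuzzy preference relation (FPR) from one expert: P[i, j] is
# the degree to which alternative i is preferred to j, with P[i,j]+P[j,i]=1.
P = np.array([[0.5, 0.7, 0.8],
              [0.3, 0.5, 0.6],
              [0.2, 0.4, 0.5]])

# Additively consistent estimate: p_ij = 0.5 + 0.5 * (w_i - w_j).
# Row means serve as a simple (assumed) priority-weight estimate here.
w = P.mean(axis=1)
w = w - w.mean()               # center: only weight differences matter
P_hat = 0.5 + 0.5 * (w[:, None] - w[None, :])

# Consistency measure: distance between the original judgment and the
# consistent estimate (smaller distance = more consistent expert).
consistency_distance = np.abs(P - P_hat).mean()
print(f"consistency distance = {consistency_distance:.4f}")
```

The same distance idea applied between the collective judgment and its optimal consistent estimate gives the consensus measure.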
Krix, Alana C; Sauerland, Melanie; Lorei, Clemens; Rispens, Imke
2015-01-01
In the legal system, inconsistencies in eyewitness accounts are often used to discredit witnesses' credibility. This is at odds with research findings showing that witnesses frequently report reminiscent details (details previously unrecalled) at an accuracy rate that is nearly as high as for consistently recalled information. The present study sought to put the validity of beliefs about recall consistency to the test by directly comparing them with actual memory performance across two recall attempts. All participants watched a film of a staged theft. Subsequently, the memory group (N = 84) provided one statement immediately after the film (either with the Self-Administered Interview or free recall) and one after a one-week delay. The estimation group (N = 81), consisting of experienced police detectives, estimated the recall performance of the memory group. The results showed that actual recall performance was consistently underestimated. The estimation group also assumed a sharp decline in memory performance between recall attempts, whereas actual accuracy remained stable. While reminiscent details were almost as accurate as consistent details, they were estimated to be much less accurate than consistent information and as inaccurate as direct contradictions. The police detectives expressed great concern that reminiscence was the result of suggestive external influences. In conclusion, experienced police detectives appear to hold many implicit beliefs about recall consistency that do not correspond with actual recall performance. Recommendations for police training are provided. These aim at fostering a differentiated view of eyewitness performance and the inclusion of more comprehensive classes on human memory structure.
International Nuclear Information System (INIS)
Saraiva, Fabio Petersen; Costa, Patricia Grativol; Inomata, Daniela Lumi; Melo, Carlos Sergio Nascimento; Helal Junior, John; Nakashima, Yoshitaka
2007-01-01
Objectives: To investigate optical coherence tomography consistency on foveal thickness, foveal volume, and macular volume measurements in patients with and without diffuse diabetic macular edema. Introduction: Optical coherence tomography represents an objective technique that provides cross-sectional tomographs of retinal structure in vivo. However, it is expected that poor fixation ability, as seen in diabetic macular edema, could alter its results. Several authors have discussed the reproducibility of optical coherence tomography, but only a few have addressed the topic with respect to diabetic maculopathy. Methods: The study recruited diabetic patients without clinically evident retinopathy (control group) and with diffuse macular edema (case group). Only one eye of each patient was evaluated. Five consecutive fast macular scans were taken using Ocular Coherence Tomography 3; the 6 mm macular map was chosen. The consistency in measurements of foveal thickness, foveal volume, and total macular volume for both groups was evaluated using Pearson's coefficient of variation. The t-test for independent samples was used to compare measurements of both groups. Results: Each group consisted of 20 patients. All measurements had a coefficient of variation of less than 10%. The most consistent parameter for both groups was the total macular volume. Discussion: Consistency in measurement is a mainstay of any test. A test is unreliable if its measurements cannot be correctly repeated. We found a good index of consistency, even considering patients with an unstable gaze. Conclusions: Optical coherence tomography is a consistent method for diabetic subjects with diffuse macular edema. (author)
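Pearson's coefficient of variation used in the study is simple to compute over repeated scans. The five readings below are hypothetical, not study data.

```python
import statistics

# Five consecutive fast-scan readings of total macular volume (hypothetical, mm^3)
scans = [7.18, 7.25, 7.21, 7.30, 7.16]

mean = statistics.mean(scans)
sd = statistics.stdev(scans)      # sample standard deviation
cv_percent = 100 * sd / mean      # Pearson's coefficient of variation

print(f"CV = {cv_percent:.2f}%")  # the study's consistency threshold: CV < 10%
```

A CV under 10% across the five scans is what the study reports as "consistent" for a given parameter.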
Development of a Consistent and Reproducible Porcine Scald Burn Model
Kempf, Margit; Kimble, Roy; Cuttle, Leila
2016-01-01
There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10-minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5-second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153
Flood damage curves for consistent global risk assessments
de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek
2016-04-01
Assessing the potential damage of flood events is an important component of flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves, based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, because different countries employ different methodologies in their damage models, damage assessments cannot be directly compared with each other, obstructing supra-national flood damage assessments as well. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth, as well as maximum damage values for a variety of assets and land-use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey, concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building materials, etc. This dataset can be used for consistent supra-national flood risk assessments.
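A minimal sketch of how such a depth-damage curve is applied: interpolate the damage fraction at the flood depth, then scale by the country-level maximum damage value. The curve points, maximum damage value, and building area below are all assumed for illustration.

```python
import numpy as np

# A concave depth-damage curve (hypothetical values): damage fraction vs depth (m)
depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 6.0])
fractions = np.array([0.00, 0.25, 0.40, 0.60, 0.75, 1.00])  # residential class

max_damage_per_m2 = 600.0   # country-specific maximum damage (assumed, EUR/m^2)
exposed_area_m2 = 120.0     # footprint of one residential building (assumed)

def flood_damage(water_depth_m):
    """Interpolate the damage fraction and scale by the maximum damage value."""
    frac = np.interp(water_depth_m, depths, fractions)
    return frac * max_damage_per_m2 * exposed_area_m2

print(flood_damage(1.5))    # 0.5 fraction * 600 EUR/m^2 * 120 m^2
```

Separating the curve shape (per continent) from the maximum damage value (per country) is what makes the dataset's assessments comparable across borders.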
Coagulation of Agglomerates Consisting of Polydisperse Primary Particles.
Goudeli, E; Eggersdorfer, M L; Pratsinis, S E
2016-09-13
The ballistic agglomeration of polydisperse particles is investigated by an event-driven (ED) method and compared to the coagulation of spherical particles and agglomerates consisting of monodisperse primary particles (PPs). It is shown for the first time to our knowledge that increasing the width or polydispersity of the PP size distribution initially accelerates the coagulation rate of their agglomerates but delays the attainment of their asymptotic fractal-like structure and self-preserving size distribution (SPSD) without altering them, provided that sufficiently large numbers of PPs are employed. For example, the standard asymptotic mass fractal dimension, Df, of 1.91 is attained when clusters are formed containing, on average, about 15 monodisperse PPs, consistent with fractal theory and the literature. In contrast, when polydisperse PPs with a geometric standard deviation of 3 are employed, about 500 PPs are needed to attain that Df. Even though the same asymptotic Df and mass-mobility exponent, Dfm, are attained regardless of PP polydispersity, the asymptotic prefactors or lacunarities of Df and Dfm increase with PP polydispersity. For monodisperse PPs, the average agglomerate radius of gyration, rg, becomes larger than the mobility radius, rm, when agglomerates consist of more than 15 PPs. Increasing PP polydispersity increases that number of PPs similarly to the above for the attainment of the asymptotic Df or Dfm. The agglomeration kinetics are quantified by the overall collision frequency function. When the SPSD is attained, the collision frequency is independent of PP polydispersity. Accounting for the SPSD polydispersity in the overall agglomerate collision frequency is in good agreement with that frequency from detailed ED simulations once the SPSD is reached. Most importantly, the coagulation of agglomerates is described well by a monodisperse model for agglomerate and PP sizes, whereas the detailed agglomerate size distribution can be obtained by
Rape Myth Consistency and Gender Differences in Perceiving Rape Victims: A Meta-Analysis.
Hockett, Jericho M; Smith, Sara J; Klausing, Cathleen D; Saucier, Donald A
2016-02-01
An overview discusses feminist analyses of oppression, attitudes toward rape victims, and previously studied predictors of individuals' attitudes toward rape victims. To better understand such attitudes, this meta-analysis examines how the rape myth consistency of various victim, perpetrator, and crime characteristics moderates gender differences in individuals' perceptions of rape victims (i.e., victim responsibility and blame attributions and rape-minimizing attitudes). Consistent with feminist theoretical predictions, results indicated that, overall, men perceived rape victims more negatively than women did. However, this gender difference was moderated by the rape myth consistency within the rape vignettes. Implications for research are discussed. © The Author(s) 2015.
Self-consistent tight-binding model of B and N doping in graphene
DEFF Research Database (Denmark)
Pedersen, Thomas Garm; Pedersen, Jesper Goor
2013-01-01
Boron and nitrogen substitutional impurities in graphene are analyzed using a self-consistent tight-binding approach. An analytical result for the impurity Green's function is derived taking broken electron-hole symmetry into account and validated by comparison to numerical diagonalization. The impurity potential depends sensitively on the impurity occupancy, leading to a self-consistency requirement. We solve this problem using the impurity Green's function and determine the self-consistent local density of states at the impurity site and, thereby, identify acceptor and donor energy resonances.
Studies on the consistency of internally taken contrast medium for pancreas CT
Energy Technology Data Exchange (ETDEWEB)
Matsushima, Kishio; Mimura, Seiichi; Tahara, Seiji; Kitayama, Takuichi; Inamura, Keiji; Mikami, Yasutaka; Hashimoto, Keiji; Hiraki, Yoshio; Aono, Kaname
1985-02-01
A problem in pancreatic CT scanning is discriminating between the pancreas and the adjacent gastrointestinal tract. Generally, a dilution of gastrografin is administered internally to make this discrimination possible. The degree of dilution has traditionally been decided by experience at each hospital. When the concentration of the contrast medium is too low, an enhancement effect cannot be expected, but when it is too high, artifacts appear. We have experimented with the degree of dilution and the CT number to determine the optimum concentration of gastrografin for the diagnosis of pancreatic disease. Statistical analysis of the results shows the optimum dilution of gastrografin to be 1.5%.
DEFF Research Database (Denmark)
Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa
2001-01-01
Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered, and the linear response function was shown to be convergent in the whole frequency region. This was achieved through the incorporation of phenomenological damping factors that lead to complex response function values.
Self-consistent theory of a harmonic gyroklystron with a minimum Q cavity
International Nuclear Information System (INIS)
Tran, T.M.; Kreischer, K.E.; Temkin, R.J.
1986-01-01
In this paper, the energy extraction stage of the gyroklystron [in Advances in Electronics and Electron Physics, edited by C. Marton (Academic, New York, 1979), Vol. 1, pp. 1--54], with a minimum Q cavity is investigated by using a self-consistent radio-frequency (rf) field model. In the low-field, low-current limit, expressions for the self-consistent field and the resulting energy extraction efficiency are derived analytically for an arbitrary cyclotron harmonic number. To our knowledge, these are the first analytic results for the self-consistent field structure and efficiency of a gyrotron device. The large signal regime analysis is carried out by numerically integrating the coupled self-consistent equations. Several examples in this regime are presented
Consistency of FMEA used in the validation of analytical procedures
DEFF Research Database (Denmark)
Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten
2011-01-01
In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection … is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating …
Self-consistent potential variations in magnetic wells
International Nuclear Information System (INIS)
Kesner, J.; Knorr, G.; Nicholson, D.R.
1981-01-01
Self-consistent electrostatic potential variations are considered in a spatial region of weak magnetic field, as in the proposed tandem mirror thermal barriers (with no trapped ions). For some conditions, equivalent to ion distributions with a sufficiently high net drift speed along the magnetic field, the desired potential depressions are found. When the net drift speed is not high enough, potential depressions are found only in combination with strong electric fields on the boundaries of the system. These potential depressions are not directly related to the magnetic field depression. (author)
Applicability of self-consistent mean-field theory
International Nuclear Information System (INIS)
Guo Lu; Sakata, Fumihiko; Zhao Enguang
2005-01-01
Within the constrained Hartree-Fock (CHF) theory, an analytic condition is derived to estimate whether the concept of a self-consistent mean field is realized in the level-repulsive region. The derived condition states that an iterative calculation of the CHF equation does not converge when the quantum fluctuations, coming from the two-body residual interaction and quadrupole deformation, become larger than the single-particle energy difference between two avoided-crossing orbits. By means of numerical calculation, it is shown that the analytic condition works well for a realistic case.
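The convergence condition above has a simple schematic analogue: a fixed-point iteration converges only while the ratio of the coupling strength to the level spacing stays below one. The toy iteration below (a linear stand-in, not the actual CHF equations) makes the g/Δ < 1 threshold explicit:

```python
def self_consistent_iteration(g, delta, b=0.1, x0=0.0, tol=1e-10, max_iter=200):
    """Toy fixed-point iteration x <- b + (g/delta) * x.

    Schematic analogue of a self-consistent mean-field loop: g plays the
    role of the residual-interaction (fluctuation) strength and delta the
    single-particle level spacing. The map is a contraction, and the
    iteration converges, iff g/delta < 1.
    Returns (final value, converged flag).
    """
    x = x0
    for _ in range(max_iter):
        x_new = b + (g / delta) * x
        if abs(x_new - x) < tol:
            return x_new, True
        x = x_new
    return x, False
```

For g/delta = 0.5 the iteration settles on the fixed point x* = b / (1 - g/delta); for g/delta = 2 successive iterates grow without bound, mirroring the non-convergence regime the abstract describes.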
Island of stability for consistent deformations of Einstein's gravity.
Berkhahn, Felix; Dietrich, Dennis D; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin
2012-03-30
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incorporate a background curvature induced self-stabilizing mechanism. Self-stabilization is essential in order to guarantee hyperbolic evolution in and unitarity of the covariantized theory, as well as the deformation's uniqueness. We show that the deformation's parameter space contains islands of absolute stability that are persistent through the entire cosmic evolution.
The self-consistent dynamic pole tide in global oceans
Dickman, S. R.
1985-01-01
The dynamic pole tide is characterized self-consistently by introducing a single nondifferential matrix equation, compatible with the Liouville equation, that models the ocean as global and of uniform depth. Deviations of the theory from the realistic ocean, associated with the nonglobality of the latter, are also considered, with the inference that in realistic oceans long-period resonance modes would be increasingly likely to exist. Analysis of the nature of the pole tide and its effects on the Chandler wobble indicates that departures of the pole tide from equilibrium may indeed be minimal.
Simplified models for dark matter face their consistent completions
Energy Technology Data Exchange (ETDEWEB)
Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel
2017-03-01
Simplified dark matter models have recently been advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent ${SU(2)_{\mathrm{L}} \times U(1)_{\mathrm{Y}}}$ gauge-invariant completions. We discuss the key physics that simplified models fail to capture and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide sensitivities competitive with standard mono-jet analyses at the 13 TeV LHC.
Correlations and self-consistency in pion scattering. II
International Nuclear Information System (INIS)
Johnson, M.B.; Keister, B.D.
1978-01-01
In an attempt to overcome certain difficulties of summing higher order processes in pion multiple scattering theories, a new, systematic expansion for the interaction of a pion in nuclear matter is derived within the context of the Foldy-Walecka theory, incorporating nucleon-nucleon correlations and an idea of self-consistency. The first two orders in the expansion are evaluated as a function of the nonlocality range; the expansion appears to be rapidly converging, in contrast to expansion schemes previously examined. (Auth.)
Quark mean field theory and consistency with nuclear matter
International Nuclear Information System (INIS)
Dey, J.; Dey, M.; Frederico, T.; Tomio, L.
1990-09-01
A $1/N_c$ expansion in QCD (with $N_c$ the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a $\sigma$ field has to be introduced to ensure chiral symmetry breaking ($\chi$SB). It is found that nuclear matter properties can be used to pin down the $\chi$SB modelling. All masses, $M_N$, $m_\sigma$, $m_\omega$, are found to scale with density. The equations are solved self-consistently. (author). 29 refs, 2 tabs
A self-consistent model of an isothermal tokamak
McNamara, Steven; Lilley, Matthew
2014-10-01
Continued progress in liquid lithium coating technologies has made the development of a beam-driven tokamak with minimal edge recycling a feasible possibility. Such devices are characterised by improved confinement due to their inherent stability and the suppression of thermal conduction. Particle and energy confinement become intrinsically linked, and the plasma thermal energy content is governed by the injected beam. A self-consistent model of a purely beam-fuelled isothermal tokamak is presented, including calculations of the density profile, bulk species temperature ratios and the fusion output. Stability considerations constrain the operating parameters; regions of stable operation are identified and their suitability for potential reactor applications discussed.
Poisson solvers for self-consistent multi-particle simulations
International Nuclear Information System (INIS)
Qiang, J; Paret, S
2014-01-01
Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space-charge effects in high-intensity beams. The Poisson equation has to be solved at each time step from the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of these methods is O(N log(N)) or O(N) instead of O(N²), where N is the total number of grid points used to solve the Poisson equation.
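The O(N log N) scaling quoted above is typical of spectral (FFT-based) Poisson solvers. As a minimal sketch, assuming a periodic square domain and a charge-density-like source (the actual beam-dynamics codes use open or conducting boundary conditions and more elaborate Green-function techniques), one can solve ∇²φ = -ρ by dividing by -k² in Fourier space:

```python
import numpy as np

def solve_poisson_periodic(rho, L=1.0):
    """Solve laplacian(phi) = -rho on a periodic n x n grid via FFT.

    Cost is dominated by the FFTs, i.e. O(N log N) in the number of
    grid points N = n*n. The k = 0 mode is set to zero, fixing the
    mean of phi to zero (the potential is defined up to a constant).
    """
    n = rho.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    rho_hat = np.fft.fft2(rho)
    phi_hat = np.zeros_like(rho_hat)
    nonzero = k2 > 0
    # In Fourier space, -k^2 * phi_hat = -rho_hat  =>  phi_hat = rho_hat / k^2.
    phi_hat[nonzero] = rho_hat[nonzero] / k2[nonzero]
    return np.real(np.fft.ifft2(phi_hat))
```

A quick check: for φ(x, y) = sin(2πx/L), the source is ρ = (2π/L)² sin(2πx/L), and the solver recovers φ to machine precision on the grid.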