WorldWideScience

Sample records for two-part designation consisting

  1. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  2. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...

  3. Designing apps for success developing consistent app design practices

    CERN Document Server

    David, Matthew

    2014-01-01

    In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: Apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development, in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently

  4. Use of two-part regression calibration model to correct for measurement error in episodically consumed foods in a single-replicate study design: EPIC case study.

    Science.gov (United States)

    Agogo, George O; van der Voet, Hilko; van't Veer, Pieter; Ferrari, Pietro; Leenders, Max; Muller, David C; Sánchez-Cantalejo, Emilio; Bamia, Christina; Braaten, Tonje; Knüppel, Sven; Johansson, Ingegerd; van Eeuwijk, Fred A; Boshuizen, Hendriek

    2014-01-01

    In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. Regression calibration is commonly used to adjust for this attenuation, but it requires unbiased reference measurements. Short-term reference measurements for foods that are not consumed daily contain excess zeroes, which pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We show how to handle excess-zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates in the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart using vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately threefold increase in the strength of the association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model, and that the extent of error adjustment is influenced by the number and forms of covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
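The two-step structure described above can be sketched as follows. This is an illustrative sketch, not the EPIC implementation: part 1 models the probability of any consumption on the reference day, part 2 models the amount among consumers, and the calibrated intake is the product of the two predictions. All variable names and simulated data are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(0)
n = 2000
ffq = rng.gamma(2.0, 50.0, n)             # self-reported intake (hypothetical units)
age = rng.uniform(35, 70, n)
X = np.column_stack([ffq, age])

# Simulated 24-hour recall with excess zeroes (the "reference" measurement)
p_consume = 1 / (1 + np.exp(-(0.01 * ffq - 1.0)))
consumed = rng.random(n) < p_consume
amount = np.where(consumed, rng.gamma(2.0, ffq / 2 + 10), 0.0)

# Part 1: probability of any consumption
part1 = LogisticRegression(max_iter=1000).fit(X, consumed)
# Part 2: consumed amount, fit only on the non-zero recalls (log scale for skew)
part2 = LinearRegression().fit(X[consumed], np.log(amount[consumed]))

# Calibrated (expected) intake for each subject
prob = part1.predict_proba(X)[:, 1]
expected_amount = np.exp(part2.predict(X))   # ignores back-transform bias for brevity
calibrated = prob * expected_amount
print(calibrated[:5])
```

A full implementation would also correct the log-scale back-transformation (e.g. with a smearing estimator) and carry the calibrated intake into the disease model.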

  5. Consistent approach to air-cleaning system duct design

    International Nuclear Information System (INIS)

    Miller, W.H.; Ornberg, S.C.; Rooney, K.L.

    1981-01-01

    Nuclear power plant air-cleaning system effectiveness is dependent on the capability of a duct system to safely convey contaminated gas to a filtration unit and subsequently to a point of discharge. This paper presents a logical and consistent design approach for selecting sheet metal ductwork construction to meet applicable criteria. The differences in design engineers' duct construction specifications are acknowledged. Typical duct construction details and suggestions for their effective use are presented. Improvements in duct design sections of ANSI/ASME N509-80 are highlighted. A detailed leakage analysis of a control room HVAC system is undertaken to illustrate the effects of conceptual design variations on duct construction requirements. Shortcomings of previously published analyses and interpretations of a current standard are included

  6. Design of a Turbulence Generator of Medium Consistency Pulp Pumps

    Directory of Open Access Journals (Sweden)

    Hong Li

    2012-01-01

    The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. The structural sizes of the generator affect the hydraulic performance; the radius and the blade laying angle are two important ones. Starting with research on the internal flow and shearing characteristics of MC pulp, a simple mathematical model of the flow section of the shearing chamber is built, and the formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a similar shape to the turbulence generator, and CFD simulation is applied to study the flow fields at different blade laying angles. The recommended blade laying angle of the turbulence generator is found to be between 60° and 75°.

  7. Neutronic data consistency analysis for lithium blanket and shield design

    International Nuclear Information System (INIS)

    Reupke, W.A.; Muir, D.W.

    1976-01-01

    Using a compact least-squares treatment, we analyze the consistency of evaluated cross sections with calculated and measured tritium production in ⁿLi and ⁷Li detectors embedded in a 14-MeV-neutron-driven ⁿLiD sphere. The tritium-production experimental error matrix is evaluated and an initial reduced χ² of 3.0 is calculated. A perturbation calculation of the tritium-production cross-section sensitivities is performed with secondary neutron energy and angular distributions held constant. The cross-section error matrix is evaluated from the external consistency of available cross-section measurements. A statistical adjustment of the combined data yields a reduced χ² of 2.3 and represents a tenfold improvement in statistical likelihood. The improvement is achieved by a decrease in the ⁷Li(n,xt) 14-MeV group cross section from 328 mb to 284 mb and an adjustment of the ⁿLi data closer to calculated values. The uncertainty in the tritium breeding ratio in pure ⁷LiD is reduced by one-fifth.

  8. Consistent HYLIFE wall design that withstands transient loading conditions

    International Nuclear Information System (INIS)

    Pitts, J.H.

    1980-10-01

    The design for a first structural wall (FSW) promises to satisfy the impact and thermal stress loads for the 30-year lifetime anticipated for the HYLIFE reaction chamber. The FSW is a 50-mm-thick cylindrical plate 10 m in diameter; it can withstand a rapidly varying liquid-metal impact stress up to a peak of 60 MPa, combined with slowly varying thermal stresses up to 86 MPa. We selected 2¼ Cr-1 Mo ferritic steel as the structural material because it has adequate fatigue properties and yield strength at the peak operating temperature of 810 K, is compatible with liquid lithium, and has good neutron activation characteristics.

  9. Measuring consistency of web page design and its effects on performance and satisfaction.

    Science.gov (United States)

    Ozok, A A; Salvendy, G

    2000-04-01

    This study examines methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of world-wide web (WWW) users. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW, while a web page refers to a single screen that indicates a specific address on the WWW. This study tested a series of web pages that were mostly hyperlinked; therefore, the term 'web page' has been adopted for the objects whose features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency: physical, communicational, and conceptual; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypotheses that each of the three elements of consistency contributes significantly to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.

  10. Road Service Performance Based On Integrated Road Design Consistency (IC) Along Federal Road F0023

    Directory of Open Access Journals (Sweden)

    Zainal Zaffan Farhana

    2017-01-01

    Road accidents are one of the world’s largest public health and injury prevention problems. In Malaysia, the west coast has been reported to have the highest motorcycle fatalities, and road accidents are one of the leading causes of death and injury in the country. The most common fatal accident is between a motorcycle and a passenger car. Most of the fatal accidents happened on Federal roads, with 44 fatal accidents reported, equal to 29%. A lack of road geometric design consistency, where drivers make errors because of the road's geometric features, has kept accident numbers rising in Malaysia. Hence, models based on operating speed are used to calculate the design consistency of a road. Speed profiles were obtained as continuous speed profiles from GPS data, and continuous operating speed profile models were plotted based on the operating speed model (85th percentile). The study was conducted on F0023 from km 16 to km 20. The purpose of design consistency analysis is to establish the relationship between operating speed and the elements of geometric design on the road. As a result, the integrated design consistency for motorcycles and cars along the F0023 segment falls below the threshold, indicating poor design quality for both motorcycles and cars.

  11. New geometric design consistency model based on operating speed profiles for road safety evaluation.

    Science.gov (United States)

    Camacho-Torregrosa, Francisco J; Pérez-Zuriaga, Ana M; Campoy-Ungría, J Manuel; García-García, Alfredo

    2013-12-01

    To assist in the on-going effort to reduce road fatalities as much as possible, this paper presents a new methodology to evaluate road safety in both the design and redesign stages of two-lane rural highways. This methodology is based on the analysis of road geometric design consistency, a value which will be a surrogate measure of the safety level of the two-lane rural road segment. The consistency model presented in this paper is based on the consideration of continuous operating speed profiles. The models used for their construction were obtained by using an innovative GPS-data collection method that is based on continuous operating speed profiles recorded from individual drivers. This new methodology allowed the researchers to observe the actual behavior of drivers and to develop more accurate operating speed models than was previously possible with spot-speed data collection, thereby enabling a more accurate approximation to the real phenomenon and thus a better consistency measurement. Operating speed profiles were built for 33 Spanish two-lane rural road segments, and several consistency measurements based on the global and local operating speed were checked. The final consistency model takes into account not only the global dispersion of the operating speed, but also some indexes that consider both local speed decelerations and speeds over posted speeds as well. For the development of the consistency model, the crash frequency for each study site was considered, which allowed estimating the number of crashes on a road segment by means of the calculation of its geometric design consistency. Consequently, the presented consistency evaluation method is a promising innovative tool that can be used as a surrogate measure to estimate the safety of a road segment. Copyright © 2012 Elsevier Ltd. All rights reserved.
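The core computation behind continuous operating-speed profiles can be sketched as follows. This is a hedged illustration, not the paper's model: speeds from many individual drivers are collected along the segment, and the 85th percentile (V85) is taken at each station; the station layout and speed values are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
n_drivers, n_stations = 50, 40
stations = np.linspace(0, 2000, n_stations)          # metres along the segment

# Simulated GPS speed traces: drivers slow down through a curve mid-segment
base = 90 - 25 * np.exp(-((stations - 1000) / 200) ** 2)
speeds = base + rng.normal(0, 6, (n_drivers, n_stations))  # km/h

v85 = np.percentile(speeds, 85, axis=0)              # operating-speed profile

# One simple global consistency indicator: dispersion of V85 along the segment
sigma_v85 = v85.std()
print(f"max V85 drop through the curve: {v85.max() - v85.min():.1f} km/h")
print(f"V85 dispersion along segment: {sigma_v85:.1f} km/h")
```

Local measures such as speed drops between successive elements, or speed over the posted limit, can be derived from the same `v85` array.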

  12. BSDB: A New Consistent Designation Scheme for Identifying Objects in Binary and Multiple Stars

    Directory of Open Access Journals (Sweden)

    Kovaleva D. A.

    2015-06-01

    Full Text Available The new consistent scheme for designation of objects in binary and multiple systems, BSDB, is described. It was developed in the frame of the Binary star DataBase, BDB (http://www.inasan.ru, due to necessity of a unified and consistent system for designation of objects in the database, and the name of the designation scheme was derived from that of the database. The BSDB scheme covers all types of observational data. Three classes of objects introduced within the BSDB nomenclature provide correct links between objects and data, what is especially important for complex multiple stellar systems. The final stage of establishing the BSDB scheme is compilation of the Identification List of Binaries, ILB, where all known objects in binary and multiple stars are presented with their BSDB identifiers along with identifiers according to major catalogues and lists.

  13. Using network screening methods to determine locations with specific safety issues: A design consistency case study.

    Science.gov (United States)

    Butsick, Andrew J; Wood, Jonathan S; Jovanis, Paul P

    2017-09-01

    The Highway Safety Manual provides multiple methods that can be used to identify sites with promise (SWiPs) for safety improvement. However, most of these methods cannot be used to identify sites with specific problems. Furthermore, given that infrastructure funding is often earmarked for specific problems/programs, a method for identifying SWiPs related to those programs would be very useful. This research establishes a method for identifying SWiPs with specific issues, accomplished using two safety performance functions (SPFs). The method is applied to identifying SWiPs with geometric design consistency issues. Mixed-effects negative binomial regression was used to develop two SPFs from 5 years of crash data covering over 8754 km of two-lane rural roadway. The first SPF contained typical roadway elements, while the second contained additional geometric design consistency parameters. After empirical Bayes adjustments, SWiPs were identified. The disparity between SWiPs identified by the two SPFs was evident: 40 unique sites were identified by each model out of the top 220 segments. By comparing sites across the two models, candidate road segments can be identified where a lack of design consistency may be contributing to an increase in expected crashes. Practitioners can use this method to more effectively identify roadway segments suffering from reduced safety performance due to geometric design inconsistency, with detailed engineering studies of identified sites required to confirm the initial assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
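The empirical Bayes adjustment mentioned above combines the SPF prediction with the observed crash count, weighted by the negative binomial dispersion parameter. The sketch below follows the standard HSM-style EB form; the site values and dispersion parameter are illustrative, not from the paper.

```python
import numpy as np

def eb_expected(mu_spf, observed, k):
    """EB-adjusted expected crashes.
    mu_spf   : SPF prediction over the study period
    observed : observed crash count over the same period
    k        : NB inverse dispersion parameter of the SPF
    """
    w = 1.0 / (1.0 + mu_spf / k)          # weight given to the SPF prediction
    return w * mu_spf + (1.0 - w) * observed

mu = np.array([2.1, 4.0, 1.2, 6.5])       # SPF predictions (crashes / 5 yr)
obs = np.array([5, 3, 4, 9])              # observed crashes
eb = eb_expected(mu, obs, k=2.0)

# Rank sites by potential for safety improvement (EB estimate minus prediction)
psi = eb - mu
print(np.argsort(psi)[::-1])              # site indices, worst first -> [3 0 2 1]
```

Running the two SPFs through this ranking and comparing the resulting top lists is what isolates segments where the consistency parameters change the picture.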

  14. Consistent natural phenomena design and evaluation guidelines for U.S. Department of Energy facilities

    International Nuclear Information System (INIS)

    Murray, R.C.; Short, S.A.

    1989-01-01

    Uniform design and evaluation guidelines for protection against natural phenomena hazards such as earthquakes, extreme winds, and flooding for facilities at Department of Energy (DOE) sites throughout the United States have been developed. The guidelines apply to design of new facilities and to evaluation or modification of existing facilities. These guidelines are an approach for design or evaluation for mitigating the effects of natural phenomena hazards. These guidelines are intended to control the level of conservatism introduced in the design/evaluation process such that all hazards are treated on a reasonably consistent and uniform basis and such that the level of conservatism is appropriate for facility characteristics such as importance, cost, and hazards to on-site personnel, the general public, and the environment. The philosophy and goals of these guidelines are covered by this paper

  15. Design of next step tokamak: Consistent analysis of plasma performance flux composition and poloidal field system

    International Nuclear Information System (INIS)

    Ane, J.M.; Grandgirard, V.; Albajar, F.; Johner, J.

    2001-01-01

    A consistent and simple approach to deriving plasma scenarios for next-step tokamak design is presented. It is based on successive plasma equilibrium snapshots from plasma breakdown to the end of ramp-down. Temperature and density profiles for each equilibrium are derived from a 2D plasma model. The time interval between two successive equilibria is then computed from the toroidal magnetic energy balance, the resistive term of which depends on the n and T profiles. This approach provides a consistent analysis of plasma performance, flux consumption, and the PF system, including average voltage waveforms across the PF coils. The plasma model and the Poynting theorem for the toroidal magnetic energy are presented. Applications to ITER-FEAT and to M2, a Q=5 machine designed at CEA, are shown. (author)

  16. Design of micro distribution systems consisting of long channels with arbitrary cross sections

    International Nuclear Information System (INIS)

    Misdanitis, S; Valougeorgis, D

    2012-01-01

    Gas flows through long micro-channels of various cross sections have been extensively investigated over the years, both numerically and experimentally. In various technological applications, including microfluidics, these micro-channels are combined to form a micro-channel network. Computational algorithms for solving gas pipe networks in the hydrodynamic regime are well developed; however, corresponding tools for solving networks consisting of micro-channels under any degree of gas rarefaction are very limited. Recently, a kinetic algorithm was developed to simulate gas distribution systems consisting of long circular channels under any vacuum conditions. In the present work this algorithm is generalized and extended to micro-channels of arbitrary cross section etched by KOH in silicon (triangular and trapezoidal channels with an acute angle of 54.74°). Since a kinetic approach is implemented, the analysis is valid and the results are accurate over the whole range of the Knudsen number, while the computational effort involved is very small. This is achieved by integrating the kinetic results for the corresponding single channels into the general solver for designing the gas pipe network. To demonstrate the feasibility of the approach, two typical systems consisting of long rectangular and trapezoidal micro-channels are solved.
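The network-solver idea can be illustrated with a deliberately simplified sketch: each long channel between nodes i and j is reduced to a single conductance G (in the kinetic approach G would come from the tabulated single-channel flow rate at the local rarefaction), and the unknown node pressures follow from mass conservation at each junction. The topology and numbers below are invented for illustration.

```python
import numpy as np

# Nodes 0..3; nodes 0 and 3 have fixed pressures (inlet/outlet)
edges = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 0.5), (2, 3, 1.5)]  # (i, j, G)
p_fixed = {0: 100.0, 3: 10.0}                                  # Pa
unknown = [1, 2]

# Assemble the conservation equations sum_j G_ij (p_i - p_j) = 0 at unknown nodes
A = np.zeros((2, 2)); b = np.zeros(2)
idx = {n: k for k, n in enumerate(unknown)}
for i, j, G in edges:
    for a, other in ((i, j), (j, i)):
        if a in idx:
            A[idx[a], idx[a]] += G
            if other in idx:
                A[idx[a], idx[other]] -= G
            else:
                b[idx[a]] += G * p_fixed[other]

p = np.linalg.solve(A, b)
print(dict(zip(unknown, p.round(2))))
```

In the real kinetic solver the conductances depend on the (unknown) pressures through the rarefaction parameter, so this linear solve would sit inside an outer iteration.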

  17. A level playing field: Obtaining consistent cost estimates for advanced reactor designs

    International Nuclear Information System (INIS)

    Hudson, C.R. II; Rohm, H.H.; Humphreys, J.R. Jr.

    1987-01-01

    Rules and guidelines for developing cost estimates are given that provide a means of presenting cost estimates for advanced concepts on a consistent and equitable basis. For advanced reactor designs, the scope of a cost estimate includes the plant capital cost, the operating and maintenance cost, the fuel cycle cost, and the cost of decommissioning. Each element is subdivided as necessary to provide a common reporting format for all power plant concepts. The total generation cost is taken to be a suitable summary figure of merit. To test the application of the rules and guidelines and to develop reference costs for current technologies, cost estimates for several coal and pressurized water reactor plants of different sizes have been prepared.

  18. Progress towards developing consistent design and evaluation guidelines for DOE facilities subjected to natural phenomena hazards

    International Nuclear Information System (INIS)

    Kennedy, R.P.; Short, S.A.; McDonald, J.R.; McCann, M.W. Jr.; Reed, J.W.

    1985-01-01

    Probabilistic definitions of earthquake, wind and tornado natural phenomena hazards for many Department of Energy (DOE) facilities throughout the United States have been developed. In addition, definitions of the flood hazards which might affect these locations are currently being developed. The Department of Energy Natural Phenomena Hazards Panel is now preparing a document to provide guidance and criteria for DOE facility managers to assure that DOE facilities are adequately constructed to resist the effects of natural phenomena such as earthquake, strong wind and flood. The intent of this document is to provide instruction on how to utilize the hazard definitions to evaluate existing facilities and design new facilities in a manner such that the risk of adverse consequences is consistent with the cost, function, and danger to the public or environment of the facility. Potential effects on facilities of natural phenomena hazards are emphasized in this paper. The philosophy for mitigating these effects to be employed in the design and evaluation guidelines is also presented

  19. Teacher collaborative curriculum design in technical vocational colleges: a strategy for maintaining curriculum consistency?

    NARCIS (Netherlands)

    Albashiry, N.M.; Voogt, J.M.; Pieters, J.M.

    2015-01-01

    The Technical Vocational Education and Training (TVET) curriculum requires continuous renewal and constant involvement of stakeholders in the redesign process. Due to a lack of curriculum design expertise, TVET institutions in developing contexts encounter challenges maintaining and advancing the

  20. Maintaining Curriculum Consistency of Technical and Vocational Educational Programs through Teacher Design Teams

    NARCIS (Netherlands)

    Albashiry, Nabeel; Voogt, Joke; Pieters, Julius Marie

    2016-01-01

    Maintaining the quality and relevance of Technical Vocational Education and Training (TVET) curricula is a great challenge for TVET institutions in developing countries. One major challenge lies in the lack of curriculum design expertise of TVET academics. The purpose of this multiple-case study is

  1. A level playing field-obtaining consistent cost estimates for advanced reactor designs

    International Nuclear Information System (INIS)

    Hudson, C.R.; Rohm, H.H.; Humphreys, J.R.

    1987-01-01

    A level playing field in sports is necessary to avoid a situation in which a team has an unfair advantage over its competition. Similarly, rules and guidelines for developing cost estimates can be established which, in effect, provide a level playing field whereby cost estimates for advanced power plant concepts can be presented on a consistent and equitable basis. As an example, consider the capital costs shown in Table 1. Both sets of costs are for the exact same power plant; Estimate 1 is expressed in constant dollars while Estimate 2 is presented in nominal, or as-spent, dollars. As shown, the costs in Table 1 are not directly comparable. Similar problems can be introduced by differing assumptions on any number of parameters, including the scope of the cost estimate, inflation/escalation and interest rates, contingency costs, and site location. The motivation for having consistent cost estimates is, of course, to permit comparison among various concepts. As the U.S. Department of Energy sponsors research and development work on several advanced reactor concepts in which expected cost is a key evaluation parameter, the emphasis in this endeavor has been on promoting the comparability of advanced reactor cost estimates among themselves and with existing power plant types. To continue the analogy, the idea is to lay out the playing field and the rules of the contest such that each team participates in the match on an equal basis, with the final score determined solely by the inherent strengths and abilities of the teams. A description of the playing field and some of the more important rules follows.
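The constant-versus-nominal-dollar discrepancy described above can be made concrete with a short sketch: the same yearly expenditures expressed in constant (reference-year) dollars versus nominal (as-spent) dollars under an assumed escalation rate. The figures are hypothetical, not the paper's Table 1 values.

```python
constant_dollars = [100.0, 100.0, 100.0, 100.0]   # M$, reference-year terms
escalation = 0.06                                  # assumed annual escalation rate

# As-spent dollars: each year's expenditure inflated to the year it is spent
nominal = [c * (1 + escalation) ** yr for yr, c in enumerate(constant_dollars)]

print(f"constant-dollar total: {sum(constant_dollars):.1f} M$")   # 400.0 M$
print(f"as-spent total:        {sum(nominal):.1f} M$")            # 437.5 M$
```

The two totals describe the identical plant, which is exactly why a common dollar-year convention must be fixed before estimates can be compared.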

  2. Road Service Performance Based On Integrated Road Design Consistency (IC) Along Federal Road F0023

    OpenAIRE

    Zainal Zaffan Farhana; Prasetijo Joewono; Musa Wan Zahidah

    2017-01-01

    Road accidents are one of the world’s largest public health and injury prevention problems. In Malaysia, the west coast has been reported to have the highest motorcycle fatalities, and road accidents are one of the leading causes of death and injury in the country. The most common fatal accident is between a motorcycle and a passenger car. Most of the fatal accidents happened on Federal roads, with 44 fatal accidents reported, equal to 29%. A lack of road geometric design...

  3. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that interface consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.

  4. Architectural design of the pelvic floor is consistent with muscle functional subspecialization.

    Science.gov (United States)

    Tuttle, Lori J; Nguyen, Olivia T; Cook, Mark S; Alperin, Marianna; Shah, Sameer B; Ward, Samuel R; Lieber, Richard L

    2014-02-01

    Skeletal muscle architecture is the strongest predictor of a muscle's functional capacity. The purpose of this study was to define the architectural properties of the deep muscles of the female pelvic floor (PFMs) to elucidate their structure-function relationships. The PFMs coccygeus (C), iliococcygeus (IC), and pubovisceral (PV) were harvested en bloc from ten fixed human cadavers (mean age 85 years, range 55-102). Fundamental architectural parameters of skeletal muscles [physiological cross-sectional area (PCSA), normalized fiber length, and sarcomere length (Ls)] were determined using validated methods. PCSA predicts muscle-force production, and normalized fiber length is related to muscle excursion. These parameters were compared using repeated-measures analysis of variance (ANOVA) with post hoc t tests, as appropriate. Significance was set to α = 0.05. PFMs were thinner than expected based on data reported from imaging studies and in vivo palpation. Significant differences in fiber length were observed across PFMs: C = 5.29 ± 0.32 cm, IC = 7.55 ± 0.46 cm, PV = 10.45 ± 0.67 cm (p < 0.05). The architectural design shows individual muscles demonstrating differential architecture, corresponding to specialized function in the pelvic floor.

  5. A Cost-Effective Two-Part Experiment for Teaching Introductory Organic Chemistry Techniques

    Science.gov (United States)

    Sadek, Christopher M.; Brown, Brenna A.; Wan, Hayley

    2011-01-01

    This two-part laboratory experiment is designed to be a cost-effective method for teaching basic organic laboratory techniques (recrystallization, thin-layer chromatography, column chromatography, vacuum filtration, and melting point determination) to large classes of introductory organic chemistry students. Students are exposed to different…

  6. On the electrical two-part tariff—The Brazilian perspective

    International Nuclear Information System (INIS)

    Steele Santos, Paulo E.; Coradi Leme, Rafael; Galvão, Leandro

    2012-01-01

    This paper discusses, in the context of Brazil's situation, the use of a nonlinear pricing approach in the application of a two-part tariff to electricity distribution networks. Charging for both access and usage serves to optimize energy systems that are based on a mix of generation technologies. Such a pricing approach is used in Brazil, where the generation mix is mainly hydro-generation. This study shows that, in a case like Brazil's, a two-part tariff may be used as a tool for network optimization. The paper presents a design for a two-part tariff for a distribution system with varying consumer behavior, and a numerical example is offered to validate the discussion. Finally, remarks are given concerning pricing of access and usage for low-voltage consumers.
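The basic structure of a two-part tariff is simple to sketch: a fixed access (network) charge plus a volumetric usage charge. The numbers below are illustrative only, not Brazil's regulated values.

```python
def bill(kwh, access_charge=20.0, usage_rate=0.45):
    """Monthly bill = fixed access (network) charge + per-kWh usage charge."""
    return access_charge + usage_rate * kwh

# Average price per kWh falls as consumption rises: the nonlinear-pricing effect
for consumption in (50, 200, 500):
    total = bill(consumption)
    print(f"{consumption:4d} kWh -> {total:7.2f} -> avg {total / consumption:.3f}/kWh")
```

The declining average price is what makes the split between the access and usage components a policy lever, particularly for low-consumption (typically low-voltage) consumers.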

  7. Seismic behavior and design of a primary shield structure consisting of steel-plate composite (SC) walls

    Energy Technology Data Exchange (ETDEWEB)

    Booth, Peter N., E-mail: boothpn@purdue.edu [Lyles School of Civil Engineering, Purdue University, W. Lafayette, IN (United States); Varma, Amit H., E-mail: ahvarma@purdue.edu [Lyles School of Civil Engineering, Purdue University, W. Lafayette, IN (United States); Sener, Kadir C., E-mail: ksener@purdue.edu [Lyles School of Civil Engineering, Purdue University, W. Lafayette, IN (United States); Mori, Kentaro, E-mail: kentaro_mori@mhi.co.jp [Mitsubishi Heavy Industries, Ltd, Kobe (Japan)

    2015-12-15

    This paper presents an analytical evaluation of the seismic behavior and design of a unique primary shield (PSW) structure consisting of steel-plate composite (SC) walls designed for a typical pressurized water reactor (PWR) nuclear power plant. Researchers in Japan have previously conducted a reduced (1/6th) scale test of a PSW structure to evaluate its seismic (lateral) load-deformation behavior. This paper presents the development and benchmarking of a detailed 3D nonlinear inelastic finite element (NIFE) model to predict the lateral load-deformation response and behavior of the 1/6th scale test structure. The PSW structure consists of thick SC wall segments with complex and irregular geometry that surround the central reactor vessel cavity. The wall segments have three layers of steel plates (one each on the interior and exterior surfaces and one embedded in the middle) that are anchored to the concrete infill with stud anchors. The results from the 3D NIFE analyses include: (i) the lateral load-deformation behavior of the PSW structure, (ii) the progression of yielding in the steel plates, concrete cracking, formation of compression struts, and (iii) the final failure mode. These results are compared and benchmarked using experimental measurements and observations reported by Shodo et al. (2003). The analytical results provide significant insight into the lateral behavior and strength of the PSW structure, and are used for developing a design approach. This design approach starts with ACI 349 code equations for reinforced concrete shear walls and modifies them for application to the PSW structure. A simplified 3D linear elastic finite element (LEFE) model of the PSW structure is also proposed as a conventional structural analysis tool for estimating the design force demands for various load combinations.

  8. Seismic behavior and design of a primary shield structure consisting of steel-plate composite (SC) walls

    International Nuclear Information System (INIS)

    Booth, Peter N.; Varma, Amit H.; Sener, Kadir C.; Mori, Kentaro

    2015-01-01

    This paper presents an analytical evaluation of the seismic behavior and design of a unique primary shield wall (PSW) structure consisting of steel-plate composite (SC) walls designed for a typical pressurized water reactor (PWR) nuclear power plant. Researchers in Japan have previously conducted a reduced (1/6th) scale test of a PSW structure to evaluate its seismic (lateral) load-deformation behavior. This paper presents the development and benchmarking of a detailed 3D nonlinear inelastic finite element (NIFE) model to predict the lateral load-deformation response and behavior of the 1/6th scale test structure. The PSW structure consists of thick SC wall segments with complex and irregular geometry that surround the central reactor vessel cavity. The wall segments have three layers of steel plates (one each on the interior and exterior surfaces and one embedded in the middle) that are anchored to the concrete infill with stud anchors. The results from the 3D NIFE analyses include: (i) the lateral load-deformation behavior of the PSW structure, (ii) the progression of yielding in the steel plates, concrete cracking, and formation of compression struts, and (iii) the final failure mode. These results are compared and benchmarked using experimental measurements and observations reported by Shodo et al. (2003). The analytical results provide significant insight into the lateral behavior and strength of the PSW structure, and are used for developing a design approach. This design approach starts with ACI 349 code equations for reinforced concrete shear walls and modifies them for application to the PSW structure. A simplified 3D linear elastic finite element (LEFE) model of the PSW structure is also proposed as a conventional structural analysis tool for estimating the design force demands for various load combinations.

  9. Permanent-magnet motor with two-part rotor for wide speed range

    International Nuclear Information System (INIS)

    Baines, G.D.; Chalmers, B.J.; Akmese, R.

    1998-01-01

    The paper describes a synchronous motor with a two-part rotor comprising a surface-magnet part and a reluctance part mounted adjacent to each other on the same axis. Machine parameters and physical design details are selected in order to obtain constant-power characteristics over a 3:1 speed range by field-weakening. Test results demonstrate the achievement of the desired characteristics, in good agreement with computed predictions. (orig.)
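    The constant-power characteristic obtained by field weakening can be sketched numerically. The ratings below are hypothetical placeholders, not the machine parameters from the paper: below base speed the torque limit is constant, and above it torque falls as P/ω, keeping output power flat over the 3:1 speed range.

    ```python
    import numpy as np

    # Hypothetical drive ratings (illustrative only)
    P_rated = 10e3               # W, constant-power level
    w_base = 150.0               # rad/s, base speed
    T_rated = P_rated / w_base   # constant-torque limit below base speed

    def torque_limit(w):
        """Torque envelope: constant torque below base speed,
        constant power (T = P / w) in the field-weakening region."""
        return np.where(w < w_base, T_rated, P_rated / w)

    speeds = np.array([75.0, 150.0, 300.0, 450.0])  # up to 3x base speed
    print(torque_limit(speeds) * speeds)            # output power at each speed
    ```

    From base speed up to three times base speed the product T·ω stays at the rated power, which is the constant-power behavior the two-part rotor is designed to deliver.
    
    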

  10. Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design

    Energy Technology Data Exchange (ETDEWEB)

    Das, Sanjoy Kumar, E-mail: sanjoydasju@gmail.com; Khanam, Jasmina; Nanda, Arunabha

    2016-12-01

    In the present investigation, a simplex lattice mixture design was applied for the formulation development and optimization of a controlled-release dosage form of ketoprofen microspheres consisting of polymeric blends of ethylcellulose and Eudragit® RL 100, formed by an oil-in-oil emulsion solvent evaporation method. The investigation was carried out to observe the effects of polymer amount, stirring speed and emulsifier concentration (% w/w) on percentage yield, average particle size, drug entrapment efficiency and in vitro drug release in 8 h from the microspheres. Analysis of variance (ANOVA) was used to estimate the significance of the models. Numerical optimization was carried out based on the desirability function approach. The optimized formulation (KTF-O) showed a close match between actual and predicted responses, with desirability factor 0.811. No adverse reaction between drug and polymers was observed on the basis of Fourier transform infrared (FTIR) spectroscopy and differential scanning calorimetry (DSC) analysis. Scanning electron microscopy (SEM) was carried out to show the discreteness of the microspheres (149.2 ± 1.25 μm) and their surface conditions during pre- and post-dissolution operations. The drug release pattern from KTF-O was best explained by the Korsmeyer-Peppas and Higuchi models. The batch of optimized microspheres was found to have maximum entrapment (~ 90%), minimum loss (~ 10%) and prolonged drug release for 8 h (91.25%), which may be considered favourable criteria for a controlled-release dosage form. - Graphical abstract: Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design. - Highlights: • Simplex lattice design was used to optimize ketoprofen-loaded microspheres. • Polymeric blend (ethylcellulose and Eudragit® RL 100) was used. • Microspheres were prepared by an oil-in-oil emulsion solvent evaporation method. • Optimized formulation depicted favourable

  11. Verification Process of Behavioral Consistency between Design and Implementation programs of pSET using HW-CBMC

    International Nuclear Information System (INIS)

    Lee, Dong Ah; Lee, Jong Hoon; Yoo, Jun Beom

    2011-01-01

    Controllers in safety critical systems such as nuclear power plants often use Function Block Diagrams (FBDs) to design embedded software. The design is implemented in a programming language such as C and compiled for the particular target hardware. The implementation must have the same behavior as the design, and this behavior should be verified explicitly. For example, the pSET (POSAFE-Q Software Engineering Tool) is loader software for programming the POSAFE-Q PLC (Programmable Logic Controller) and was developed as part of the KNICS (Korea Nuclear Instrumentation and Control System R and D Center) project. It uses FBDs to design the PLC software and generates ANSI-C code, which is compiled into machine code. To verify the equivalence between the FBDs and the ANSI-C code, a mathematical proof of the code generator or a verification tool such as RETRANS can help guarantee the equivalence. Mathematical proof, however, has the weakness of requiring high expenditure and repeated effort whenever the translator is modified. On the other hand, RETRANS reconstructs the generated source code without consideration of the generator; it has the weakness that the reconstruction of the generated code needs additional analysis. This paper introduces a verification process for behavioral consistency between the design and its implementation in the pSET using HW-CBMC. HW-CBMC is a formal verification tool that verifies equivalence between hardware and software descriptions. It requires two inputs for checking equivalence: Verilog for hardware and ANSI-C for software. In this approach, FBDs are translated into a semantically equivalent Verilog program, and HW-CBMC verifies equivalence between the Verilog program and the ANSI-C program generated from the FBDs.
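    The core idea of the check, that a design and its generated implementation must agree on every (bounded) input, can be sketched in miniature. The following toy Python example is not HW-CBMC itself; it exhaustively enumerates the Boolean inputs of a made-up two-input function block and compares a "design" against a hand-written "implementation":

    ```python
    from itertools import product

    # Toy "design": a 2-input NAND function block, and a hand-written
    # "implementation" of the same logic (hypothetical example).
    def design_block(a: bool, b: bool) -> bool:
        return not (a and b)          # NAND, as specified in the design

    def implementation_block(a: bool, b: bool) -> bool:
        return (not a) or (not b)     # De Morgan equivalent of the design

    def check_equivalence(f, g, n_inputs: int) -> bool:
        """Exhaustively compare f and g over all Boolean input vectors."""
        return all(f(*bits) == g(*bits)
                   for bits in product([False, True], repeat=n_inputs))

    print(check_equivalence(design_block, implementation_block, 2))  # True
    ```

    HW-CBMC does this symbolically (via SAT/SMT) for Verilog and ANSI-C descriptions rather than by enumeration, but the notion of equivalence being checked is the same.
    
    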

  12. Verification Process of Behavioral Consistency between Design and Implementation programs of pSET using HW-CBMC

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Ah; Lee, Jong Hoon; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2011-05-15

    Controllers in safety critical systems such as nuclear power plants often use Function Block Diagrams (FBDs) to design embedded software. The design is implemented in a programming language such as C and compiled for the particular target hardware. The implementation must have the same behavior as the design, and this behavior should be verified explicitly. For example, the pSET (POSAFE-Q Software Engineering Tool) is loader software for programming the POSAFE-Q PLC (Programmable Logic Controller) and was developed as part of the KNICS (Korea Nuclear Instrumentation and Control System R and D Center) project. It uses FBDs to design the PLC software and generates ANSI-C code, which is compiled into machine code. To verify the equivalence between the FBDs and the ANSI-C code, a mathematical proof of the code generator or a verification tool such as RETRANS can help guarantee the equivalence. Mathematical proof, however, has the weakness of requiring high expenditure and repeated effort whenever the translator is modified. On the other hand, RETRANS reconstructs the generated source code without consideration of the generator; it has the weakness that the reconstruction of the generated code needs additional analysis. This paper introduces a verification process for behavioral consistency between the design and its implementation in the pSET using HW-CBMC. HW-CBMC is a formal verification tool that verifies equivalence between hardware and software descriptions. It requires two inputs for checking equivalence: Verilog for hardware and ANSI-C for software. In this approach, FBDs are translated into a semantically equivalent Verilog program, and HW-CBMC verifies equivalence between the Verilog program and the ANSI-C program generated from the FBDs.

  13. Progress towards developing consistent design and evaluation guidelines for US Department of Energy facilities subjected to natural phenomena

    International Nuclear Information System (INIS)

    Murray, R.C.

    1987-01-01

    Probabilistic definitions of earthquake, wind, and tornado hazards for many DOE facilities throughout the United States have been developed. In addition, definitions of the flood hazards which might affect these locations are currently being developed. We have prepared a document to provide guidance and criteria for DOE facility managers to assure that DOE facilities are adequately constructed to resist the effects of natural phenomena such as earthquake, strong wind, and flood. The intent of this document is to provide instruction on how to utilize the hazard definitions to evaluate existing facilities and design new facilities in a manner such that the risk of adverse consequences is consistent with the cost, function, and danger to the public or environment. A conference and six mini-courses were organized on natural phenomena hazards mitigation. This provided a mechanism for technology transfer to the DOE community. Complementary manuals have also been developed for 1) suspended ceiling systems and recommendations for bracing them, 2) practical equipment seismic upgrade and strengthening guidelines, and 3) suggested structural details for wind design. These manuals are intended to provide input and guidance for ongoing site safety programs. (orig./HP)

  14. Progress towards developing consistent design and evaluation guidelines for US Department of Energy facilities subjected to natural phenomena

    International Nuclear Information System (INIS)

    Murray, R.C.

    1987-01-01

    Probabilistic definitions of earthquake, wind, and tornado hazards for many Department of Energy (DOE) facilities throughout the United States have been developed. In addition, definitions of the flood hazards which might affect these locations are currently being developed. The authors have prepared a document to provide guidance and criteria for DOE facility managers to assure that DOE facilities are adequately constructed to resist the effects of natural phenomena such as earthquake, strong wind, and flood. The intent of this document is to provide instruction on how to utilize the hazard definitions to evaluate existing facilities and design new facilities in a manner such that the risk of adverse consequences is consistent with the cost, function, and danger to the public or environment. A conference and six mini-courses were organized on natural phenomena hazards mitigation. This provided a mechanism for technology transfer to the DOE community. Complementary manuals have also been developed for 1) suspended ceiling systems and recommendations for bracing them, 2) practical equipment seismic upgrade and strengthening guidelines, and 3) suggested structural details for wind design. These manuals are intended to provide input and guidance for ongoing site safety programs

  15. Much ado about two: reconsidering retransformation and the two-part model in health econometrics.

    Science.gov (United States)

    Mullahy, J

    1998-06-01

    In health economics applications involving outcomes (y) and covariates (x), it is often the case that the central inferential problems of interest involve E[y|x] and its associated partial effects or elasticities. Many such outcomes have two fundamental statistical properties: y ≥ 0; and the outcome y = 0 is observed with sufficient frequency that the zeros cannot be ignored econometrically. This paper (1) describes circumstances where the standard two-part model with homoskedastic retransformation will fail to provide consistent inferences about important policy parameters; and (2) demonstrates some alternative approaches that are likely to prove helpful in applications.
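    The standard two-part model that the paper critiques can be sketched as follows. This is a minimal simulated illustration, not the paper's estimator: part one is a logistic regression for P(y > 0 | x), part two is OLS of log y on x for the positive observations, and the two are recombined under the homoskedastic normal retransformation E[y|x] = P(y > 0 | x) · exp(x'β + σ²/2). All coefficients and the data-generating process are assumptions chosen for the sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])

    # Simulated data: occurrence (part 1) and lognormal intensity (part 2)
    p = 1 / (1 + np.exp(-(0.5 + 1.0 * x)))
    d = rng.random(n) < p                       # indicator of y > 0
    y = np.where(d, np.exp(1.0 + 0.8 * x + rng.normal(0, 0.5, n)), 0.0)

    # --- Part 1: logistic regression fitted by Newton-Raphson (IRLS) ---
    beta1 = np.zeros(2)
    for _ in range(25):
        mu = 1 / (1 + np.exp(-(X @ beta1)))
        W = mu * (1 - mu)
        beta1 += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (d - mu))

    # --- Part 2: OLS of log(y) on x for the positive observations ---
    Xp, logyp = X[d], np.log(y[d])
    beta2, *_ = np.linalg.lstsq(Xp, logyp, rcond=None)
    resid = logyp - Xp @ beta2
    sigma2 = resid @ resid / (len(logyp) - 2)

    # Homoskedastic normal retransformation:
    # E[y | x] = P(y > 0 | x) * exp(x'beta2 + sigma^2 / 2)
    def e_y(x0):
        xr = np.array([1.0, x0])
        phat = 1 / (1 + np.exp(-(xr @ beta1)))
        return phat * np.exp(xr @ beta2 + sigma2 / 2)

    true_ey0 = (1 / (1 + np.exp(-0.5))) * np.exp(1.0 + 0.25 / 2)
    print(round(e_y(0.0), 2), round(true_ey0, 2))
    ```

    Because the simulated errors really are homoskedastic and lognormal, the retransformation is consistent here; Mullahy's point is that when the error variance depends on x, this same exp(σ²/2) correction produces inconsistent estimates of E[y|x].
    
    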

  16. An Analytic Creativity Assessment Scale for Digital Game Story Design: Construct Validity, Internal Consistency and Interrater Reliability

    Science.gov (United States)

    Chuang, Tsung-Yen; Huang, Yun-Hsuan

    2015-01-01

    Mobile technology has rapidly made digital games a popular entertainment to this digital generation, and thus digital game design received considerable attention in both the game industry and design education. Digital game design involves diverse dimensions in which digital game story design (DGSD) particularly attracts our interest, as the…

  17. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Science.gov (United States)

    Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman

    2012-01-01

    Lidar height data collected by the Geosciences Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...

  18. Two-part payments for the reimbursement of investments in health technologies.

    Science.gov (United States)

    Levaggi, Rosella; Moretto, Michele; Pertile, Paolo

    2014-04-01

    The paper studies the impact of alternative reimbursement systems on two provider decisions: whether to adopt a technology whose provision requires a sunk investment cost and how many patients to treat with it. Using a simple economic model we show that the optimal pricing policy involves a two-part payment: a price equal to the marginal cost of the patient whose benefit of treatment equals the cost of provision, and a separate payment for the partial reimbursement of capital costs. Departures from this scheme, which are frequent in DRG tariff systems designed around the world, lead to a trade-off between the objective of making effective technologies available to patients and the need to ensure appropriateness in use. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...

  20. Two-Part Models for Fractional Responses Defined as Ratios of Integers

    Directory of Open Access Journals (Sweden)

    Harald Oberhofer

    2014-09-01

    This paper discusses two alternative two-part models for fractional response variables that are defined as ratios of integers. The first two-part model assumes a Binomial distribution and known group size. It nests the one-part fractional response model proposed by Papke and Wooldridge (1996) and thus allows one to apply Wald, LM and/or LR tests in order to discriminate between the two models. The second model extends the first one by allowing for overdispersion in the data. We demonstrate the usefulness of the proposed two-part models for data on the 401(k) pension plan participation rates used in Papke and Wooldridge (1996).
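    A stripped-down version of the first of these two-part models can be sketched as follows. This toy uses a constant participation probability and no covariates, unlike the paper: part one estimates the probability of any participation at all, and part two fits a zero-truncated Binomial to the positive counts given the known group sizes. All simulation parameters are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import binom

    rng = np.random.default_rng(1)

    # Simulate participation counts s_i out of known group sizes n_i,
    # with extra zeros from a separate "no participation" process.
    n_i = rng.integers(5, 50, size=400)
    any_part = rng.random(400) < 0.7           # part 1: any participation?
    s_i = np.where(any_part, rng.binomial(n_i, 0.4), 0)

    pos = s_i > 0
    n_pos, s_pos = n_i[pos], s_i[pos]

    # Part 2: MLE of p under a zero-truncated Binomial for the positives.
    def neg_loglik(p):
        pmf = binom.pmf(s_pos, n_pos, p)
        trunc = 1 - binom.pmf(0, n_pos, p)     # renormalise away the zero class
        return -np.sum(np.log(pmf / trunc))

    p_hat = minimize_scalar(neg_loglik, bounds=(0.01, 0.99),
                            method="bounded").x
    pi_hat = pos.mean()                        # part 1: share with any participation
    print(round(pi_hat, 2), round(p_hat, 3))
    ```

    The paper's models additionally condition both parts on covariates and allow for overdispersion (the second model), but the likelihood decomposition into a zero part and a truncated positive part is the same.
    
    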

  1. 'Cut in two', Part 1: Exposing the Seam in Q 12:42−46

    African Journals Online (AJOL)

    2015-06-22


  2. Bayesian inference for two-part mixed-effects model using skew distributions, with application to longitudinal semicontinuous alcohol data.

    Science.gov (United States)

    Xing, Dongyuan; Huang, Yangxin; Chen, Henian; Zhu, Yiliang; Dagne, Getachew A; Baldwin, Julie

    2017-08-01

    Semicontinuous data featured with an excessive proportion of zeros and right-skewed continuous positive values arise frequently in practice. One example would be substance abuse/dependence symptoms data, for which a substantial proportion of subjects investigated may report zero. Two-part mixed-effects models have been developed to analyze repeated measures of semicontinuous data from longitudinal studies. In this paper, we propose a flexible two-part mixed-effects model with skew distributions for correlated semicontinuous alcohol data under the framework of a Bayesian approach. The proposed model specification consists of two mixed-effects models linked by the correlated random effects: (i) a model on the occurrence of positive values using a generalized logistic mixed-effects model (Part I); and (ii) a model on the intensity of positive values using a linear mixed-effects model where the model errors follow skew distributions including skew-t and skew-normal distributions (Part II). The proposed method is illustrated with alcohol abuse/dependence symptoms data from a longitudinal observational study, and the analytic results are reported by comparing potential models under different random-effects structures. Simulation studies are conducted to assess the performance of the proposed models and method.

  3. A basket two-part model to analyze medical expenditure on interdependent multiple sectors.

    Science.gov (United States)

    Sugawara, Shinya; Wu, Tianyi; Yamanishi, Kenji

    2018-05-01

    This study proposes a novel statistical methodology to analyze expenditure on multiple medical sectors using consumer data. Conventionally, medical expenditure has been analyzed by two-part models, which separately consider the purchase decision and the amount of expenditure. We extend the traditional two-part models by adding a step of basket analysis for dimension reduction. This new step enables us to analyze complicated interdependence between multiple sectors without an identification problem. As an empirical application of the proposed method, we analyze data on 13 medical sectors from the Medical Expenditure Panel Survey. In comparison with the results of previous studies that analyzed the multiple sectors independently, our method provides more detailed implications of the impacts of individual socioeconomic status on the composition of joint purchases from multiple medical sectors; our method also has a better prediction performance.

  4. Subway Mandibular Buccal Defect Blocked with Two Part Prosthesis Unified by Earth Magnets

    OpenAIRE

    Punjani, Shikha; Arora, Aman; Upadhyaya, Viram

    2012-01-01

    This clinical report describes the fabrication of a two-piece obturator used to close a mandibular buccal defect. The two-piece obturator prosthesis was fabricated with clear heat-cure acrylic resin to be used during the healing period following the marsupialization of an odontogenic keratocyst which had led to the loss of portions of the mandibular buccal region. The prosthesis, fabricated in two parts, was joined by rare earth magnets. Retention was increased by lining the prosthesis with tis...

  5. Design and validation of a consistent and reproducible manufacture process for the production of clinical-grade bone marrow-derived multipotent mesenchymal stromal cells.

    Science.gov (United States)

    Codinach, Margarita; Blanco, Margarita; Ortega, Isabel; Lloret, Mireia; Reales, Laura; Coca, Maria Isabel; Torrents, Sílvia; Doral, Manel; Oliver-Vila, Irene; Requena-Montero, Miriam; Vives, Joaquim; Garcia-López, Joan

    2016-09-01

    Multipotent mesenchymal stromal cells (MSC) have achieved a notable prominence in the field of regenerative medicine, despite the lack of common standards in the production processes and suitable quality controls compatible with Good Manufacturing Practice (GMP). Herein we describe the design of a bioprocess for bone marrow (BM)-derived MSC isolation and expansion, its validation and production of 48 consecutive batches for clinical use. BM samples were collected from the iliac crest of patients for autologous therapy. Manufacturing procedures included: (i) isolation of nucleated cells (NC) by automated density-gradient centrifugation and plating; (ii) trypsinization and expansion of secondary cultures; and (iii) harvest and formulation of a suspension containing 40 ± 10 × 10^6 viable cells. Quality controls were defined as: (i) cell count and viability assessment; (ii) immunophenotype; and (iii) sterility tests, Mycoplasma detection, endotoxin test and Gram staining. A 3-week manufacturing bioprocess was first designed and then validated in 3 consecutive mock productions, prior to producing 48 batches of BM-MSC for clinical use. Validation included the assessment of MSC identity and genetic stability. Regarding production, 139.0 ± 17.8 mL of BM containing 2.53 ± 0.92 × 10^9 viable NC were used as starting material, yielding 38.8 ± 5.3 × 10^6 viable cells in the final product. Surface antigen expression was consistent with the expected phenotype for MSC, displaying high levels of CD73, CD90 and CD105, lack of expression of CD31 and CD45 and low levels of HLA-DR. Tests for sterility, Mycoplasma, Gram staining and endotoxin had negative results in all cases. Herein we demonstrated the establishment of a feasible, consistent and reproducible bioprocess for the production of safe BM-derived MSC for clinical use. Copyright © 2016 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  6. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  7. Two-part pricing structure in long-term gas sales contracts

    International Nuclear Information System (INIS)

    Slocum, J.C.; Lee, S.Y.

    1992-01-01

    Although the incremental electricity generation market has the potential to be a major growth area for natural gas demand in the U.S., it may never live up to such promise unless gas suppliers are more willing to enter into long-term gas sales agreements necessary to nurture this segment of the industry. The authors submit that producer reluctance to enter into such long-term sales agreements can be traced, at least in part to the differing contract price requirements between gas producers and buyers. This paper will address an evolving solution to this contracting dilemma - the development of a two-part pricing structure for the gas commodity. A two-part pricing structure includes a usage or throughput charge established in a way to yield a marginal gas cost competitive with electric utility avoided costs, and a reservation charge established to guarantee a minimum cash flow to the producer. Moreover, the combined effect of the two charges may yield total revenues that better reflect the producer's replacement cost of the reserves committed under the contract. 2 tabs
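    The revenue arithmetic behind such a two-part structure is simple and can be sketched with hypothetical contract figures (the charges below are illustrative, not taken from the paper): the reservation charge guarantees a cash-flow floor for the producer, while the usage charge sets the buyer's marginal cost of gas.

    ```python
    # Hypothetical two-part gas contract (illustrative numbers)
    reservation_charge = 1_500_000      # $/year, paid regardless of offtake
    usage_charge = 1.80                 # $/MMBtu, set near the buyer's avoided cost

    def annual_revenue(throughput_mmbtu: float) -> float:
        """Producer revenue under a two-part price: fixed + volumetric."""
        return reservation_charge + usage_charge * throughput_mmbtu

    for q in (0, 2_000_000, 5_000_000):
        avg = annual_revenue(q) / q if q else float("inf")
        print(q, annual_revenue(q), round(avg, 2))
    ```

    The buyer's marginal cost stays at the usage charge while the average unit cost falls as throughput rises; this is what lets the marginal price track electric utility avoided costs while the reservation charge still guarantees the producer a minimum cash flow.
    
    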

  8. Joint two-part Tobit models for longitudinal and time-to-event data.

    Science.gov (United States)

    Dagne, Getachew A

    2017-11-20

    In this article, we show how Tobit models can address problems of identifying characteristics of subjects having left-censored outcomes in the context of developing a method for jointly analyzing time-to-event and longitudinal data. There are some methods for handling these types of data separately, but they may not be appropriate when time to event is dependent on the longitudinal outcome, and a substantial portion of values are reported to be below the limits of detection. An alternative approach is to develop a joint model for the time-to-event outcome and a two-part longitudinal outcome, linking them through random effects. This proposed approach is implemented to assess the association between the risk of decline of CD4/CD8 ratio and rates of change in viral load, along with discriminating between patients who are potentially progressors to AIDS from patients who do not. We develop a fully Bayesian approach for fitting joint two-part Tobit models and illustrate the proposed methods on simulated and real data from an AIDS clinical study. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Development of the two-part pattern during regeneration of the head in hydra

    DEFF Research Database (Denmark)

    Bode, Matthias; Awad, T A; Koizumi, O

    1988-01-01

    The head of a hydra is composed of two parts, a domed hypostome with a mouth at the top and a ring of tentacles below. When animals are decapitated a new head regenerates. During the process of regeneration the apical tip passes through a transient stage in which it exhibits tentacle-like characteristics before becoming a hypostome. This was determined from markers which appeared before morphogenesis took place. The first was a monoclonal antibody, TS-19, that specifically binds to the ectodermal epithelial cells of the tentacles. The second was an antiserum against the peptide Arg-Phe-amide (RFamide), which in the head of hydra is specific to the sensory cells of the hypostomal apex and the ganglion cells of the lower hypostome and tentacles. The TS-19 expression and the ganglion cells with RFamide-like immunoreactivity (RLI) arose first at the apex and spread radially. Once the tentacles...

  10. Multivariate two-part statistics for analysis of correlated mass spectrometry data from multiple biological specimens.

    Science.gov (United States)

    Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi

    2017-01-01

    High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data and imputation can impact between-biospecimen correlation and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures were more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, but univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information. Contact: sltaylor@ucdavis.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
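    The idea of a two-part statistic evaluated against a permutation null can be sketched for a single compound in one biospecimen (a deliberate simplification of the paper's multivariate, multi-biospecimen procedure; all simulated values and the exact form of the statistic are assumptions for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def two_part_stat(x, g):
        """Two-part statistic: group difference in missingness plus group
        difference in observed means, combined as a sum of squared z-scores."""
        miss = np.isnan(x)
        parts = []
        # Part 1: proportion of missing (non-detected) values between groups
        p1, p0 = miss[g].mean(), miss[~g].mean()
        p = miss.mean()
        se = np.sqrt(p * (1 - p) * (1 / g.sum() + 1 / (~g).sum()))
        if se > 0:
            parts.append(((p1 - p0) / se) ** 2)
        # Part 2: observed (non-missing) values between groups (Welch-type z)
        a, b = x[g & ~miss], x[~g & ~miss]
        se2 = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        parts.append(((a.mean() - b.mean()) / se2) ** 2)
        return sum(parts)

    # Simulated compound: group 1 shifted upward and more often missing
    g = np.repeat([False, True], 40)
    x = rng.normal(0, 1, 80)
    x[g] += 1.0
    x[rng.random(80) < np.where(g, 0.3, 0.1)] = np.nan

    obs = two_part_stat(x, g)
    perm = np.array([two_part_stat(x, rng.permutation(g)) for _ in range(999)])
    p_value = (1 + np.sum(perm >= obs)) / 1000
    print(p_value < 0.05)
    ```

    Permuting group labels preserves the joint missingness/intensity structure of the data, which is why the paper uses a permutation null rather than a parametric reference distribution.
    
    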

  11. Design and performance of a passive dosimeter consisting of a stack of photostimulable phosphor (PSP) imaging plates (IP) based on BaF(Br, I):Eu2+

    International Nuclear Information System (INIS)

    Jean-Paul Negre

    2011-07-01

    This manuscript presents an original concept for a passive, lightweight, and compact dosimeter based on a stack of BaFBr:Eu2+ photostimulable phosphor plates (image plates, IP) alternating with high-Z metal screens. It describes the manufacture of the dosimeter and the method used to calibrate it. This method consists in using a standard Co-60 source and the Monte Carlo N-Particle codes (MCNPX/MCNP5) to extend the calibration over a wide range of radiation energies, such as quasi-monoenergetic radiation (gamma rays from radioactive isotope decay, X-ray fluorescence) or continuous radiation spectra (bremsstrahlung radiation, synchrotron light sources). The repetition of measurements through the stack of 'metallic foil / IP' pairs ensures measurement consistency, determines the depth at which electronic equilibrium is reached (depending on the radiation energy), and allows the absolute dose in air (kerma) to be inferred. The depth-dose curve in the stack and transmission measurements provide an estimate of the effective energy of the incident radiation, reveal the presence of parasitic scattered radiation, and allow the nature of the ionizing particles to be discriminated. The 2D features of the device are used to characterize the ballistic behavior of charged particles through material thicknesses, which is of great interest with narrow particle beams. This dosimeter has remarkable advantages over other passive dosimeters, including a dynamic range larger than 10^7, with linear photon dose detection from less than 0.5 μGy up to several Gy for radiation energies between a few tens of keV and more than 10 MeV (20 MeV with bremsstrahlung X-ray spectra). The originality of this concept lies in obtaining the measurement results almost immediately after an exposure, with a single-pass reading of the dosimeter. It can meet most of the usual needs in radiation metrology: personal or environmental dosimetry (radiation protection); controls/characterization/mapping around materials and devices emitting ionizing radiation.
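    The effective-energy estimate from transmission through the stack can be sketched under a narrow-beam exponential attenuation assumption, I = I0·exp(-μx). The attenuation table below is a hypothetical stand-in for real material data (which would come from, e.g., NIST XCOM tables for the actual screen metal), and the function name is illustrative:

    ```python
    import numpy as np

    # Hypothetical attenuation data for the metal screens:
    # photon energy (MeV) vs linear attenuation coefficient mu (1/cm).
    energies = np.array([0.1, 0.3, 0.5, 1.0, 2.0, 5.0])       # MeV
    mu_table = np.array([4.5, 1.1, 0.75, 0.55, 0.42, 0.35])   # 1/cm (illustrative)

    def effective_energy(dose_front, dose_back, screen_cm):
        """Infer an effective photon energy from the dose ratio across a screen,
        assuming narrow-beam exponential attenuation I = I0 * exp(-mu * x)."""
        mu = np.log(dose_front / dose_back) / screen_cm
        # mu decreases with energy, so interpolate on the reversed table
        return float(np.interp(mu, mu_table[::-1], energies[::-1]))

    # Example: successive plates read 1.00 and 0.90 (relative dose)
    # across a 0.1 cm screen
    print(round(effective_energy(1.00, 0.90, 0.1), 2))
    ```

    In practice the manuscript's method fits the whole depth-dose curve through many foil/IP pairs rather than a single ratio, which also exposes scattered-radiation contamination that a single-ratio estimate would miss.
    
    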

  12. Impact of coil design on the contrast-to-noise ratio, precision, and consistency of quantitative cartilage morphometry at 3 Tesla: a pilot study for the osteoarthritis initiative.

    Science.gov (United States)

    Eckstein, Felix; Kunz, Manuela; Hudelmaier, Martin; Jackson, Rebecca; Yu, Joseph; Eaton, Charles B; Schneider, Erika

    2007-02-01

    Phased-array (PA) coils generally provide higher signal-to-noise ratios (SNRs) than quadrature knee coils. In this pilot study for the Osteoarthritis Initiative (OAI) we compared these two types of coils in terms of contrast-to-noise ratio (CNR), precision, and consistency of quantitative femorotibial cartilage measurements. Test-retest measurements were acquired using coronal fast low-angle shot with water excitation (FLASHwe) and coronal multiplanar reconstruction (MPR) of sagittal double-echo steady state with water excitation (DESSwe) at 3T. The precision errors for cartilage volume and thickness were coil and coil with FLASHwe, and coil and sequence. The PA coil measurements did not always fully agree with the quadrature coil measurements, and some differences were significant. The higher CNR of the PA coil did not translate directly into improved precision of cartilage measurement; however, summing up cartilage plates within the medial and lateral compartment reduced precision errors. Copyright (c) 2007 Wiley-Liss, Inc.

  13. Apparently-Different Clearance Rates from Cohort Studies of Mycoplasma genitalium Are Consistent after Accounting for Incidence of Infection, Recurrent Infection, and Study Design.

    Directory of Open Access Journals (Sweden)

    Timo Smieszek

    Full Text Available Mycoplasma genitalium is a potentially major cause of urethritis, cervicitis, pelvic inflammatory disease, infertility, and increased HIV risk. A better understanding of its natural history is crucial to informing control policy. Two extensive cohort studies (students in London, UK; Ugandan sex workers) suggest very different clearance rates; we aimed to understand the reasons and obtain improved estimates by making maximal use of the data from both studies. As M. genitalium is a sexually transmitted infection, we developed a model for time-to-event analysis that incorporates the processes of (re)infection and clearance, and fitted it to data from the two cohort studies to estimate incidence and clearance rates under different scenarios of sexual partnership dynamics and study design (including sample handling and associated test sensitivity). In the London students, the estimated clearance rate is 0.80 p.a. (mean duration 15 months), with incidence 1.31%-3.93% p.a. Without adjusting for study design, corresponding estimates from the Ugandan data are 3.44 p.a. (mean duration 3.5 months) and 58% p.a. The apparent differences in clearance rates are probably mostly due to lower testing sensitivity in the Uganda study caused by differences in sample handling, with 'true' clearance rates being similar and adjusted incidence in Uganda being 28% p.a. Some differences are perhaps due to the sex workers receiving more frequent antibiotic treatment, while reinfection within ongoing sexual partnerships might have caused some of the apparently persistent infection in the London students. More information on partnership dynamics would allow more accurate estimates of natural-history parameters. Detailed studies in men are also required.
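In an exponential clearance model, mean infection duration is the reciprocal of the clearance rate, which reproduces the durations quoted above (a minimal sketch, not the authors' code):

```python
def mean_duration_months(clearance_rate_per_year: float) -> float:
    """Mean infection duration (months) implied by an exponential clearance model."""
    return 12.0 / clearance_rate_per_year

london = mean_duration_months(0.80)   # London students: 15.0 months
uganda = mean_duration_months(3.44)   # Ugandan cohort, unadjusted: ~3.5 months
print(london, round(uganda, 1))
```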

  14. The U. S. transportation sector in the year 2030: results of a two-part Delphi survey.

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, G.; Stephens, T.S. (Energy Systems); (Univ. of California at Davis); (ES)

    2011-10-11

    A two-part Delphi survey was given to transportation experts attending the Asilomar Conference on Transportation and Energy in August 2011. The survey asked respondents about trends in the US transportation sector in 2030. Topics included alternative vehicles, high-speed rail construction, rail freight transportation, average vehicle miles traveled, truck versus passenger car shares, vehicle fuel economy, and biofuels in different modes. The survey consisted of two rounds, both asking the same set of seven questions. In the first round, respondents were given a short introductory paragraph about the topic and asked to use their own judgment in their responses. In the second round, the respondents were asked the same questions, but were also given the results from the first round as guidance. The survey was sponsored by Argonne National Laboratory (ANL) and the National Renewable Energy Laboratory (NREL), and implemented by the University of California, Davis, Institute of Transportation Studies. It was part of the larger Transportation Energy Futures (TEF) project run by the Department of Energy, Office of Energy Efficiency and Renewable Energy. Of the 206 invitation letters sent, 94 recipients answered all questions in the first round (105 answered at least one question), and 23 of those answered all questions in the second round. Ten of the 23 second-round responses were collected at a discussion session at Asilomar, while the remainder were submitted online. Means and standard deviations of responses from Rounds One and Two are given in Table 1 below. One main purpose of Delphi surveys is to reduce the variance in opinions through successive rounds of questioning. As shown in Table 1, the standard deviations of 25 of the 30 individual sub-questions decreased between Round One and Round Two, although the decrease was slight in most cases.

  15. Marijuana Use among Juvenile Arrestees: A Two-Part Growth Model Analysis

    Science.gov (United States)

    Dembo, Richard; Wareham, Jennifer; Greenbaum, Paul E.; Childs, Kristina; Schmeidler, James

    2009-01-01

    This article examines the impact of sociodemographic characteristics and psychosocial factors on the probability and frequency of marijuana use and, for youths initiating use, on their frequency of use over four time points. The sample consists of 278 justice-involved youths completing at least one of three follow-up interviews as part of a…

  16. LOCATION OF ACYL GROUPS ON TWO PARTLY ACYLATED GLYCOLIPIDS FROM STRAINS OF USTILAGO (SMUT FUNGI),

    Science.gov (United States)

    erythritol from Ustilago sp. (probably U. nuda (Jens.) Rostr. = U. tritici (Pers.) Rostr.) PRL-627 were acetalated with methyl vinyl ether, deacylated...Partly acylated ustilagic acids 8 (from Ustilago maydis (DC) Corda (= U. zeae Unger) PRL-119), consisting of partially esterified beta-cellobiosyl

  17. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models with human interaction restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in natural language, so verification of the consistency of these diagrams is needed in order to identify requirements errors at an early stage of the development process. Such verification is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams derived from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Our method can therefore be used to check the feasibility of software architecture models.

  18. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  19. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  20. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion for unifying couplings is suggested, by applying the argument to more complex systems.

  1. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)

  2. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
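As a sketch of the Shapley rule the paper characterizes, each activity is charged its average marginal contribution to the delay cost over all orderings in which the activities could be assembled (the cost game below is hypothetical, for illustration only):

```python
from itertools import permutations
from math import factorial

def shapley(players, cost):
    """Shapley shares of a cost game: each player's marginal cost,
    averaged over all orderings in which the coalition can form."""
    totals = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            totals[p] += cost(coalition | {p}) - cost(coalition)
            coalition = coalition | {p}
    n_fact = factorial(len(players))
    return {p: t / n_fact for p, t in totals.items()}

# Hypothetical delay-cost game for three activities a, b, c.
v = {frozenset(): 0, frozenset("a"): 10, frozenset("b"): 20, frozenset("c"): 30,
     frozenset("ab"): 40, frozenset("ac"): 50, frozenset("bc"): 60,
     frozenset("abc"): 90}
shares = shapley(["a", "b", "c"], lambda s: v[s])
print(shares)  # {'a': 20.0, 'b': 30.0, 'c': 40.0}
```

Note that the shares sum to the total cost v(abc) = 90, i.e. the rule is efficient by construction.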

  3. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and stakeholder relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  4. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  5. Two-part zero-inflated negative binomial regression model for quantitative trait loci mapping with count trait.

    Science.gov (United States)

    Moghimbeigi, Abbas

    2015-05-07

    Poisson regression models provide a standard framework for quantitative trait locus (QTL) mapping of count traits. In practice, however, count traits are often over-dispersed relative to the Poisson distribution. In these situations, zero-inflated Poisson (ZIP), zero-inflated generalized Poisson (ZIGP) and zero-inflated negative binomial (ZINB) regression may be useful for QTL mapping of count traits. Adding genetic variables to the negative binomial part of the model may also affect the excess zeros. In this study, to overcome these challenges, I apply a two-part ZINB model. Parameters are estimated with an EM algorithm that uses the Newton-Raphson method in the M-step. An application of the two-part ZINB model to QTL mapping is considered, detecting associations between gallstone formation and marker genotypes. Copyright © 2015 Elsevier Ltd. All rights reserved.
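The ZINB distribution underlying the model mixes a point mass at zero with a negative binomial count distribution; below is a sketch of its pmf using scipy (illustrative only, not the paper's two-part EM implementation, and the parameter values are arbitrary):

```python
import numpy as np
from scipy.stats import nbinom

def zinb_pmf(y, pi, r, p):
    """ZINB pmf: with probability pi the count is a structural zero,
    otherwise it is drawn from a negative binomial NB(r, p)."""
    base = nbinom.pmf(y, r, p)
    return np.where(y == 0, pi + (1 - pi) * base, (1 - pi) * base)

y = np.arange(0, 200)
pmf = zinb_pmf(y, pi=0.3, r=2.0, p=0.4)
print(pmf[:3])  # inflated mass at zero, then NB-shaped tail
```

Zero inflation raises P(Y = 0) above the plain NB value, which is exactly the excess-zero behavior the regression model is designed to capture.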

  6. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
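The accounting behind the principle can be checked numerically: for any ensemble of state vectors, the mean total energy equals the energy of the ensemble mean plus the trace of the (ddof=0) ensemble covariance. A small illustrative sketch, not tied to any particular filter:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 3))           # ensemble of 3-dimensional state vectors
m = X.mean(axis=0)                         # ensemble mean state
P = np.cov(X, rowvar=False, ddof=0)        # ensemble covariance matrix

mean_energy = (X ** 2).sum(axis=1).mean()  # E ||x||^2 over the ensemble
split = m @ m + np.trace(P)                # ||mean||^2 + total variance
print(mean_energy, split)                  # identical up to floating-point error
```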

  7. Modelos de perfil de velocidad para evaluación de consistencia del trazado en carreteras de la provincia de Villa Clara, Cuba Speed profile models for evaluation of design consistency in road of the province of Villa Clara, Cuba

    Directory of Open Access Journals (Sweden)

    René A. García Depestre

    2012-08-01

    Full Text Available Among the road-related factors that influence accident rates, geometric design carries great weight; internationally, the most widely used method for evaluating a design is based on the consistency of the alignment, using operating-speed profile models. Cuba has no models of its own that account for the characteristics of its roads and drivers, so speed-prediction models must be developed to evaluate design consistency. Operating-speed profile prediction models for different alignment conditions on rural two-lane roads in the Cuban context are developed from geometric characteristics and spot speeds, with statistical analysis of the main variables relating speed to design. Once developed, the models are applied to a road section declared an accident concentration section (TCA) in the province of Villa Clara, located in the central region of Cuba; the results confirm the validity of the developed models for determining operating-speed profiles and thereby evaluating design consistency, with the aim of detecting the locations with the greatest alignment-related difficulties.

  8. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges for a large-scale data management system is to ensure consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (a.k.a. dark data). The system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we present the system, explain its internals and give some results.
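Conceptually, the consistency check reduces to set differences between a catalog dump and a storage dump; a toy sketch (file names hypothetical, not Rucio's actual API):

```python
catalog = {"data/f1", "data/f2", "data/f3"}   # files the catalog believes exist
storage = {"data/f2", "data/f3", "data/f4"}   # files actually on the storage element

lost_files = catalog - storage   # registered but missing -> candidates for recovery
dark_data  = storage - catalog   # present but unregistered -> candidates for deletion

print(sorted(lost_files))   # ['data/f1']
print(sorted(dark_data))    # ['data/f4']
```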

  9. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models

  10. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account Written by one of the main figures in the field Paperback edition of successful work on philosophy of quantum mechanics

  11. Designing end-user interfaces

    CERN Document Server

    Heaton, N

    1988-01-01

    Designing End-User Interfaces: State of the Art Report focuses on the field of human/computer interaction (HCI) as it relates to the design of end-user interfaces. This compilation is divided into two parts. Part I examines specific aspects of the problem in HCI, ranging from basic definitions of the problem, through evaluation of how to look at the problem domain, to fundamental work aimed at introducing human factors into all aspects of the design cycle. Part II consists of six main topics: definition of the problem, psychological and social factors, principles of interface design, computer intelligence...

  12. OCRWM Bulletin: Westinghouse begins designing multi-purpose canister

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This publication consists of two parts: OCRWM (Office of Civilian Radioactive Waste Management) Bulletin; and Of Mountains & Science which has articles on the Yucca Mountain project. The OCRWM provides information about OCRWM activities and in this issue has articles on multi-purpose canister design, and transportation cask trailer.

  13. OCRWM Bulletin: Westinghouse begins designing multi-purpose canister

    International Nuclear Information System (INIS)

    1995-01-01

    This publication consists of two parts: OCRWM (Office of Civilian Radioactive Waste Management) Bulletin; and Of Mountains & Science which has articles on the Yucca Mountain project. The OCRWM provides information about OCRWM activities and in this issue has articles on multi-purpose canister design, and transportation cask trailer

  14. Pricing policies for a two-part exhaustible resource cartel: the case of OPEC (world oil project). Working paper

    Energy Technology Data Exchange (ETDEWEB)

    Hnyilicza, E.; Pindyck, R.S.

    1976-04-01

    This paper examines pricing policies for OPEC under the assumption that the cartel is composed of a block of spender countries with large cash needs and a block of saver countries with little immediate need for cash and a lower rate of discount. The decision problem for the two-part cartel is embodied in a game-theoretic framework, and the optimal bargaining solution is computed using results from the theory of cooperative games developed by Nash. The set of feasible bargaining points, and the corresponding Nash solution, is computed under two assumptions about the behavior of output shares: that they are subject to choice, and that they are fixed at historical values. The results suggest that for fixed output shares there is little room for bargaining, and the price path approximates the optimal monopoly price path. If the shares are subject to control, optimal paths depend significantly on the relative bargaining power of each block.
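As a toy illustration of the Nash bargaining machinery used here (all numbers hypothetical): the solution maximizes the product of each block's gain over its disagreement payoff, which for a transferable surplus amounts to giving each block its disagreement payoff plus half the remaining surplus.

```python
# Split a joint cartel profit V between two blocks with
# disagreement payoffs d1 and d2 (hypothetical values).
V, d1, d2 = 100.0, 20.0, 10.0

# Nash bargaining: maximize (x - d1) * ((V - x) - d2) over block 1's share x.
best = max(range(101), key=lambda x: (x - d1) * (V - x - d2))
print(best)  # 55: d1 plus half the surplus (V - d1 - d2) / 2
```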

  15. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can the constraints simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e., can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and for consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm for the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence-similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.

  16. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.

  17. An assessment of innovative pricing schemes for the communication of value: is price discrimination and two-part pricing a way forward?

    Science.gov (United States)

    Hertzman, Peter; Miller, Paul; Tolley, Keith

    2018-02-01

    With the introduction of new expensive medicines, traditional pricing schemes based on constructs such as price per pill/vial have been challenged. Potential innovative schemes could be either financial-based or performance-based. Within financial-based schemes the use of price discrimination is an emerging option, which we explore in this assessment. Areas covered: In the short term the price per indication approach is likely to become more prevalent for high cost, high benefit new pharmaceuticals, such as those emerging in oncology (e.g. new combination immunotherapies). 'Two-Part Pricing' (2PP) is a frequently used payment method in other industries, which consists of an Entry Fee, giving the buyer the right to use the product, and a Usage Price charged every time the product is purchased. Introducing 2PP into biopharma could have cross-stakeholder benefits including broader patient access, and improvement in budget/revenue predictability. A concern however is the potential complexity of the negotiation between manufacturer and payer. Expert commentary: We believe 'price discrimination' and 2PP in particular can be relevant for some new, expensive specialist medicines. A recommended first step would be to initiate pilots to test to what degree the 2PP approach meets stakeholder objectives and is practical to implement within specialty care.
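As a toy illustration of the 2PP structure described above, the payer's total cost is a fixed entry fee plus a per-use price, so the average cost per unit falls with volume (all numbers hypothetical):

```python
def total_cost(entry_fee: float, usage_price: float, units: int) -> float:
    """Two-part pricing: a fixed entry fee plus a price charged per unit used."""
    return entry_fee + usage_price * units

# Hypothetical figures: a 1,000,000 entry fee and a 500-per-treatment usage price.
fee, price = 1_000_000.0, 500.0
avg_small = total_cost(fee, price, 1_000) / 1_000      # 1500.0 per patient
avg_large = total_cost(fee, price, 10_000) / 10_000    # 600.0 per patient
print(avg_small, avg_large)
```

The declining average cost is what makes 2PP attractive for broadening patient access while preserving revenue predictability for the manufacturer.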

  18. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  19. Human permanent teeth are divided into two parts at the cemento-enamel junction in the divine golden ratio.

    Science.gov (United States)

    Anand, Rahul; Sarode, Sachin C; Sarode, Gargi S; Patil, Shankargouda

    2017-01-01

    The aim of this study was to find out whether tooth length (crown length + root length) follows the rule of the most divine and mysterious number phi (ϕ), the golden ratio. A total of 140 teeth were included in the study. The crown-root ratio was manually calculated using a vernier caliper, and its approximation to the golden ratio, the divine number phi (ϕ), was examined. The average root-crown ratio (R/C) for the maxillary central incisor was 1.627 ± 0.04, and for its antagonist, the mandibular central incisor, 1.628 ± 0.02. The tooth-root ratios (T/R) for the same teeth were 1.609 ± 0.016 and 1.61 ± 0.008, respectively. Similar values were seen for the lateral incisors, where the R/C ratios in the maxillary and mandibular teeth were 1.632 ± 0.015 and 1.641 ± 0.012 and the T/R ratios were 1.606 ± 0.005 and 1.605 ± 0.005, respectively. On measuring the tooth length linearly from the cusp tip to the root apex, we found that the tooth is divided into two parts at the cemento-enamel junction in the golden ratio. This information can be exploited in restorative and implant dentistry in the future.
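For reference, the golden ratio is φ = (1 + √5)/2 ≈ 1.618; a quick sketch comparing it with the mean R/C ratios reported above:

```python
import math

phi = (1 + math.sqrt(5)) / 2  # the golden ratio, ~1.6180339887

# Mean root-crown ratios reported in the abstract above.
reported = {
    "maxillary central incisor R/C": 1.627,
    "mandibular central incisor R/C": 1.628,
    "maxillary lateral incisor R/C": 1.632,
    "mandibular lateral incisor R/C": 1.641,
}
for name, ratio in reported.items():
    print(f"{name}: {ratio}, deviation from phi = {abs(ratio - phi):.3f}")
```

Each reported mean lies within about 0.02 of φ, which is the sense in which the study calls the division "golden".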

  20. Reverse Revenue Sharing Contract versus Two-Part Tariff Contract under a Closed-Loop Supply Chain System

    Directory of Open Access Journals (Sweden)

    Zunya Shi

    2016-01-01

    Full Text Available The importance of remanufacturing has been recognized in research and practice. The integrated system combining the forward and reverse activities of supply chains is called a closed-loop supply chain (CLSC) system. Through coordination in the CLSC system, players obtain economic improvements. This paper studies the coordination performance of two types of contracts, the two-part tariff contract (TTC) and the reverse revenue sharing contract (RRSC), in a closed-loop system. Through mathematical analysis based on Stackelberg game theory, we find that the manufacturer can readily improve its profits and the retailer's collection effort by adjusting the ratio of the transfer collection price through the RRSC, and we give the function that calculates the best ratio of the transfer collection price, which may be a valuable reference for decision makers in practice. Our results also suggest that although the profits of the coordinated CLSC system are always higher than in the uncoordinated scenario, the RRSC is more favorable to the manufacturer than to the retailer, as the manufacturer captures a larger share of the system profits through the RRSC. The RRSC therefore makes closing the supply chain more attractive to manufacturers on economic grounds.

  1. Delirium, sedation and analgesia in the intensive care unit: a multinational, two-part survey among intensivists.

    Directory of Open Access Journals (Sweden)

    Alawi Luetz

    Full Text Available Analgesia, sedation and delirium management are important parts of intensive care treatment, as they are relevant to patients' clinical and functional long-term outcomes. Previous surveys showed that despite this fact, implementation rates are still low. The primary aim of this prospective, observational multicenter study was to investigate the implementation rate of delirium monitoring among intensivists. Secondly, current practice concerning analgesia and sedation monitoring, as well as treatment strategies for patients with delirium, was assessed. In addition, this study compares perceived and actual practice regarding delirium, sedation and analgesia management. Data were obtained with a two-part, anonymous survey, containing general data from intensive care units in the first part and data referring to individual patients in the second part. Questionnaires from 101 hospitals (part 1) and 868 patients (part 2) were included in the data analysis. Fifty-six percent of the intensive care units reported monitoring for delirium in clinical routine, and forty-four percent reported the use of a validated delirium score. In this respect, the survey suggests an increasing use of delirium assessment tools compared to previous surveys. Nevertheless, part two of the survey revealed that in actual practice 73% of the included patients were not monitored with a validated score. Furthermore, we observed a trend towards moderate or deep sedation, which contradicts guideline recommendations. Every fifth patient was suffering from pain, and the implementation rate of adequate pain-assessment tools for mechanically ventilated and sedated patients was low (30%). In conclusion, further efforts are necessary to implement guideline recommendations into clinical practice. The study was registered (ClinicalTrials.gov identifier: NCT01278524) and approved by the ethics committee.

  2. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
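The separation between controller pre-computation and switch-local execution can be illustrated with a toy update scheduler. This is a hedged sketch of one classic consistent-update heuristic (downstream-first rule installation), not the actual ez-Segway algorithm; switch names and the helper function are illustrative only.

```python
# Hedged sketch (not the ez-Segway implementation): install new next-hops
# starting from the switch closest to the destination. Any packet forwarded
# along a partially updated path then still reaches the destination, since
# every switch it meets on the new path already has its rule (no blackholes),
# and traffic only leaves the old path once the new one is complete (no loops).

def schedule_update(old_path, new_path):
    """Return the order in which switches should install the new rule.

    Switches on the new path are updated downstream-first (destination side
    first); switches only on the old path are cleaned up last, after the
    ingress has flipped traffic off them.
    """
    new_rule_order = list(reversed(new_path[:-1]))  # exclude the destination
    cleanup = [s for s in old_path if s not in new_path]
    return new_rule_order + cleanup

old = ["s1", "s2", "s3", "dst"]
new = ["s1", "s4", "s5", "dst"]
order = schedule_update(old, new)
print(order)  # ['s5', 's4', 's1', 's2', 's3']
```

Because s5 and s4 receive the new rules before s1 flips traffic onto them, no in-flight packet can hit a missing rule on the new path.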

  3. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

The guidelines presented in the US Department of Energy General Design Criteria (DOE 6430.1A) and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well-defined approach to determining the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of load combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines.

  4. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented.
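As a concrete illustration of the replica-consistency problem, here is a hedged sketch of a master-copy versioning scheme of the kind such a service might simulate; the class and method names are hypothetical, not EDG middleware APIs.

```python
# Hedged sketch of one simple consistency model: each update bumps a version
# on the master replica; a consistency service finds stale replicas by
# comparing versions against the master and brings them up to date.

class ReplicaCatalogue:
    def __init__(self):
        self.versions = {}  # (dataset, node) -> version number

    def register(self, dataset, node, version=0):
        self.versions[(dataset, node)] = version

    def update_master(self, dataset, master_node):
        self.versions[(dataset, master_node)] += 1

    def stale_replicas(self, dataset, master_node):
        master_v = self.versions[(dataset, master_node)]
        return [node for (ds, node), v in self.versions.items()
                if ds == dataset and v < master_v]

    def synchronize(self, dataset, master_node):
        master_v = self.versions[(dataset, master_node)]
        for node in self.stale_replicas(dataset, master_node):
            self.versions[(dataset, node)] = master_v

cat = ReplicaCatalogue()
for n in ["cern", "nikhef", "ral"]:
    cat.register("run42", n)
cat.update_master("run42", "cern")       # master copy modified
print(cat.stale_replicas("run42", "cern"))  # ['nikhef', 'ral']
cat.synchronize("run42", "cern")
print(cat.stale_replicas("run42", "cern"))  # []
```

Replicated catalogues themselves can be treated the same way, which is why the paper notes that metadata consistency is stricter than data consistency.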

  5. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  6. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

Problem solving for physics concepts through consistency arguments can improve students' thinking skills and is an important element of science. The study aims to assess the consistency of students' argumentation on the topic of fluids. The population of this study consists of college students at PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results of the study gave average argumentation consistency for correct consistency, wrong consistency, and inconsistency of 4.85%, 29.93%, and 65.23%, respectively. The data point to a lack of understanding of the fluid material, which under fully consistent argumentation would ideally support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies in order to obtain a positive change in the consistency of argumentation.

  7. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses...

  8. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  9. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the vivial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de]

  10. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  11. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  12. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  13. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

Full Text Available In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ ("E contradictione quodlibet") rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  14. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x... -anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  15. Glass consistency and glass performance

    International Nuclear Information System (INIS)

    Plodinec, M.J.; Ramsey, W.G.

    1994-01-01

    Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has on long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not glass durability

  16. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an

  17. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and it was found that the resulting tariff rises at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized, and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs
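The claim that the open-loop tariff rises at the rate of interest can be stated compactly. This is a sketch of the standard Hotelling-style growth condition under the abstract's assumptions (competitive suppliers, arbitraged oil price), not the paper's full derivation:

```latex
% Open-loop Nash tariff \tau_t growing at the interest rate r:
\tau_t = \tau_0 \, e^{rt}, \qquad \frac{\dot{\tau}_t}{\tau_t} = r .
```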

  18. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations

  19. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the growth of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether on a single host or across a network, will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule-validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
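The flavor of a set-theoretic rule-validation check can be sketched as follows. This is an illustrative reconstruction, not the paper's formalism; the rule names, match sets, and the two anomaly categories shown are made up for the example.

```python
# Hedged sketch: model each filtering rule's match condition as a set of
# addresses. Two rules conflict when their match sets overlap but their
# actions differ; a later rule is shadowed when an earlier rule with the
# same action already covers its entire match set.

def conflicts(rules):
    """rules: ordered list of (name, match_set, action) tuples."""
    found = []
    for i, (n1, m1, a1) in enumerate(rules):
        for n2, m2, a2 in rules[i + 1:]:
            if m1 & m2 and a1 != a2:      # overlapping matches, different actions
                found.append((n1, n2, "conflict"))
            elif m2 <= m1:                # later rule fully covered by earlier one
                found.append((n1, n2, "shadows"))
    return found

table = [
    ("r1", {"10.0.0.1", "10.0.0.2"}, "deny"),
    ("r2", {"10.0.0.2", "10.0.0.3"}, "allow"),  # overlaps r1 with other action
    ("r3", {"10.0.0.1"}, "deny"),               # fully covered by r1
]
print(conflicts(table))  # [('r1', 'r2', 'conflict'), ('r1', 'r3', 'shadows')]
```

Pairwise set operations like these are what make an automatic check tractable where manual inspection of geometrically growing tables is not.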

  20. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  1. Lagrangian multiforms and multidimensional consistency

    Energy Technology Data Exchange (ETDEWEB)

    Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2009-10-30

    We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.

  2. Consistency and Communication in Committees

    OpenAIRE

    Inga Deimen; Felix Ketelaar; Mark T. Le Quement

    2013-01-01

    This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...

  3. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
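The core idea, replacing a pixelwise loss with distances between hidden activations of a fixed network Φ, can be sketched in a few lines. Here Φ is stood in for by a single fixed random ReLU layer purely for illustration; in the paper Φ is a pre-trained deep CNN (e.g. VGG) and the loss is accumulated over several layers.

```python
# Hedged sketch of a feature-consistency loss: instead of comparing x and
# its reconstruction x_hat pixel by pixel, compare their activations under
# a fixed feature extractor phi. The random layer below is a stand-in for
# a pre-trained network, used only to make the example self-contained.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 64))        # fixed "pre-trained" weights (illustrative)

def phi(x):
    return np.maximum(W @ x, 0.0)    # one ReLU feature layer

def feature_loss(x, x_hat):
    return float(np.mean((phi(x) - phi(x_hat)) ** 2))

x = rng.normal(size=64)
print(feature_loss(x, x))            # 0.0: identical inputs, identical features
```

In a VAE training loop this term replaces (or augments) the pixelwise reconstruction term, while the KL regularizer on the latent code stays unchanged.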

  4. Design

    DEFF Research Database (Denmark)

    Volf, Mette

This publication is unique in its demystification and operationalization of the complex and elusive nature of the design process. The publication portrays the designer's daily work and the creative process, which the designer is a part of. Apart from displaying the designer's work methods and design parameters, the publication shows examples from renowned Danish design firms. Through these examples the reader gets an insight into the designer's reality.

  5. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures at different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for the Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging the experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based on the quality tests proposed for VLE data...

  6. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
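The two biodiversity metrics behind the hotspot designation, species richness and Shannon's diversity H' = -Σ p_i ln p_i, together with a simple mean-based threshold rule, can be sketched as follows. The thresholding shown is a simplified stand-in for the paper's spatial frequency distribution method, and the per-cell species counts are invented.

```python
# Hedged sketch: compute Shannon's diversity per grid cell from species
# counts, then flag cells whose diversity exceeds a mean-based threshold.
import math

def shannon(counts):
    """Shannon's H' = -sum(p_i * ln p_i) over species with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def hotspot_cells(cell_diversity, factor=1.0):
    """Flag cells whose diversity exceeds factor * mean across all cells."""
    mean = sum(cell_diversity.values()) / len(cell_diversity)
    return {cell for cell, h in cell_diversity.items() if h > factor * mean}

cells = {
    "A": shannon([10, 10, 10, 10]),  # even community: H' = ln 4 ≈ 1.386
    "B": shannon([37, 1, 1, 1]),     # dominated by one species: low H'
    "C": shannon([5, 5, 5]),         # H' = ln 3 ≈ 1.099
}
print(hotspot_cells(cells))  # {'A', 'C'} with these invented counts
```

Repeating such a designation year by year and checking how often each cell stays flagged is, in essence, the temporal-consistency question the study poses.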

  7. Design

    DEFF Research Database (Denmark)

    Volf, Mette

Design - proces & metode iBog® is unique in its focus on demystifying and operationalizing the elusive and complex nature of the design process. The publication goes behind the designer's daily work and gives an insight into the creative process that the designer is a part of. In addition to a broad insight into the designer's working methods and design parameters, Design - proces & metode provides a number of examples from renowned design firms, making it possible to get very close to the designer's reality.

  8. Design

    Science.gov (United States)

    Buchanan, Richard; Cross, Nigel; Durling, David; Nelson, Harold; Owen, Charles; Valtonen, Anna; Boling, Elizabeth; Gibbons, Andrew; Visscher-Voerman, Irene

    2013-01-01

    Scholars representing the field of design were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Richard Buchanan, Nigel Cross, David Durling, Harold Nelson, Charles Owen, and Anna Valtonen. Scholars…

  9. Consistency of color representation in smart phones.

    Science.gov (United States)

    Dain, Stephen J; Kwan, Benjamin; Wong, Leslie

    2016-03-01

    luminance (maximum brightness) was ±7%, 15%, 7%, and 15%, respectively. The iPhones have almost 2× the luminance. To accommodate differences between makes and models, dedicated color lookup tables will be necessary, but the variations within a model appear to be small enough that consistent color vision tests can be designed successfully.

  10. Design

    DEFF Research Database (Denmark)

    Jensen, Ole B.; Pettiway, Keon

    2017-01-01

In this chapter, Ole B. Jensen takes a situational approach to mobilities to examine how ordinary life activities are structured by technology and design. Using "staging mobilities" as a theoretical approach, Jensen considers mobilities as overlapping actions, interactions and decisions by design. He begins with a brief description of how movement is studied within social sciences after the "mobilities turn" versus the idea of physical movement in transport geography and engineering. He then explains how "mobilities design" was derived from connections between traffic and architecture. Jensen concludes by providing ideas about future research for investigating mobilities in situ as a kind of "staging," which he notes is influenced by the "material turn" in social sciences.

  11. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang; Chiesa, Marco; Canini, Marco

    2017-01-01

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes

  12. Design of a rotating-hearth furnace

    Energy Technology Data Exchange (ETDEWEB)

    Behrens, H A [Verein Deutscher Eisenhuettenleute (VDEh), Duesseldorf (Germany, F.R.)

    1979-10-01

    Presented in two parts, this paper is intended to provide an outline of the theoretical fundamentals for the design of rotating-hearth furnaces for heating round stock and deals with the characteristic design features of such furnaces.

  13. The benefits of molecular pathology in the diagnosis of musculoskeletal disease. Part I of a two-part review: soft tissue tumors

    International Nuclear Information System (INIS)

    Flanagan, Adrienne M.; Delaney, David; O'Donnell, Paul

    2010-01-01

    Bone and soft tissue metabolic and neoplastic diseases are increasingly characterized by their molecular signatures. This has resulted from increased knowledge of the human genome, which has contributed to the unraveling of molecular pathways in health and disease. Exploitation of this information has allowed it to be used for practical diagnostic purposes. The aim of the first part of this two-part review is to provide an up-to-date review of molecular genetic investigations that are available and routinely used by specialist musculoskeletal histopathologists in the diagnosis of neoplastic disease. Herein we focus on the benefits of employing well characterized somatic mutations in soft tissue lesions that are commonly employed in diagnostic pathology today. The second part highlights the known somatic and germline mutations implicated in osteoclast-rich lesions of bone, and the genetic changes that disturb phosphate metabolism and result in a variety of musculoskeletal phenotypes. Finally, a brief practical guide of how to use and provide a molecular pathology service is given. (orig.)

  14. Risk factors and outcomes of chronic sexual harassment during the transition to college: Examination of a two-part growth mixture model

    Science.gov (United States)

    McGinley, Meredith; Wolff, Jennifer M.; Rospenda, Kathleen M.; Liu, Li; Richman, Judith A.

    2016-01-01

    A two-part latent growth mixture model was implemented in order to examine heterogeneity in the growth of sexual harassment (SH) victimization in college and university students, and the extent to which SH class membership explains substance use and mental health outcomes for certain groups of students. Demographic risk factors, mental health, and substance use were examined as they related to chronically experienced SH victimization. Incoming freshmen students (N = 2855; 58% female; 54% White) completed a survey at five time points. In addition to self-reporting gender, race, and sexual orientation, students completed measures of sexual harassment, anxiety, depression, binge drinking, and marijuana use. Overall, self-reported SH declined upon college entry, although levels rebounded by the third year of college. Results also supported a two-class solution (Infrequent and Chronic) for SH victimization. Being female, White, and a sexual minority were linked to being classified into the Chronic (relative to the Infrequent) SH class. In turn, Chronic SH class membership predicted greater anxiety, depression, and substance use, supporting a mediational model. PMID:27712687

  15. Risk factors and outcomes of chronic sexual harassment during the transition to college: Examination of a two-part growth mixture model.

    Science.gov (United States)

    McGinley, Meredith; Wolff, Jennifer M; Rospenda, Kathleen M; Liu, Li; Richman, Judith A

    2016-11-01

    A two-part latent growth mixture model was implemented in order to examine heterogeneity in the growth of sexual harassment (SH) victimization in college and university students, and the extent to which SH class membership explains substance use and mental health outcomes for certain groups of students. Demographic risk factors, mental health, and substance use were examined as they related to chronically experienced SH victimization. Incoming freshmen students (N = 2855; 58% female; 54% White) completed a survey at five time points. In addition to self-reporting gender, race, and sexual orientation, students completed measures of sexual harassment, anxiety, depression, binge drinking, and marijuana use. Overall, self-reported SH declined upon college entry, although levels rebounded by the third year of college. Results also supported a two-class solution (Infrequent and Chronic) for SH victimization. Being female, White, and a sexual minority were linked to being classified into the Chronic (relative to the Infrequent) SH class. In turn, Chronic SH class membership predicted greater anxiety, depression, and substance use, supporting a mediational model. Copyright © 2016 Elsevier Inc. All rights reserved.
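The "two-part" idea itself, one component for whether any harassment is reported at all and one for the level given that it is nonzero, can be illustrated with a toy estimator. The paper's model is a longitudinal latent-class (growth mixture) extension of this, not the simple version below; the scores are hypothetical.

```python
# Hedged sketch of a two-part model: part 1 estimates the probability of a
# nonzero outcome, part 2 estimates the mean outcome among nonzero cases.
# Their product gives the combined (marginal) mean.

def two_part_fit(y):
    positives = [v for v in y if v > 0]
    p_nonzero = len(positives) / len(y)               # part 1: occurrence
    mean_positive = sum(positives) / len(positives)   # part 2: intensity
    return p_nonzero, mean_positive, p_nonzero * mean_positive

scores = [0, 0, 0, 2.0, 4.0, 0, 6.0, 0]              # hypothetical SH scores
p, m, marginal = two_part_fit(scores)
print(p, m, marginal)  # 0.375 4.0 1.5
```

Splitting the zeros out this way is what lets the model handle outcome distributions with a large point mass at zero, as self-reported victimization data typically have.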

  16. Phase I, Dose-Escalation, Two-Part Trial of the PARP Inhibitor Talazoparib in Patients with Advanced Germline BRCA1/2 Mutations and Selected Sporadic Cancers.

    Science.gov (United States)

    de Bono, Johann; Ramanathan, Ramesh K; Mina, Lida; Chugh, Rashmi; Glaspy, John; Rafii, Saeed; Kaye, Stan; Sachdev, Jasgit; Heymach, John; Smith, David C; Henshaw, Joshua W; Herriott, Ashleigh; Patterson, Miranda; Curtin, Nicola J; Byers, Lauren Averett; Wainberg, Zev A

    2017-06-01

Talazoparib inhibits PARP catalytic activity, trapping PARP1 on damaged DNA and causing cell death in BRCA1/2-mutated cells. We evaluated talazoparib therapy in this two-part, phase I, first-in-human trial. Antitumor activity, MTD, pharmacokinetics, and pharmacodynamics of once-daily talazoparib were determined in an open-label, multicenter, dose-escalation study (NCT01286987). The MTD was 1.0 mg/day, with an elimination half-life of 50 hours. Treatment-related adverse events included fatigue (26/71 patients; 37%) and anemia (25/71 patients; 35%). Grade 3 to 4 adverse events included anemia (17/71 patients; 24%) and thrombocytopenia (13/71 patients; 18%). Sustained PARP inhibition was observed at doses ≥0.60 mg/day. At 1.0 mg/day, confirmed responses were observed in 7 of 14 (50%) and 5 of 12 (42%) patients with BRCA mutation-associated breast and ovarian cancers, respectively, and in patients with pancreatic and small cell lung cancer. Talazoparib demonstrated single-agent antitumor activity and was well tolerated in patients at the recommended dose of 1.0 mg/day. Significance: In this clinical trial, we show that talazoparib has single-agent antitumor activity and a tolerable safety profile. At its recommended phase II dose of 1.0 mg/day, confirmed responses were observed in patients with BRCA mutation-associated breast and ovarian cancers and in patients with pancreatic and small cell lung cancer. Cancer Discov; 7(6); 620-9. ©2017 AACR. This article is highlighted in the In This Issue feature, p. 539. ©2017 American Association for Cancer Research.

  17. Consistency Anchor Formalization and Correctness Proofs

    OpenAIRE

    Correia, Miguel; Bessani, Alysson

    2014-01-01

    This report contains the formal proofs for the techniques for increasing the consistency of cloud storage as presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...

  18. Consistent estimation of Gibbs energy using component contributions.

    Directory of Open Access Journals (Sweden)

    Elad Noor

    Full Text Available Standard Gibbs energies of reactions are increasingly being used in metabolic modeling for applying thermodynamic constraints on reaction rates, metabolite concentrations and kinetic parameters. The increasing scope and diversity of metabolic models has led scientists to look for genome-scale solutions that can estimate the standard Gibbs energy of all the reactions in metabolism. Group contribution methods greatly increase coverage, albeit at the price of decreased precision. We present here a way to combine the estimations of group contribution with the more accurate reactant contributions by decomposing each reaction into two parts and applying one of the methods on each of them. This method gives priority to the reactant contributions over group contributions while guaranteeing that all estimations will be consistent, i.e. will not violate the first law of thermodynamics. We show that there is a significant increase in the accuracy of our estimations compared to standard group contribution. Specifically, our cross-validation results show an 80% reduction in the median absolute residual for reactions that can be derived by reactant contributions only. We provide the full framework and source code for deriving estimates of standard reaction Gibbs energy, as well as confidence intervals, and believe this will facilitate the wide use of thermodynamic data for a better understanding of metabolism.
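
    The combination rule described above — prefer the more accurate reactant-level estimate and fall back to group contributions, with thermodynamic consistency following from additivity — can be sketched as follows. All compound names, group names, and energy values here are hypothetical illustrations, not data from the component contribution framework itself:

```python
# Hypothetical toy data (illustrative values in kJ/mol, not real estimates):
# reactant-level formation energies for well-measured compounds, and
# group-level contributions for compounds covered only by group decomposition.
reactant_dfG = {"glucose": -915.9}
group_dfG = {"phosphate": -1059.0, "hexose_core": -900.0}
group_counts = {"G6P": {"hexose_core": 1, "phosphate": 1}}  # glucose-6-phosphate

def formation_energy(compound):
    """Prefer the reactant-level estimate; fall back to summing group values."""
    if compound in reactant_dfG:
        return reactant_dfG[compound]
    return sum(group_dfG[g] * n for g, n in group_counts[compound].items())

def reaction_energy(stoich):
    """Standard reaction Gibbs energy from stoichiometry {compound: coefficient}."""
    return sum(c * formation_energy(m) for m, c in stoich.items())

# Toy reaction glucose -> G6P (phosphate bookkeeping ignored in this toy).
drG = reaction_energy({"glucose": -1, "G6P": 1})
```

    Because every estimate is a fixed sum of per-component values, energies are additive over reaction paths, so any closed cycle sums to zero — the first-law consistency the abstract guarantees.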

  19. Speed Consistency in the Smart Tachograph.

    Science.gov (United States)

    Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco

    2018-05-16

    In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging misconduct, such as driving while fatigued, that can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which could otherwise be manipulated to evade the monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles in light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, to latencies between GNSS and odometry data, and to simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way, which makes fraud more difficult to implement than in the current version of the DT.
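
    A minimal sketch of this kind of cross-sensor plausibility check, assuming a simple mean-absolute-difference metric with gap handling (the actual metrics, thresholds, and synchronization logic of the ST regulation are not reproduced here):

```python
def speeds_consistent(gnss_kmh, odo_kmh, tol_kmh=10.0, min_valid=0.5):
    """Flag a motion conflict when GNSS and odometry speeds disagree.

    Hypothetical check (not the actual ST regulation test): samples where
    either source is missing (None, e.g. a GNSS outage in a tunnel) are
    skipped; the pair is consistent if enough samples remain valid and the
    mean absolute difference stays within tol_kmh.
    """
    diffs = [abs(g - o) for g, o in zip(gnss_kmh, odo_kmh)
             if g is not None and o is not None]
    if len(diffs) < min_valid * len(gnss_kmh):
        return False  # too many gaps to decide; treat as a conflict
    return sum(diffs) / len(diffs) <= tol_kmh

gnss = [50.1, 49.8, None, 51.0, 80.2]   # None models a tunnel data gap
odo  = [50.0, 50.5, 50.9, 50.7, 80.0]
ok = speeds_consistent(gnss, odo)        # True: gap tolerated, speeds agree
scaled = [o * 0.5 for o in odo]          # simplistic manipulation: data scaling
bad = speeds_consistent(gnss, scaled)    # False: the scaling is detected
```

    As the abstract notes, defeating such a check requires falsifying both sensor streams coherently, since scaling only one of them immediately breaks the comparison.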

  20. Optimal Design of Piezoelectric Materials for Maximal Energy Harvesting

    Science.gov (United States)

    2015-06-01

    constant coefficients consists of two parts, the complementary function, which is the solution of the corresponding homogeneous equation, and the... mY0ω² cos(ωt). (2.6) Matching the coefficients for cos(ωt) and sin(ωt), respectively, we find (Ak − Amω²) cos φ + Adω sin φ = mY0ω², (2.7) (k − mω²) sin φ... trical vibration based generator. Their design uses an 8.44 gram mass and a two-layer sheet of PZT-5A with a steel center shim excited by vibrations of

  1. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady-states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new more general approach to implementing hull consistency is suggested which consists in treating simultaneously several equations with respect to the same number of variables.
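
    As an illustration of the scalar (equation-by-equation) form of hull consistency that the poster generalises, here is a hedged sketch for a single linear constraint x + y = s: the constraint is projected onto each variable in turn and intersected with the current interval box. The function name and interval representation are illustrative only:

```python
def hull_narrow_sum(x, y, s):
    """Narrow intervals x and y subject to x + y = s (s a point value).

    Scalar hull consistency: project the constraint onto each variable
    in turn and intersect the projection with the current box. Intervals
    are (lo, hi) tuples.
    """
    def intersect(a, b):
        lo, hi = max(a[0], b[0]), min(a[1], b[1])
        if lo > hi:
            raise ValueError("empty intersection: no solution in the box")
        return (lo, hi)

    x = intersect(x, (s - y[1], s - y[0]))   # projection x = s - y
    y = intersect(y, (s - x[1], s - x[0]))   # projection y = s - x
    return x, y

x, y = hull_narrow_sum((0.0, 10.0), (1.0, 2.0), 5.0)
# x narrows to (3.0, 4.0); y stays (1.0, 2.0); no solution is lost
```

    The simultaneous approach suggested in the poster would instead treat several such projections, over the same number of variables, in one step.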

  2. STP: A mathematically and physically consistent library of steam properties

    International Nuclear Information System (INIS)

    Aguilar, F.; Hutter, A.C.; Tuttle, P.G.

    1982-01-01

    A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package

  3. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  4. Translationally invariant self-consistent field theories

    International Nuclear Information System (INIS)

    Shakin, C.M.; Weiss, M.S.

    1977-01-01

    We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables

  5. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...

  6. Consistent-handed individuals are more authoritarian.

    Science.gov (United States)

    Lyle, Keith B; Grillo, Michael C

    2014-01-01

    Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.

  7. Consistent spectroscopy for a extended gauge model

    International Nuclear Information System (INIS)

    Oliveira Neto, G. de.

    1990-11-01

    A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) gauge symmetry. By consistent spectroscopy is meant the determination of the quantum physical properties described by the model in a manner independent of the possible parametrizations adopted in their description. (L.C.J.A.)

  8. Predictive tools for designing new insulins and treatment regimens

    DEFF Research Database (Denmark)

    Klim, Søren

    The thesis deals with the development of "Predictive tools for designing new insulins and treatment regimens" and consists of two parts: a model-based approach for bridging properties of new insulin analogues from glucose clamp experiments to meal tolerance tests (MTT), and a second part that describes an implemented software program able to handle stochastic differential equations (SDEs) with mixed effects. The thesis is supplemented with scientific papers published during the PhD. Developing an insulin analogue from candidate molecule to a clinical drug consists of a development programme...... and efficacy are investigated. Numerous methods are used to quantify dose and efficacy in Phase II; of special interest is the 24-hour meal tolerance test, as it tries to portray near-normal living conditions. Part I describes an integrated model for insulin and glucose which is aimed at simulating 24-hour...

  9. Dispersion sensitivity analysis & consistency improvement of APFSDS

    Directory of Open Access Journals (Sweden)

    Sangeeta Sharma Panda

    2017-08-01

    In-bore balloting motion simulation shows that a reduction in residual spin of about 5% results in a drastic 56% reduction in first maximum yaw, and a correlation between first maximum yaw and residual spin is observed. Results of the data analysis are used in design modifications of the existing ammunition. A number of designs are evaluated numerically before five designs are frozen for further soundings. These designs are critically assessed in terms of their comparative performance during the in-bore travel and external ballistics phases. Results are validated by free-flight trials of the finalised design.

  10. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable...... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how...... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  11. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  12. Self-consistent areas law in QCD

    International Nuclear Information System (INIS)

    Makeenko, Yu.M.; Migdal, A.A.

    1980-01-01

    The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new, manifestly gauge-invariant perturbation theory in loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution.

  13. Consistency of the MLE under mixture models

    OpenAIRE

    Chen, Jiahua

    2016-01-01

    The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...

  14. Consistent measurements comparing the drift features of noble gas mixtures

    CERN Document Server

    Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y

    1999-01-01

    We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.

  15. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
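
    The "normality" and "orthogonality" conditions mentioned above are easy to verify numerically: when the proxy portfolio is built from the very assets being regressed, the proxy-weighted betas must sum to one and the proxy-weighted alphas to zero. A small self-contained sketch with synthetic returns and equal weights (illustrative only, not the paper's S&P500 diagnostics):

```python
import random

random.seed(0)
n_assets, n_obs = 5, 400
# Synthetic asset returns with differing means; purely illustrative.
returns = [[random.gauss(0.001 * (i + 1), 0.02) for _ in range(n_obs)]
           for i in range(n_assets)]
w = [1.0 / n_assets] * n_assets                        # equal-weight proxy
proxy = [sum(w[i] * returns[i][t] for i in range(n_assets))
         for t in range(n_obs)]

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)

# Ordinary OLS regression of each asset on the proxy built from those assets.
betas  = [cov(r, proxy) / cov(proxy, proxy) for r in returns]
alphas = [mean(r) - b * mean(proxy) for r, b in zip(returns, betas)]

# Self-consistency conditions: weighted betas sum to 1, weighted alphas to 0.
beta_sum  = sum(wi * bi for wi, bi in zip(w, betas))   # ≈ 1 up to rounding
alpha_sum = sum(wi * ai for wi, ai in zip(w, alphas))  # ≈ 0 up to rounding
```

    The identities hold by construction for any return data, because covariance is linear in its first argument; departures observed with a real proxy therefore signal that the proxy differs from the true market portfolio.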

  16. Consumer-Choice Health plan (second of two parts). A national-health-insurance proposal based on regulated competition in the private sector.

    Science.gov (United States)

    Enthoven, A C

    1978-03-30

    Medical costs are straining public finances. Direct economic regulation will raise costs, retard beneficial innovation and be increasingly burdensome to physicians. As an alternative, I suggest that the government change financial incentives by creating a system of competing health plans in which physicians and consumers can benefit from using resources wisely. Main proposals consist of changed tax laws, Medicare and Medicaid to subsidize individual premium payments by an amount based on financial and predicted medical need, as well as subsidies usable only for premiums in qualified health insurance or delivery plans operating under rules that include periodic open enrollment, community rating by actuarial category, premium rating by market area and a limit on each person's out-of-pocket costs. Also, efficient systems should be allowed to pass on the full savings to consumers. Finally, incremental changes should be made in the present system to alter it fundamentally, but gradually and voluntarily. Freedom of choice for consumers and physicians should be preserved.

  17. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics.

  18. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics

  19. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  20. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
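
    For context, the traditional unidirectional mass-diffusion baseline that the letter improves upon can be sketched on a toy bipartite data set. The bidirectional, consistence-based variant proposed in the letter is not reproduced here; the data and function name are illustrative:

```python
# Toy user-item bipartite data: user -> set of collected items.
collected = {
    "u1": {"a", "b"},
    "u2": {"b", "c"},
    "u3": {"a", "c", "d"},
}

def mass_diffusion_scores(target, data):
    """Classic unidirectional mass diffusion on a bipartite graph.

    Each item the target user collected sends one unit of resource to the
    users who collected it, split equally by item degree; each user then
    redistributes their resource equally over their own items.
    """
    item_degree = {}
    for items in data.values():
        for it in items:
            item_degree[it] = item_degree.get(it, 0) + 1
    # Step 1: items -> users.
    user_res = {}
    for it in data[target]:
        for u, items in data.items():
            if it in items:
                user_res[u] = user_res.get(u, 0.0) + 1.0 / item_degree[it]
    # Step 2: users -> items.
    scores = {}
    for u, res in user_res.items():
        for it in data[u]:
            scores[it] = scores.get(it, 0.0) + res / len(data[u])
    # Recommend only items the target has not collected yet.
    return {it: s for it, s in scores.items() if it not in data[target]}

scores = mass_diffusion_scores("u1", collected)
# For u1 this ranks item "c" (reachable via two neighbours) above item "d"
```

    The letter's argument is that running this diffusion in one direction only biases the similarity estimate; its consistence-based algorithm additionally diffuses from candidate items back toward the collected ones.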

  1. Financial model calibration using consistency hints.

    Science.gov (United States)

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
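
    The core idea — augmenting a curve-fitting error with a Kullback-Leibler consistency-hint penalty — can be illustrated on a one-parameter toy problem. The objective, the weight lam, and the grid search below are hypothetical stand-ins, not the paper's multifactor Vasicek calibration or its EM-type optimizer:

```python
import math

def kl(p, q):
    """Kullback-Leibler distance between two Bernoulli parameters."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def calibrate(fit_target, hint_prior, lam=0.5, steps=10001):
    """Minimize squared fitting error plus a KL consistency-hint penalty.

    Hypothetical one-parameter version of the scheme: the hint term pulls
    the calibrated parameter toward a value that keeps it meaningful,
    rather than letting pure curve fitting dominate.
    """
    best_p, best_err = None, float("inf")
    for k in range(1, steps - 1):          # exclude p = 0 and p = 1
        p = k / (steps - 1)
        err = (p - fit_target) ** 2 + lam * kl(p, hint_prior)
        if err < best_err:
            best_p, best_err = p, err
    return best_p

p = calibrate(fit_target=0.9, hint_prior=0.5, lam=0.5)
# p lands strictly between the pure fit (0.9) and the hint prior (0.5)
```

    Balancing lam against the fitting error plays the role of the canonical-error weighting mentioned in the abstract: as lam grows, the hint dominates; as it shrinks, the result approaches plain curve fitting.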

  2. Visually defining and querying consistent multi-granular clinical temporal abstractions.

    Science.gov (United States)

    Combi, Carlo; Oliboni, Barbara

    2012-02-01

    the component abstractions. Moreover, we propose a visual query language where different temporal abstractions can be composed to build complex queries: temporal abstractions are visually connected through the usual logical connectives AND, OR, and NOT. The proposed visual language allows one to simply define temporal abstractions by using intuitive metaphors, and to specify temporal intervals related to abstractions by using different temporal granularities. The physician can interact with the designed and implemented tool by point-and-click selections, and can visually compose queries involving several temporal abstractions. The evaluation of the proposed granularity-related metaphors consisted of two parts: (i) solving 30 interpretation exercises by choosing the correct interpretation of a given screenshot representing a possible scenario, and (ii) solving a complex exercise by visually specifying through the interface a scenario described only in natural language. The exercises were done by 13 subjects. The percentage of correct answers to the interpretation exercises was slightly different across the considered metaphors (54.4--striped wall, 73.3--plastered wall, 61--brick wall, and 61--no wall), but post hoc statistical analysis on means confirmed that the differences were not statistically significant. The results of the user-satisfaction questionnaire on the proposed granularity-related metaphors confirmed that there is no preference for any one of them. The evaluation of the proposed logical notation consisted of two parts: (i) solving five interpretation exercises, each provided as a screenshot representing a possible scenario together with three possible interpretations, of which only one was correct, and (ii) solving five exercises by visually defining through the interface a scenario described only in natural language. The exercises had increasing difficulty. The evaluation involved a total of 31 subjects.
Results related to

  3. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis is discussed. The conversion of α s1 -casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  4. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...

  5. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

    Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we evaluated only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.

  6. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  7. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among images under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color difference between images is large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First of all, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose the histogram extreme point matching algorithm, which is robust to image geometric misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics are transferred to the others via the paths of graph analysis. Thus, the final results via global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones sufficiently demonstrate that the proposed approach can achieve as good or even better results compared with the state-of-the-art approaches.
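
    A much-simplified stand-in for this kind of pairwise color correction is a linear transfer that matches overlap statistics between two images. The paper's histogram extreme point matching and global graph-based optimization are not reproduced here; the function, data, and statistics-matching rule below are illustrative only:

```python
def match_to_reference(ref_overlap, src_overlap, src_pixels):
    """Linear color transfer: scale and shift src so its overlap statistics
    (mean and standard deviation) match those of the reference overlap.
    A simple hypothetical stand-in for pairwise color correction.
    """
    def mean(xs):
        return sum(xs) / len(xs)

    def std(xs):
        m = mean(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

    gain = std(ref_overlap) / std(src_overlap)
    bias = mean(ref_overlap) - gain * mean(src_overlap)
    return [gain * p + bias for p in src_pixels]

ref = [100, 120, 140]   # overlap intensities seen by the reference image
src = [80, 100, 120]    # the same scene points seen by the source image
corrected = match_to_reference(ref, src, src)
# corrected overlap now matches the reference: [100.0, 120.0, 140.0]
```

    In the paper's setting, such pairwise corrections are not applied independently; they initialize a global optimization whose reference subset anchors the final, consistent appearance.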

  8. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  9. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  10. Dynamic phonon exchange requires consistent dressing

    International Nuclear Information System (INIS)

    Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.

    1976-01-01

    It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful to introduce single-particle self-energy insertions in a consistent manner.

  11. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  12. Consistency of the postulates of special relativity

    International Nuclear Information System (INIS)

    Gron, O.; Nicola, M.

    1976-01-01

    In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated

  13. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  14. Consistency analysis of network traffic repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  15. Evidence for Consistency of the Glycation Gap in Diabetes

    OpenAIRE

    Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.

    2011-01-01

    OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...

  16. A consistent interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Omnes, Roland

    1990-01-01

    Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths' consistent histories. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and accordingly to prove the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation, but they can be proved never to give rise to any logical inconsistency or paradox. (author)

  17. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to the wisdom of crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to those of the three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  18. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
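
    As a rough illustration of the paper's regression specification (all numbers below are simulated for illustration, not the study's data), one can regress grades on effort, consistency, motivation, and marginal learning and inspect the estimated coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 212  # sample size matching the study

# Hypothetical regressors mirroring the paper's variables
effort = rng.uniform(200, 2000, n)        # total minutes spent online
consistency = rng.uniform(5, 120, n)      # time variation (lower = more consistent)
gpa = rng.uniform(2.0, 4.0, n)            # proxy for motivation
learning = rng.uniform(-10, 40, n)        # post-test minus pre-test

# Simulated grades: depend on consistency, GPA and marginal learning, not effort
grade = 60 - 0.1 * consistency + 8 * gpa + 0.3 * learning + rng.normal(0, 3, n)

# Ordinary least squares with an intercept
X = np.column_stack([np.ones(n), effort, consistency, gpa, learning])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
```

    In this simulated setup the estimated coefficient on time variation comes out negative (more consistency, higher grade) while the coefficient on total minutes is close to zero, mirroring the qualitative pattern the paper reports.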

  19. Consistency relation for cosmic magnetic fields

    DEFF Research Database (Denmark)

    Jain, R. K.; Sloth, M. S.

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  20. Consistent Estimation of Partition Markov Models

    Directory of Open Access Journals (Sweden)

    Jesús E. García

    2017-04-01

    Full Text Available The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer two questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated? To answer these questions, we build a consistent strategy for model selection which consists of the following: given a size-n realization of the process, find a model within the Partition Markov class with a minimal number of parts that represents the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian, then, eventually, as n goes to infinity, L will be retrieved. We show an application to modeling internet navigation patterns.
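
    The core idea can be sketched in a few lines: estimate the empirical transition row of each state from a single realization, then merge states whose rows are (nearly) equal. The tolerance-based merge below is a simplified, hypothetical stand-in for the consistent model-selection criterion developed in the paper:

```python
from collections import defaultdict

def transition_rows(sequence, alphabet):
    """Empirical transition probabilities P(next | current) from one realization."""
    counts = {s: defaultdict(int) for s in alphabet}
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    rows = {}
    for s in alphabet:
        total = sum(counts[s].values())
        rows[s] = {t: counts[s][t] / total for t in alphabet} if total else None
    return rows

def partition_states(rows, tol=0.05):
    """Greedily merge states whose transition rows differ by at most tol
    in total variation -- a crude stand-in for the paper's selection rule."""
    parts = []
    for s, row in rows.items():
        if row is None:
            continue
        for part in parts:
            ref = rows[part[0]]
            tv = 0.5 * sum(abs(row[t] - ref[t]) for t in row)
            if tv <= tol:
                part.append(s)
                break
        else:
            parts.append([s])
    return parts

# States 'a' and 'c' always move to 'b', so they share one part of L
seq = "abcb" * 5
parts = partition_states(transition_rows(seq, "abc"))
```

    For this toy realization, the recovered partition groups 'a' and 'c' together and leaves 'b' in its own part, so three states are described by two transition rows.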

  1. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a nonnormative and constitutive internal branding process; in particular, the paper places emphasis on the role and kinds of communication practices as a central part of that process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such communication creates the organizational conditions adequate to sustain these behaviours.

  2. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of the giant dipole resonances (GDR) of ¹⁴⁸,¹⁵⁴Sm, of the first excited 2⁺ states of Sn isotopes, and of the first excited 3⁻ states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B(Eλ) values. (author)

  3. Evaluating Temporal Consistency in Marine Biodiversity Hotspots

    OpenAIRE

    Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...

  4. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study shows how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  5. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...

  6. Autonomous Navigation with Constrained Consistency for C-Ranger

    Directory of Open Access Journals (Sweden)

    Shujing Zhang

    2014-06-01

    Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to achieve truly autonomous navigation. However, the consistency problem of SLAM systems has been greatly ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which was developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
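
    The LC idea can be isolated in a single measurement update: the Jacobian is evaluated at a linearization point frozen at the start of the local period rather than at the latest estimate. The toy sketch below (a range-to-origin measurement; the helper and all numbers are illustrative, not the C-Ranger implementation) shows the mechanics:

```python
import numpy as np

def ekf_update(x, P, z, h, H_jac, R, x_lin):
    """EKF measurement update with the Jacobian evaluated at x_lin, a
    linearization point frozen at the start of the local period (the LC idea);
    passing x_lin = x recovers the standard EKF update."""
    H = H_jac(x_lin)                    # linearize at the frozen point
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ (z - h(x))          # innovation still uses the latest x
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy 2D state with a nonlinear range measurement from the origin
h = lambda s: np.array([np.hypot(s[0], s[1])])
H_jac = lambda s: np.array([[s[0] / np.hypot(s[0], s[1]),
                             s[1] / np.hypot(s[0], s[1])]])

x, P = np.array([3.0, 4.0]), np.eye(2)
z, R = np.array([5.2]), np.array([[0.25]])
x_new, P_new = ekf_update(x, P, z, h, H_jac, R, x_lin=x)
```

    Subsequent updates within the same local period would keep passing the same `x_lin` instead of the freshly updated state, which is what constrains the filter toward local consistency.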

  7. Consistency relations in effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find that the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression from EFT to SPT results, which scales as the square of the wavenumber k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed-limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.

  8. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  9. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
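
    The study's consistency measure is simply the dispersion of clinic-level performance within a system: the lower the standard deviation across a system's clinics, the more consistent the system. A minimal sketch (the scores below are hypothetical, not the Minnesota data):

```python
from statistics import pstdev

def system_consistency(clinic_scores):
    """Consistency of a health system, measured as in the study by the
    standard deviation of performance across its clinics (lower = more consistent)."""
    return pstdev(clinic_scores)

# Hypothetical optimal-diabetes-care rates for clinics in two systems
system_a = [0.41, 0.43, 0.42, 0.44]   # tightly clustered performance
system_b = [0.25, 0.45, 0.30, 0.50]   # similar average, more dispersed
```

    System A would count as the more consistent one even though the two systems' mean performance is comparable, which is exactly the distinction the study draws between consistency and performance level.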

  10. Consistent vapour-liquid equilibrium data containing lipids

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Consistent physical and thermodynamic properties of pure components and their mixtures are important for process design, simulation, and optimization, as well as for the design of chemical based products. In the case of lipids, a lack of experimental data for pure compounds and for their mixtures was observed in the open literature, which makes the development of reliable predictive models based on limited data necessary. To contribute to the missing data, measurements of isobaric vapour-liquid equilibrium (VLE) data of three binary mixtures at two different pressures were performed at State University

  11. Consistent creep and rupture properties for creep-fatigue evaluation

    International Nuclear Information System (INIS)

    Schultz, C.C.

    1978-01-01

    The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide adequate predictions. The viability of using consistent properties (either actual or those of a minimum strength heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations

  12. Consistent creep and rupture properties for creep-fatigue evaluation

    International Nuclear Information System (INIS)

    Schultz, C.C.

    1979-01-01

    The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide reasonable predictions. The viability of using consistent properties (either actual or those of a minimum strength heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations. 12 refs

  13. Design of a rotary reactor for chemical-looping combustion. Part 1: Fundamentals and design methodology

    KAUST Repository

    Zhao, Zhenlong

    2014-04-01

    Chemical-looping combustion (CLC) is a novel and promising option for several applications including carbon capture (CC), fuel reforming, H₂ generation, etc. Previous studies demonstrated the feasibility of performing CLC in a novel rotary design with micro-channel structures. In the reactor, a solid wheel rotates between the fuel and air streams at the reactor inlet, and depleted air and product streams at the exit. The rotary wheel consists of a large number of micro-channels with oxygen carriers (OC) coated on the inner surface of the channel walls. In the CC application, the OC oxidizes the fuel while the channel is in the fuel zone to generate undiluted CO₂, and is regenerated while the channel is in the air zone. In this two-part series, the effect of the reactor design parameters is evaluated and the performance with different OCs is compared. In Part 1, the design objectives and criteria are specified and the key parameters controlling the reactor performance are identified. The fundamental effects of the OC characteristics, the design parameters, and the operating conditions are studied. The design procedures are presented on the basis of the relative importance of each parameter, enabling a systematic methodology for selecting the design parameters and the operating conditions with different OCs. Part 2 presents the application of the methodology to designs with the three commonly used OCs, i.e., nickel, copper, and iron, and compares the simulated performances of the designs. © 2013 Elsevier Ltd. All rights reserved.

  14. Consistent Partial Least Squares Path Modeling via Regularization.

    Science.gov (United States)

    Jung, Sunho; Park, JaeHong

    2018-01-01

    Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
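
    The ridge idea itself is compact: instead of solving the normal equations with the (possibly near-singular) correlation matrix of the latent predictors, a small multiple of the identity is added before inversion. A minimal sketch of just this step (the correlation values and λ are made up; this is not the full regularized-PLSc algorithm):

```python
import numpy as np

def ridge_path_coefficients(R_xx, r_xy, lam=0.1):
    """Ridge-regularized path coefficients: solve (R_xx + lam*I) b = r_xy.

    R_xx: correlations among the independent latent variables;
    r_xy: their correlations with the dependent latent variable."""
    p = R_xx.shape[0]
    return np.linalg.solve(R_xx + lam * np.eye(p), r_xy)

# Nearly collinear latent predictors (correlation 0.98)
R_xx = np.array([[1.00, 0.98],
                 [0.98, 1.00]])
r_xy = np.array([0.50, 0.49])

unregularized = np.linalg.solve(R_xx, r_xy)          # ill-conditioned system
regularized = ridge_path_coefficients(R_xx, r_xy)    # shrunken, stabler estimate
```

    The penalty shrinks the coefficients and spreads the explained correlation across the collinear predictors, at the cost of a small bias, which is the usual ridge trade-off the paper exploits.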

  15. Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency

    OpenAIRE

    Tarenskeen, Sammie; Broersma, Mirjam; Geurts, Bart

    2015-01-01

    The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a bet...

  16. Overspecification of color, pattern, and size: salience, absoluteness, and consistency

    OpenAIRE

    Tarenskeen, S.L.; Broersma, M.; Geurts, B.

    2015-01-01

    The rates of overspecification of color, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Color and pattern are absolute and salient attributes, whereas size is relative and less salient. Additionally, a tendency toward consistent responses is assessed. Using a within-participants design, we find similar rates of color and pattern overspecification, which are both higher than the rate of size overspecification. Usi...

  17. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.

  18. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult-to-analyze fault

  19. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)

  20. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  1. Two consistent calculations of the Weinberg angle

    International Nuclear Information System (INIS)

    Fairlie, D.B.

    1979-01-01

    The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea, borrowed from monopole theory, that the electromagnetic field is in the direction of the Higgs field. (Author)

  2. Consistency in color parameters of a commonly used shade guide.

    Science.gov (United States)

    Tashkandi, Esam

    2010-01-01

    The use of shade guides to subjectively assess the color of natural teeth remains one of the most common means of dental shade assessment. Any variation in the color parameters of different shade guides may have significant clinical implications, particularly since communication between the clinic and the dental laboratory is based on the shade guide designation. The purpose of this study was to investigate the consistency of the L∗a∗b∗ color parameters of a sample of a commonly used shade guide. The color parameters of a total of 100 VITAPAN Classical Vacuum shade guides (VITA Zahnfabrik, Bad Säckingen, Germany) were measured using an X-Rite ColorEye 7000A Spectrophotometer (Grand Rapids, Michigan, USA). Each shade guide consists of 16 tabs with different designations. Each shade tab was measured five times and the average values were calculated. The ΔE between the average L∗a∗b∗ value for each shade tab and the average of the 100 shade tabs of the same designation was calculated. Using Student's t-test analysis, no significant differences were found among the measured sample. There is a high level of consistency in the color parameters of the measured VITAPAN Classical Vacuum shade guide sample.
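The ΔE reported in this record is the Euclidean distance between two L∗a∗b∗ triples (the CIE76 color-difference formula). A minimal sketch, with hypothetical measurement values standing in for the spectrophotometer readings:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical averages: one shade tab (mean of 5 readings) vs. the mean
# over all 100 tabs of the same designation
tab_mean = (78.2, 1.4, 15.9)
designation_mean = (77.9, 1.6, 16.4)
de = delta_e_ab(tab_mean, designation_mean)
```

A ΔE below roughly 1 is commonly treated as imperceptible to the eye, which is the practical sense in which the tabs above are "consistent".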

  3. Model of designating the critical damages

    Directory of Open Access Journals (Sweden)

    Zwolińska Bożena

    2017-06-01

    Full Text Available The article consists of two parts which form an integral whole. This article presents a method of designating critical damages in accordance with the lean maintenance approach. The author considers an exemplary (serial-parallel) production system in which, over a time Δt, damage occurred on three different objects. The article presents a mathematical model that enables determination of an indicator called the "priority digit of the device". The model takes several parameters into account: the production capacity of the devices, the existence of potential substitute devices, the position of the damage in the production stream based on the capacity of inter-operational buffers, the time needed to remove the damage, and the influence of the damage on the fulfilment of customers' orders (the CEF indicator).
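The abstract does not give the formula behind the "priority digit", so the sketch below only illustrates the general idea of a weighted multi-criteria index over the parameters the abstract lists (capacity loss, substitute devices, buffer slack, repair time, order impact). All names, weights, and the normalisation are hypothetical, not the paper's model:

```python
def priority_digit(capacity_loss, has_backup, buffer_slack, repair_time, order_impact,
                   weights=(0.3, 0.2, 0.2, 0.2, 0.1)):
    """Illustrative weighted index: all factors normalised to [0, 1];
    a higher score means the damage should be handled first."""
    factors = (capacity_loss,               # lost production capacity
               0.0 if has_backup else 1.0,  # no substitute device available
               1.0 - buffer_slack,          # little inter-operational buffer left
               repair_time,                 # long removal time
               order_impact)                # effect on customer orders (CEF)
    return sum(w * f for w, f in zip(weights, factors))

# Rank three hypothetical damaged devices: repair the highest score first
scores = {name: priority_digit(*args) for name, args in {
    "M1": (0.8, False, 0.2, 0.5, 0.9),
    "M2": (0.3, True, 0.7, 0.2, 0.1),
    "M3": (0.5, False, 0.5, 0.9, 0.4),
}.items()}
```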

  4. Model of designating the critical damages

    Directory of Open Access Journals (Sweden)

    Zwolińska Bożena

    2017-06-01

    Full Text Available Managing a company in the lean way presumes no breakdowns or reserves in the whole supply chain. However, achieving such low indicators is impossible, which is why in some production plants it is extremely important to focus on preventive actions that can limit damage. This article presents a method of designating critical damages in accordance with the lean maintenance approach. The article consists of two parts which form an integral whole. Part one characterises a real object and analyses the production capabilities of certain areas within the production structure. Part two presents a probabilistic model of the maximal time loss, based on the emptying and filling of inter-operational buffers.

  5. Consistent resolution of some relativistic quantum paradoxes

    International Nuclear Information System (INIS)

    Griffiths, Robert B.

    2002-01-01

    A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separated regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise from using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics.

  6. Self-consistent model of confinement

    International Nuclear Information System (INIS)

    Swift, A.R.

    1988-01-01

    A model of the large-spatial-distance, zero three-momentum, limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency.

  7. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior can lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustained if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the duration of the cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last until the end, or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation, covering the up-to-date state of the art analyses in this important topic. It sets out to provide the theory, solution tec...

  8. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    The physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviours (flowability) and that between solid and paste-like ones (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity measurements. (author)

  9. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Full Text Available Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
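The lysine-to-arginine pattern described above can be found by tallying substitutions between aligned mesophilic and thermophilic sequences. A toy sketch with hypothetical, gap-free aligned fragments (a real pipeline would of course work on full alignments with gaps):

```python
from collections import Counter

def substitutions(meso_seq, thermo_seq):
    """Count aligned residue substitutions from a mesophilic to a
    thermophilic sequence (gap-free toy alignment, one-letter codes)."""
    counts = Counter()
    for a, b in zip(meso_seq, thermo_seq):
        if a != b:
            counts[(a, b)] += 1
    return counts

# Hypothetical aligned fragments showing K -> R enrichment
meso = "MKKLEKAVKDLK"
thermo = "MRKLERAVRDLK"
subs = substitutions(meso, thermo)
```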

  10. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used to assess potential strengths and weaknesses of each method, as well as to evaluate the consistency of these methods.
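A minimal sketch of the purely statistical approach: a Gumbel distribution fitted to annual maxima by the method of moments, a common simple choice in ordinary extreme value statistics (the study's exact estimator is not specified in the abstract, and the discharge values below are hypothetical):

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_return_level(annual_maxima, return_period):
    """Fit a Gumbel distribution to annual maxima by the method of
    moments and return the T-year return level (flood quantile)."""
    mean = statistics.fmean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta        # location parameter
    p = 1.0 - 1.0 / return_period         # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual maximum discharges (m^3/s)
maxima = [120, 95, 150, 110, 180, 130, 160, 105, 140, 125]
q100 = gumbel_return_level(maxima, 100)  # 100-year flood estimate
```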

  11. Consistent biokinetic models for the actinide elements

    International Nuclear Information System (INIS)

    Leggett, R.W.

    2001-01-01

    The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depicts one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)

  12. Managing the consistency of distributed documents

    OpenAIRE

    Nentwich, C.

    2005-01-01

    Many businesses produce documents as part of their daily activities: software engineers produce requirements specifications, design models, source code, build scripts and more; business analysts produce glossaries, use cases, organisation charts, and domain ontology models; service providers and retailers produce catalogues, customer data, purchase orders, invoices and web pages. What these examples have in common is that the content of documents is often semantically relate...

  13. Consistency of canonical formulation of Horava gravity

    International Nuclear Information System (INIS)

    Soo, Chopin

    2011-01-01

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  14. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  15. A consistent thermodynamic database for cement minerals

    International Nuclear Information System (INIS)

    Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.

    2010-01-01

    work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - finally, the Log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested notably by drawing activity diagrams, allowing phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H are no longer stable and are replaced by zeolite, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as silica concentration depends essentially on pH. Rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field toward the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, notably with respect to a hydrogarnet/grossular solid solution. With respect to other databases, this work was made consistent with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context

  16. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew

    2017-01-01

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
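The degree-correlation step described above can be sketched as a per-degree Pearson correlation between the spherical-harmonic coefficients of two fields (e.g. evaporation and precipitation anomalies), assuming the coefficients have already been computed by a separate transform; the `(degree, order)` dict keying below is an illustrative convention, not the paper's data format:

```python
import math

def degree_correlation(coeffs_a, coeffs_b, degree):
    """Pearson correlation between the coefficients of two fields at one
    spherical-harmonic degree l, over orders m = -l..l.
    coeffs: dict mapping (l, m) -> real coefficient."""
    xs = [coeffs_a[(degree, m)] for m in range(-degree, degree + 1)]
    ys = [coeffs_b[(degree, m)] for m in range(-degree, degree + 1)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy coefficients at degree 2: field b is an affine function of field a,
# so their degree correlation is exactly 1
field_a = {(2, m): float(m) for m in range(-2, 3)}
field_b = {(2, m): 2.0 * m + 1.0 for m in range(-2, 3)}
r2 = degree_correlation(field_a, field_b, 2)
```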

  17. Self-consistent modelling of ICRH

    International Nuclear Information System (INIS)

    Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.

    2001-01-01

    The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)

  18. Non linear self consistency of microtearing modes

    International Nuclear Information System (INIS)

    Garbet, X.; Mourgues, F.; Samain, A.

    1987-01-01

    The self-consistency of a microtearing turbulence is studied in nonlinear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via the Ampère law results from the combined action of the radial electric field, in the frame where the island chains are static, and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i², where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible.

  19. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. A large number of numerical simulations show that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may largely be attributed to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who think rationally and independently, are two necessary factors for a short evacuation time.
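A well-mixed (mean-field) stand-in for the lattice CA above: replicator dynamics for the snowdrift game converge to the same cooperator fraction from very different initial conditions, illustrating the "consistent state" behavior. The payoff values b and c are hypothetical, and this is a sketch of the game-theoretic ingredient only, not the spatial evacuation model:

```python
def snowdrift_payoffs(b=1.0, c=0.6):
    """Snowdrift payoff matrix; keys are (my strategy, opponent strategy)."""
    return {("C", "C"): b - c / 2, ("C", "D"): b - c,
            ("D", "C"): b, ("D", "D"): 0.0}

def evolve_cooperator_fraction(x0, steps=20000, dt=0.01, b=1.0, c=0.6):
    """Euler-integrated replicator dynamics for the well-mixed snowdrift
    game; the interior fixed point is x* = (b - c) / (b - c/2)."""
    p = snowdrift_payoffs(b, c)
    x = x0
    for _ in range(steps):
        fc = x * p[("C", "C")] + (1 - x) * p[("C", "D")]  # cooperator payoff
        fd = x * p[("D", "C")] + (1 - x) * p[("D", "D")]  # defector payoff
        x += dt * x * (1 - x) * (fc - fd)
        x = min(max(x, 0.0), 1.0)
    return x
```

Starting from cooperator fractions of 0.1 and 0.9, both runs settle at x* = (b − c)/(b − c/2) ≈ 0.571 for the defaults above.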

  20. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  1. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
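One of the simplest thermodynamic constraints of the kind discussed above is the Wegscheider cycle condition: around any closed reaction cycle, the product of forward-to-reverse rate-constant ratios must equal 1 at equilibrium. A minimal consistency check (this is a generic sketch with hypothetical rate constants, not the TCMC optimization itself):

```python
from math import isclose, prod

def wegscheider_consistent(cycle, rel_tol=1e-9):
    """Check the Wegscheider condition on a closed reaction cycle:
    cycle is a list of (kf, kr) pairs, one per reaction step; the
    product of kf/kr around the cycle must equal 1."""
    return isclose(prod(kf / kr for kf, kr in cycle), 1.0, rel_tol=rel_tol)

# Hypothetical 3-step cycle: (2/1) * (3/4) * (2/3) = 1, so it is feasible
feasible = wegscheider_consistent([(2.0, 1.0), (3.0, 4.0), (2.0, 3.0)])
```

A calibration routine in the TCMC spirit would impose such conditions as constraints rather than merely checking them afterwards.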

  2. The design of a measuring system for soft X ray absolute intensity

    International Nuclear Information System (INIS)

    Cui Congwu; Cui Mingqi

    1997-01-01

    The design of a measuring system for soft X-ray absolute intensity is presented in detail. The system consists of two parts: the ionization chamber, and the silicon photodiode with its transfer system. After calibration, the system can be used as the primary standard detector for the measurement of absolute soft X-ray radiation flux in the energy range of 50 to 2000 eV. The whole system will be installed on the newly built beamline 3W1B at the Beijing Synchrotron Radiation Facility

  3. Multi-Parameter Wireless Monitoring and Telecommand of a Rocket Payload: Design and Implementation

    Science.gov (United States)

    Pamungkas, Arga C.; Putra, Alma A.; Puspitaningayu, Pradini; Fransisca, Yulia; Widodo, Arif

    2018-04-01

    A rocket system generally consists of two parts: the rocket motor and the payload. The payload is built from several sensors, such as an accelerometer, gyroscope and magnetometer, and also a surveillance camera. These sensors monitor the rocket on three axes to determine its attitude. Additionally, the payload must be able to capture images at a distance via telecommand. This article describes the design and implementation of a rocket payload with attitude monitoring and telecommand capability from the ground control station, using the long-range wireless module Digi XBee Pro 900 HP.
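The article does not specify the payload's wire format, so the sketch below only illustrates a typical pattern for sending attitude data over a serial radio link such as the XBee: a fixed header, packed little-endian float32 fields, and a one-byte XOR checksum. Header bytes and field layout are hypothetical:

```python
import struct

HEADER = b"\xAA\x55"  # hypothetical frame start marker

def xor_checksum(data):
    """XOR of all bytes; a cheap integrity check for short frames."""
    checksum = 0
    for byte in data:
        checksum ^= byte
    return checksum

def pack_telemetry(ax, ay, az, gx, gy, gz, heading):
    """Pack an attitude frame: header + seven little-endian float32
    values (accel xyz, gyro xyz, magnetometer heading) + checksum."""
    body = struct.pack("<7f", ax, ay, az, gx, gy, gz, heading)
    return HEADER + body + bytes([xor_checksum(body)])

def unpack_telemetry(frame):
    """Validate and decode a frame produced by pack_telemetry."""
    if frame[:2] != HEADER:
        raise ValueError("bad header")
    body, checksum = frame[2:-1], frame[-1]
    if xor_checksum(body) != checksum:
        raise ValueError("corrupted frame")
    return struct.unpack("<7f", body)
```

On the ground station side, the same `unpack_telemetry` would run on bytes read from the XBee's serial port.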

  4. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Full Text Available Computer services are normally assumed to work well all the time. This usually holds for crucial services like electronic banking, but not necessarily for services in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that help predict the consistency of their behavior and the quality of the harvesting, which is made harder by transient conditions, the large number of services, and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant portion of the OAI services have ceased working, while many others occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others also always, or sometimes, fail, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.
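The dead/failing/intermittent distinctions described above can be sketched as a simple classifier over a per-service harvest log. The class names and the log format below are illustrative, not the paper's taxonomy:

```python
def classify_service(attempts):
    """Classify a harvested service from its log: a list of
    (success: bool, error: str or None) tuples, oldest first.
    - 'reliable': every attempt succeeded
    - 'dead': every attempt failed with the same error
    - 'failing': every attempt failed, but with varying errors
    - 'intermittent': a mix of successes and failures"""
    if all(ok for ok, _ in attempts):
        return "reliable"
    if not any(ok for ok, _ in attempts):
        errors = {err for _, err in attempts}
        return "dead" if len(errors) == 1 else "failing"
    return "intermittent"
```

A "failing" service is worth retrying later; a "dead" one, which fails identically every time, is not.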

  5. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  6. [Consistent Declarative Memory with Depressive Symptomatology].

    Science.gov (United States)

    Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez

    2012-12-01

    Some studies have suggested that the potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course, and maintenance of depression. The aim was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students, male and female, between 18 and 40 years old, were evaluated and distributed into two groups: with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) with a cutoff point of 20. There were no significant differences in free and voluntary recall between participants with and without depressive symptomatology, even though both groups granted a higher emotional value to the audio-visual test and associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  7. Self consistent field theory of virus assembly

    Science.gov (United States)

    Li, Siyu; Orland, Henri; Zandi, Roya

    2018-04-01

    The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions when GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work. First, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present our results corresponding to the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in case of GSDA but is equal to the inverse of the energy gap when using SCFT.

  8. Consistency based correlations for tailings consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering

    2010-07-01

    The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. There are many unique challenges pertaining to the management of the containment facilities for several decades beyond mine closure that are a result of the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases causing public concern over the conventional practice of tailings disposal. Therefore, in order to reduce and minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency that captured physicochemical interactions. The paper discussed the large strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility whereas liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.

  9. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI, and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  10. Self-consistent nuclear energy systems

    International Nuclear Information System (INIS)

    Shimizu, A.; Fujiie, Y.

    1995-01-01

    A concept of self-consistent energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts: ''self-controllability'' and ''self-terminability'' are introduced to eliminate the criticality-related safety issues in fast reactors. (author)

  11. Toward a consistent model for glass dissolution

    International Nuclear Information System (INIS)

    Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.

    1994-01-01

    Understanding the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least 5 countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs

  12. Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency

    Directory of Open Access Journals (Sweden)

    Sammie eTarenskeen

    2015-11-01

    The rates of overspecification of colour, pattern, and size are compared to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a between-participants design, however, we find similar rates of pattern and size overspecification, which are both lower than the rate of colour overspecification. This indicates that although many speakers are more likely to include colour than pattern (probably because colour is more salient), they may also treat pattern like colour due to a tendency towards consistency. We find no increase in size overspecification when the salience of size is increased, suggesting that speakers are more likely to include absolute than relative attributes. However, we do find an increase in size overspecification when mentioning the attributes is triggered, which again shows that speakers tend to refer in a consistent manner, and that there are circumstances in which even size overspecification is frequently produced.

  13. The Ostomy: Part One of Two Parts.

    Science.gov (United States)

    Watt, Rosemary C.; And Others

    1985-01-01

    Teaches nurses to identify four common indications for fecal diversion surgery; list three types of colostomies; distinguish a colostomy from an ileostomy; describe the two basic methods of colostomy management; and identify factors that influence the choice of method of colostomy care. (CT)

  14. View from Europe: stability, consistency or pragmatism

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1988-01-01

    The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion

  15. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists; 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients’ referrals were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. There was no statistically significant difference in the relation between referral diagnosis and electrophysiological diagnosis according to the clinic where the request was made (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG.

  16. Self-consistent meson mass spectrum

    International Nuclear Information System (INIS)

    Balazs, L.A.P.

    1982-01-01

    A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula α(t) = max(S₁+S₂, S₃+S₄) − 1/2 + 2α̂′[s_a + (1/2)(t − Σ m_i²)] for any given hadronic process 1+2 → 3+4, where S_i and m_i are the spins and masses of i = 1, 2, 3, 4, and √s_a is the effective mass of the lowest nonvanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector qq̄ states with q = u, d, s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u, d, c and q = u, d, b. Our only arbitrary parameters are m_ρ, m_K*, m_ψ, and m_Υ, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small m_π²/m_ρ² ratio arises quite naturally in the present scheme

  17. On Consistency of Operational Transformation Approach

    Directory of Open Access Journals (Sweden)

    Aurel Randolph

    2013-02-01

    The Operational Transformation (OT) approach, used in many collaborative editors, allows a group of users to concurrently update replicas of a shared object and exchange their updates in any order. The basic idea of this approach is to transform any received update operation before its execution on a replica of the object. This transformation aims to ensure the convergence of the different replicas of the object, even though the operations are executed in different orders. However, designing transformation functions for achieving convergence is a critical and challenging issue. Indeed, the transformation functions proposed in the literature have all been revealed incorrect. In this paper, we investigate the existence of transformation functions for a shared string altered by insert and delete operations. From the theoretical point of view, two properties – named TP1 and TP2 – are necessary and sufficient to ensure convergence. Using controller synthesis technique, we show that there are some transformation functions which satisfy only TP1 for the basic signatures of insert and delete operations. As a matter of fact, it is impossible to meet both properties TP1 and TP2 with these simple signatures.
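    The convergence property TP1 described in this abstract can be illustrated with a minimal sketch (the function names and tie-breaking convention below are illustrative assumptions, not the paper's definitions): two concurrent character inserts on a shared string converge when each is transformed against the other before execution.

    ```python
    # Minimal sketch of operational transformation for concurrent inserts
    # on a shared string. Names and the tie-breaking rule are illustrative
    # assumptions, not taken from the paper under review.

    def apply_insert(s, pos, ch):
        """Execute insert(pos, ch) on string s."""
        return s[:pos] + ch + s[pos:]

    def transform_insert(pos, other_pos):
        """Shift an insert position to account for a concurrent insert.
        Ties (equal positions) would additionally need a site-id tie-break."""
        return pos + 1 if other_pos < pos else pos

    # TP1 (convergence) check: op1 = insert 'X' at 1, op2 = insert 'Y' at 2.
    doc = "abc"
    # Path A: execute op1, then op2 transformed against op1.
    path_a = apply_insert(apply_insert(doc, 1, "X"), transform_insert(2, 1), "Y")
    # Path B: execute op2, then op1 transformed against op2.
    path_b = apply_insert(apply_insert(doc, 2, "Y"), transform_insert(1, 2), "X")
    assert path_a == path_b == "aXbYc"
    ```

    TP2, which constrains transformations against already-transformed operations, is considerably harder to satisfy; the abstract's conclusion is that no transformation functions with these simple signatures meet both properties.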

  18. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed, however the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipe work connecting the major items of equipment for the new hire, the engineering student and the veteran

  19. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from

  20. Criteria for the generation of spectra consistent time histories

    International Nuclear Information System (INIS)

    Lin, C.-W.

    1977-01-01

    Several methods are available to conduct seismic analysis for nuclear power plant systems and components. Among them, the response spectrum technique has been most widely adopted for linear modal analysis. However, for designs which contain structural or material nonlinearities such as frequency-dependent soil properties, the existence of gaps, single tie rods, and friction between supports, where the response has to be computed as a function of time, the time history approach is the only viable method of analysis. Two examples of time history analysis are: 1) soil-structure interaction study and, 2) a coupled reactor coolant system and building analysis to either generate the floor response spectra or compute nonlinear system time history response. The generation of a suitable time history input for the analysis has been discussed in the literature. Some general guidelines are available to ensure that the time history input will be as conservative as the design response spectra. Very little has been reported as to the effect of the dynamic characteristics of the time history input upon the system response. In fact, the only available discussion in this respect concerns only the statistically independent nature of the time history components. In this paper, numerical results for cases using the time history approach are presented. Criteria are also established which may be advantageously used to arrive at spectra-consistent time histories which are conservative and, more importantly, realistic. (Auth.)

  1. Consistent Partial Least Squares Path Modeling via Regularization

    Directory of Open Access Journals (Sweden)

    Sunho Jung

    2018-02-01

    Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
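    The ridge idea behind this abstract can be sketched on a toy two-predictor system (the numbers and the helper function below are illustrative assumptions, not the paper's simulation design): when latent predictors are nearly collinear, solving (R + λI)b = r instead of Rb = r shrinks otherwise unstable path coefficients.

    ```python
    # Illustrative sketch of ridge regularization applied to path coefficients
    # estimated from a correlation matrix; all numbers are made up for the demo.

    def ridge_paths_2(r12, r1y, r2y, lam):
        """Solve (R + lam*I) b = r for two predictors with correlation r12,
        using the closed-form 2x2 inverse."""
        a = 1.0 + lam                      # diagonal of R + lam*I
        det = a * a - r12 * r12
        b1 = (a * r1y - r12 * r2y) / det
        b2 = (a * r2y - r12 * r1y) / det
        return b1, b2

    # Near-collinear latent predictors (r12 = 0.98):
    b1_ols, b2_ols = ridge_paths_2(0.98, 0.60, 0.50, lam=0.0)   # unregularized
    b1_rdg, b2_rdg = ridge_paths_2(0.98, 0.60, 0.50, lam=0.1)   # ridge
    # Unregularized estimates blow up in magnitude; ridge shrinks them.
    assert abs(b1_ols) > 2 and abs(b2_ols) > 2
    assert abs(b1_rdg) < 1 and abs(b2_rdg) < 1
    ```

    The shrinkage parameter λ trades a small bias for a large reduction in variance, which is the power/accuracy trade-off the simulation study evaluates.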

  2. Context-specific metabolic networks are consistent with experiments.

    Directory of Open Access Journals (Sweden)

    Scott A Becker

    2008-05-01

    Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data are available.
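    The core GIMME idea described in this abstract can be sketched on a toy network (the network, capacities, and scan loop below are illustrative assumptions standing in for the linear program GIMME actually solves): maintain a required fraction of the metabolic objective while minimizing flux through reactions whose expression falls below a threshold.

    ```python
    # Toy sketch of the GIMME idea; numbers and network are invented for the
    # demo, and a grid scan stands in for the LP solver used in practice.

    # Toy network: metabolite A is produced by R1 (high expression, capacity 5)
    # or R3 (low expression, capacity 10) and consumed by the objective flux R2.
    CAP_R1, CAP_R3 = 5.0, 10.0
    max_objective = CAP_R1 + CAP_R3          # v2_max = 15 at steady state
    required = 0.9 * max_objective           # keep >= 90% of the max objective

    best = None
    # Scan feasible flux splits, penalizing flux through low-expression R3.
    for i in range(501):
        v1 = CAP_R1 * i / 500                # flux through high-expression R1
        v3 = required - v1                   # steady state: v1 + v3 = v2
        if 0.0 <= v3 <= CAP_R3:
            penalty = v3                     # below-threshold flux is penalized
            if best is None or penalty < best[0]:
                best = (penalty, v1, v3)

    penalty, v1, v3 = best
    # The low-expression reaction carries only the flux strictly needed to
    # sustain 90% of the objective: v1 = 5.0 (maxed out), v3 = 8.5.
    assert abs(v1 - 5.0) < 1e-9 and abs(v3 - 8.5) < 1e-9
    ```

    The minimized penalty plays the role of GIMME's inconsistency score: the more low-expression flux the objective forces, the less consistent the expression data are with that objective.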

  3. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modeling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora. PMID:28270790

  4. DOE handbook: Design considerations

    International Nuclear Information System (INIS)

    1999-04-01

    The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline

  5. DOE handbook: Design considerations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-04-01

    The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline.

  6. Planck 2013 results. XXXI. Consistency of the Planck data

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Arnaud, M.; Ashdown, M.

    2014-01-01

The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved... in the HFI channels would result in shifts in the posterior distributions of parameters of less than 0.3σ except for As, the amplitude of the primordial curvature perturbations at 0.05 Mpc-1, which changes by about 1σ. We extend these comparisons to include the sky maps from the complete nine-year mission...

  7. Sustaining biological welfare for our future through consistent science

    Directory of Open Access Journals (Sweden)

    Shimomura Yoshihiro

    2013-01-01

    Physiological anthropology presently covers a very broad range of human knowledge and engineering technologies. This study reviews scientific inconsistencies within a variety of areas: sitting posture; negative air ions; oxygen inhalation; alpha brain waves induced by music and ultrasound; 1/f fluctuations; the evaluation of feelings using surface electroencephalography; Kansei; universal design; and anti-stress issues. We found that the inconsistencies within these areas indicate the importance of integrative thinking and the need to maintain the perspective on the biological benefit to humanity. Analytical science divides human physiological functions into discrete details, although individuals comprise a unified collection of whole-body functions. Such disparate considerations contribute to the misunderstanding of physiological functions and the misevaluation of positive and negative values for humankind. Research related to human health will, in future, depend on the concept of maintaining physiological functions based on consistent science and on sustaining human health to maintain biological welfare in future generations.

  8. Planck 2013 results. XXXI. Consistency of the Planck data

    CERN Document Server

    Ade, P A R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A.J; Barreiro, R.B; Battaner, E; Benabed, K; Benoit-Levy, A; Bernard, J.P; Bersanelli, M; Bielewicz, P; Bond, J.R; Borrill, J; Bouchet, F.R; Burigana, C; Cardoso, J.F; Catalano, A; Challinor, A; Chamballu, A; Chiang, H.C; Christensen, P.R; Clements, D.L; Colombi, S; Colombo, L.P.L; Couchot, F; Coulais, A; Crill, B.P; Curto, A; Cuttaia, F; Danese, L; Davies, R.D; Davis, R.J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Desert, F.X; Dickinson, C; Diego, J.M; Dole, H; Donzelli, S; Dore, O; Douspis, M; Dupac, X; Ensslin, T.A; Eriksen, H.K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Gonzalez-Nuevo, J; Gorski, K.M.; Gratton, S.; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F.K; Hanson, D; Harrison, D; Henrot-Versille, S; Herranz, D; Hildebrandt, S.R; Hivon, E; Hobson, M; Holmes, W.A.; Hornstrup, A; Hovest, W.; Huffenberger, K.M; Jaffe, T.R; Jaffe, A.H; Jones, W.C; Keihanen, E; Keskitalo, R; Knoche, J; Kunz, M; Kurki-Suonio, H; Lagache, G; Lahteenmaki, A; Lamarre, J.M; Lasenby, A; Lawrence, C.R; Leonardi, R; Leon-Tavares, J; Lesgourgues, J; Liguori, M; Lilje, P.B; Linden-Vornle, M; Lopez-Caniego, M; Lubin, P.M; Macias-Perez, J.F; Maino, D; Mandolesi, N; Maris, M; Martin, P.G; Martinez-Gonzalez, E; Masi, S; Matarrese, S; Mazzotta, P; Meinhold, P.R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschenes, M.A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Norgaard-Nielsen, H.U; Noviello, F; Novikov, D; Novikov, I; Oxborrow, C.A; Pagano, L; Pajot, F; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Pearson, T.J; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Pratt, G.W; Prunet, S; Puget, J.L; Rachen, J.P; Reinecke, M; Remazeilles, M; 
Renault, C; Ricciardi, S.; Ristorcelli, I; Rocha, G.; Roudier, G; Rubino-Martin, J.A; Rusholme, B; Sandri, M; Scott, D; Stolyarov, V; Sudiwala, R; Sutton, D; Suur-Uski, A.S; Sygnet, J.F; Tauber, J.A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L.A; Wandelt, B.D; Wehus, I K; White, S D M; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse Galactic emission and for strong unresolved sources. Difference maps covering angular scales from 8°...

  9. Preliminary design of a coffee harvester

    Directory of Open Access Journals (Sweden)

    Raphael Magalhães Gomes Moreira

    2016-10-01

Full Text Available Design of an agricultural machine is a highly complex process due to interactions between the operator, machine, and environment. Mountain coffee plantations constitute an economic sector that requires huge investments in the development of agricultural machinery to improve the harvesting and post-harvesting processes and to overcome the scarcity of labor in the fields. The aim of this study was to develop a preliminary design for a virtual prototype of a coffee fruit harvester. In this study, a project methodology was applied and adapted for the development of the following steps: project planning, informational design, conceptual design, and preliminary design. The construction of a morphological matrix made it possible to obtain a list of different mechanisms with specific functions. Combining these mechanisms produced design variants, which were scored against each selected criterion. From each designated proposal, the two variants with the best scores were selected, which permitted the preparation of the preliminary design of both variants. The archetype was divided into two parts, namely the hydraulically articulated arms and the harvesting system, which consisted of the vibration mechanism and the detachment mechanism. The proposed innovation involves the use of parallel rods fixed to a plane, rectangular metal sheet. In this step, dimensions including a maximum length of 4.7 m, a minimum length of 3.3 m, and a total height of 2.15 m were determined based on the functioning of the harvester in relation to the coupling point of the tractor.

  10. Design of an economically efficient feed-in tariff structure for renewable energy development

    International Nuclear Information System (INIS)

    Lesser, Jonathan A.; Su Xuejuan

    2008-01-01

Evidence suggests, albeit tentatively, that feed-in tariffs (FITs) are more effective than alternative support schemes in promoting renewable energy technologies (RETs). FITs provide long-term financial stability for investors in RETs, which, at the prevailing market price of electricity, are not currently cost-efficient enough to compete with traditional fossil fuel technologies. On the other hand, if not properly designed, FITs can be economically inefficient, as is widely regarded to have been the case under the Public Utility Regulatory Policies Act of 1978 (PURPA). Under PURPA, too high a guaranteed price led to the creation of so-called 'PURPA machines': poorly performing generating units that could survive financially only because of heavy subsidies that came at the expense of retail customers. Similarly, because of their adverse impacts on retail electricity rates, German FITs have been subject to increasing political pressure from utilities and customers. In this paper, we propose an innovative two-part FIT, consisting of both a capacity payment and a market-based energy payment, which can be used to meet the renewables policy goals of regulators. Our two-part tariff design draws on the strengths of traditional FITs, relies on market mechanisms, is easy to implement, and avoids the problems caused by distorting wholesale energy markets through above-market energy payments. The approach is modeled on forward capacity market designs that have been recently implemented by several regional transmission organizations in the USA to address needs for new generating capacity to ensure system reliability.
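The two-part structure described above can be sketched as simple arithmetic: a fixed capacity payment plus an energy payment settled at the market price. The function name and all parameter values below are illustrative assumptions, not figures from the paper.

```python
# Hypothetical sketch of a two-part feed-in tariff payment (illustrative
# parameters, not the paper's values): the generator receives a fixed
# capacity payment plus an energy payment settled at the market price.

def two_part_fit_payment(capacity_mw, capacity_rate, energy_mwh, market_price):
    """Total payment = capacity part + market-based energy part."""
    capacity_payment = capacity_mw * capacity_rate   # e.g. $/MW-month
    energy_payment = energy_mwh * market_price       # $/MWh at market price
    return capacity_payment + energy_payment

# Example: a 10 MW plant at $5,000/MW-month delivering 2,500 MWh
# at an average market price of $40/MWh.
total = two_part_fit_payment(10, 5000.0, 2500, 40.0)
print(total)  # 50,000 + 100,000 = 150000.0
```

Because the energy part floats with the market, the design avoids the above-market energy payments the paper criticizes; only the capacity part is fixed.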

  11. Multiplicative Consistency for Interval Valued Reciprocal Preference Relations

    OpenAIRE

    Wu, Jian; Chiclana, Francisco

    2014-01-01

    The multiplicative consistency (MC) property of interval additive reciprocal preference relations (IARPRs) is explored, and then the consistency index is quantified by the multiplicative consistency estimated IARPR. The MC property is used to measure the level of consistency of the information provided by the experts and also to propose the consistency index induced ordered weighted averaging (CI-IOWA) operator. The novelty of this operator is that it aggregates individual IARPRs in such ...

  12. Quench simulation of SMES consisting of some superconducting coils

    International Nuclear Information System (INIS)

    Noguchi, S.; Oga, Y.; Igarashi, H.

    2011-01-01

When an SMES system consists of many element coils, a quench in one element coil may trigger a chain of quenches. To avoid such a chain, the energy stored in the quenched element coil has to be discharged quickly; the cause of the chain is the short time constant of the decaying current in the quenched coil. In recent years, many HTS superconducting magnetic energy storage (HTS-SMES) systems have been investigated and designed. They usually consist of several superconducting element coils because of the excessively high energy to be stored. If one of them quenches, the energy stored in that element coil has to be dispersed immediately to protect the HTS-SMES system. Otherwise, the current in the other element coils, which have not quenched, increases, since the magnetic coupling between the quenched element coil and the others is excessively strong. This current increase may quench the other element coils, and if the energy dispersion of the quenched element coil fails, the remaining superconducting element coils quench in series. It is therefore necessary to investigate the behavior of the HTS-SMES after one or more element coils quench and, to protect against a chain of quenches, to investigate the time constants of the coils. We have developed a simulation code to investigate the behavior of the HTS-SMES. The quench simulation indicates that a chain of quenches is caused by a quench of one element coil.
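The mechanism described, current rising in a healthy coil because a coupled coil quenches, can be seen in a minimal two-coil circuit model. This is a toy sketch, not the authors' simulation code; all inductance and resistance values are assumed.

```python
import numpy as np

# Toy model (not the authors' code): two magnetically coupled coils.
# Coil 1 quenches and develops resistance R1; coil 2 stays superconducting
# (R2 = 0). Flux conservation through the mutual inductance M forces the
# current in coil 2 UP as the current in coil 1 decays:
#   [L1 M; M L2] @ di/dt = [-R1*i1, -R2*i2]
L1, L2, M = 1.0, 1.0, 0.8      # self and mutual inductances [H], assumed
R1, R2 = 0.5, 0.0              # coil 1 has quenched [ohm]
i = np.array([100.0, 100.0])   # initial currents [A]
A = np.array([[L1, M], [M, L2]])

dt = 1e-3
for _ in range(1000):          # integrate 1 s with forward Euler
    didt = np.linalg.solve(A, [-R1 * i[0], -R2 * i[1]])
    i = i + dt * didt

i1, i2 = i
print(i1 < 100.0, i2 > 100.0)  # quenched coil decays, coupled coil rises
```

With R2 = 0, coil 2 conserves its flux (L2*i2 + M*i1 stays constant), so every ampere lost in coil 1 pushes i2 up by M/L2 amperes, which is exactly how a strong coupling can drive the healthy coil toward its own quench.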

  13. Longitudinal tDCS: Consistency across Working Memory Training Studies

    Directory of Open Access Journals (Sweden)

    Marian E. Berryhill

    2017-04-01

Full Text Available There is great interest in enhancing and maintaining cognitive function. In recent years, advances in noninvasive brain stimulation devices, such as transcranial direct current stimulation (tDCS), have targeted working memory in particular. Despite controversy surrounding the outcomes of single-session studies, a growing field of working memory training studies incorporates multiple sessions of tDCS. It is useful to take stock of these findings because of the diversity of paradigms employed and outcomes observed across research groups. This will be important in assessing cognitive training programs paired with stimulation techniques and identifying the more useful and less effective approaches. Here, we treat the tDCS + working memory training field as a case example, but also survey training benefits of other neuromodulatory techniques (e.g., tRNS, tACS). There are challenges associated with the broad parameter space, including individual differences, stimulation intensity, duration, montage, session number, session spacing, training task selection, timing of follow-up testing, and near and far transfer tasks. In summary, although the field of assisted cognitive training is young, some design choices are more favorable than others. By way of heuristic, the current evidence supports including more training/tDCS sessions (5+), applying anodal tDCS targeting prefrontal regions, and including follow-up testing on trained and transfer tasks after a period of no contact. What remains unclear, but important for future translational value, is continuing work to pinpoint optimal values for the tDCS parameters on a per cognitive task basis. Importantly, the emerging literature shows notable consistency in the application of tDCS for WM across various participant populations compared to single-session experimental designs.

  14. Improving risk assessment by defining consistent and reliable system scenarios

    Directory of Open Access Journals (Sweden)

    B. Mazzorana

    2009-02-01

Full Text Available During the entire procedure of risk assessment for hydrologic hazards, the selection of consistent and reliable scenarios, constructed in a strictly systematic way, is fundamental for the quality and reproducibility of the results. However, subjective assumptions on relevant impact variables such as sediment transport intensity on the system loading side and weak point response mechanisms repeatedly cause biases in the results, and consequently affect transparency and required quality standards. Furthermore, the system response of mitigation measures to extreme event loadings represents another key variable in hazard assessment, as well as the integral risk management including intervention planning. Formative Scenario Analysis, as a supplement to conventional risk assessment methods, is a technique to construct well-defined sets of assumptions to gain insight into a specific case and the potential system behaviour. By two case studies, carried out (1) to analyse sediment transport dynamics in a torrent section equipped with control measures, and (2) to identify hazards induced by woody debris transport at hydraulic weak points, the applicability of the Formative Scenario Analysis technique is presented. It is argued that during scenario planning in general and with respect to integral risk management in particular, Formative Scenario Analysis allows for the development of reliable and reproducible scenarios in order to design more specifically an application framework for the sustainable assessment of natural hazards impact. The overall aim is to optimise the hazard mapping and zoning procedure by methodologically integrating quantitative and qualitative knowledge.

  15. Self-Consistent Study of Conjugated Aromatic Molecular Transistors

    International Nuclear Information System (INIS)

    Jing, Wang; Yun-Ye, Liang; Hao, Chen; Peng, Wang; Note, R.; Mizuseki, H.; Kawazoe, Y.

    2010-01-01

We study the current through conjugated aromatic molecular transistors modulated by a transverse field. The self-consistent calculation is realized with density functional theory through the standard quantum chemistry software Gaussian03 and the non-equilibrium Green's function formalism. The calculated I – V curves controlled by the transverse field present the characteristics of different organic molecular transistors, the transverse field effect of which is improved by the substitution of nitrogen atoms or fluorine atoms. On the other hand, asymmetry of the molecular configuration with respect to the axis connecting the two sulfur atoms is favorable for realizing the transverse field modulation. Suitably designed conjugated aromatic molecular transistors possess different I – V characteristics; some of them are similar to those of metal-oxide-semiconductor field-effect transistors (MOSFETs). Some of the calculated molecular devices may work as elements in graphene electronics. Our results present the richness and flexibility of molecular transistors, which describe the colorful prospect of next generation devices. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  16. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang

    2017-03-06

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
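The forwarding anomalies mentioned above (loops and black-holes) arise from the order in which switches apply new rules. The sketch below illustrates one classic ordering idea, installing new next-hops downstream-first, on a toy path change; it is NOT the ez-Segway protocol itself, and the node names are hypothetical.

```python
# Toy illustration of consistent path updates (a classic downstream-first
# ordering, NOT ez-Segway's actual mechanism): move a flow from an old path
# to a new path, checking after every single rule change that traffic is
# never black-holed or looped.

def forwards(next_hop, src, dst):
    """Follow next-hops from src; return True iff dst is reached loop-free."""
    node, seen = src, set()
    while node != dst:
        if node in seen or node not in next_hop:
            return False           # loop or black-hole
        seen.add(node)
        node = next_hop[node]
    return True

old_path = ["A", "B", "C", "D"]
new_path = ["A", "C", "B", "D"]
next_hop = {u: v for u, v in zip(old_path, old_path[1:])}

# Install new rules starting at the switch closest to the destination.
for u, v in reversed(list(zip(new_path, new_path[1:]))):
    next_hop[u] = v
    assert forwards(next_hop, "A", "D")  # consistent at every intermediate step
print(next_hop)
```

Updating upstream-first instead can transiently loop (here, pointing C at B while B still points at C); schemes like ez-Segway decentralize the discovery of a safe order rather than computing it centrally.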

  17. Privacy, Time Consistent Optimal Labour Income Taxation and Education Policy

    OpenAIRE

    Konrad, Kai A.

    1999-01-01

    Incomplete information is a commitment device for time consistency problems. In the context of time consistent labour income taxation privacy reduces welfare losses and increases the effectiveness of public education as a second best policy.

  18. Generalized contexts and consistent histories in quantum mechanics

    International Nuclear Information System (INIS)

    Losada, Marcelo; Laura, Roberto

    2014-01-01

We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times.

  19. Personality and Situation Predictors of Consistent Eating Patterns

    OpenAIRE

Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.

    2015-01-01

    Introduction A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studi...

  20. Two Impossibility Results on the Converse Consistency Principle in Bargaining

    OpenAIRE

    Youngsub Chun

    1999-01-01

We present two impossibility results on the converse consistency principle in the context of bargaining. First, we show that there is no solution satisfying Pareto optimality, contraction independence, and converse consistency. Next, we show that there is no solution satisfying Pareto optimality, strong individual rationality, individual monotonicity, and converse consistency.

  1. Personality consistency analysis in cloned quarantine dog candidates

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2017-01-01

Full Text Available In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups using the modified Puppy Aptitude Test, which consists of ten subtests, to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and that some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned dogs. This study implies that personality consistency could be one of the ways to analyse the traits of puppies.

  2. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...

  3. 26 CFR 1.338-8 - Asset and stock consistency.

    Science.gov (United States)

    2010-04-01

    ... that are controlled foreign corporations. (6) Stock consistency. This section limits the application of... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Asset and stock consistency. 1.338-8 Section 1... (CONTINUED) INCOME TAXES Effects on Corporation § 1.338-8 Asset and stock consistency. (a) Introduction—(1...

  4. Accurate, consistent, and fast droplet splitting and dispensing in electrowetting on dielectric digital microfluidics

    Science.gov (United States)

    Nikapitiya, N. Y. Jagath B.; Nahar, Mun Mun; Moon, Hyejin

    2017-12-01

This letter reports two novel electrode design considerations that satisfy two very important aspects of EWOD operation: (1) highly consistent volume of generated droplets and (2) highly improved accuracy of the generated droplet volume. Based on the design principles investigated, two novel designs were proposed: an L-junction electrode design to offer high-throughput droplet generation, and a Y-junction electrode design to split a droplet very fast while maintaining equal volume in each part. Devices with the novel designs were fabricated and tested, and the results are compared with those of the conventional approach. It is demonstrated that the inaccuracy and inconsistency of droplet volume dispensed in the device with the novel electrode designs are as low as 0.17 and 0.10%, respectively, while those of the conventional approach are 25 and 0.76%, respectively. The dispensing frequency is enhanced from 4 to 9 Hz by using the novel design.
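The two figures of merit quoted above can be computed from repeated dispense measurements. The definitions below are assumptions for illustration (percent deviation of the mean from the target for "inaccuracy", coefficient of variation for "inconsistency"); the paper may define them differently, and the sample volumes are hypothetical.

```python
import statistics

# Assumed metric definitions (illustrative, not necessarily the paper's):
# "inaccuracy"    = |mean volume - target| / target, in percent
# "inconsistency" = stdev / mean (coefficient of variation), in percent

def inaccuracy_pct(volumes_nl, target_nl):
    return abs(statistics.mean(volumes_nl) - target_nl) / target_nl * 100.0

def inconsistency_pct(volumes_nl):
    return statistics.stdev(volumes_nl) / statistics.mean(volumes_nl) * 100.0

# Hypothetical repeated dispenses targeting 100 nL:
vols = [100.2, 99.9, 100.1, 100.0, 99.8]
print(round(inaccuracy_pct(vols, 100.0), 3))   # mean is on target -> 0.0
print(round(inconsistency_pct(vols), 3))
```

Under these definitions, the paper's 0.17% / 0.10% figures would mean the dispensed mean sits within 0.2% of the target and repeats vary by about one part in a thousand.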

  5. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-11-01

    Full Text Available Abstract Background Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. 
Conclusions Our approach provides an attractive statistical methodology for

  6. ECO DESIGN IN DESIGN PROCESS

    Directory of Open Access Journals (Sweden)

    PRALEA Jeni

    2014-05-01

Full Text Available Eco-design is a new domain, required by new trends and existing concerns worldwide and generated by the necessity of adopting new design principles. These new principles require the designer to establish a friendly relationship between the created concept, the environment, and consumption. This relationship should hold both in the present and in the future, generating new opportunities for the product, its components, or the materials from which it is made. Awareness by the designer of the importance of this new trend permits the creation of concepts that protect present values and secure the legacy of future generations. Eco-design, through its principles, is involved in the design process from the earliest stage, that of product design. The priority objective of designers is to reduce negative effects on the environment throughout the entire life cycle of the product and after it is taken out of use. The main aspects of eco-design are extending product life, making better use of materials, and reducing waste emissions. The design process in the 'eco' domain must start by selecting the function of the concept, the materials, and the technological processes, determining the macro- and micro-geometry of the product through an analysis that optimizes and streamlines it. This paper presents the design process of a cross-sports footwear concept built on the principles of eco-design.

  7. A design method for two-layer beams consisting of normal and fibered high strength concrete

    International Nuclear Information System (INIS)

    Iskhakov, I.; Ribakov, Y.

    2007-01-01

Two-layer fibered concrete beams can be analyzed using conventional methods for composite elements. The compressed zone of such a beam section is made of high strength concrete (HSC), and the tensile zone of normal strength concrete (NSC). The problems related to this type of beam are revealed and studied, and an appropriate depth for each layer is prescribed. Compatibility conditions between the HSC and NSC layers are found, based on the equality of shear deformations at the layer border in the section with the maximal depth of the compression zone. For the first time, a rigorous definition of HSC is given using a comparative analysis of the deformability and strength characteristics of different concrete classes. According to this definition, HSC has no descending branch in the stress-strain diagram, the stress-strain function has a minimum exponent, the ductility parameter is minimal, and the concrete tensile strength remains constant with an increase in concrete compression strength. The application fields of two-layer concrete beams based on different static schemes and load conditions are identified. The main disadvantage of HSCs is known to be their low ductility; to overcome this problem, fibers are added to the HSC layer. The influence of different fiber volume ratios on structural ductility is discussed, and an upper limit on the required fiber volume ratio is found based on the compatibility equation between the transverse tensile deformations of the concrete and the deformations of the fibers.

  8. Multicriteria steepest ascent in a design space consisting of both mixture and process variables

    NARCIS (Netherlands)

    Duineveld, CAA; Coenegracht, PMJ

    1995-01-01

    Steepest ascent is shown to be a feasible method for problems where two or more responses are to be optimized. With the aid of Pareto optimality the (one response) standard method is adapted for the use of more responses. A special kind of steepest ascent problem involves the presence of both

  9. GPS Space Service Volume: Ensuring Consistent Utility Across GPS Design Builds for Space Users

    Science.gov (United States)

    Bauer, Frank H.; Parker, Joel Jefferson Konkl; Valdez, Jennifer Ellen

    2015-01-01

GPS availability and signal strength were originally specified for users on or near the surface of the Earth, with transmitted power levels specified at the edge-of-Earth angle of 14.3 degrees. Prior to the SSV specification, the on-orbit performance of GPS varied from block build to block build (IIA, IIR-M, IIF) due to antenna gain and beam width variances. Unstable on-orbit performance results in significant risk to space users. Side-lobe signals, although not specified, were expected to significantly boost GPS signal availability for users above the constellation. During GPS III Phase A, NASA noted significant discrepancies between the power levels specified in GPS III specification documents and measured on-orbit performance. To stabilize the signal for high-altitude space users, a NASA/DoD team led the creation of the new Space Service Volume (SSV) definition and specifications in 2003-2005.

  10. SCALCE: boosting sequence compression algorithms using locally consistent encoding.

    Science.gov (United States)

    Hach, Faraz; Numanagic, Ibrahim; Alkan, Can; Sahinalp, S Cenk

    2012-12-01

The high throughput sequencing (HTS) platforms generate unprecedented amounts of data that introduce challenges for the computational infrastructure. Data management, storage and analysis have become major logistical obstacles for those adopting the new platforms. The requirement for large investment for this purpose almost signalled the end of the Sequence Read Archive hosted at the National Center for Biotechnology Information (NCBI), which holds most of the sequence data generated world wide. Currently, most HTS data are compressed through general purpose algorithms such as gzip. These algorithms are not designed for compressing data generated by the HTS platforms; for example, they do not take advantage of the specific nature of genomic sequence data, that is, limited alphabet size and high similarity among reads. Fast and efficient compression algorithms designed specifically for HTS data should be able to address some of the issues in data management, storage and communication. Such algorithms would also help with analysis provided they offer additional capabilities such as random access to any read and indexing for efficient sequence similarity search. Here we present SCALCE, a 'boosting' scheme based on the Locally Consistent Parsing technique, which reorganizes the reads in a way that results in a higher compression speed and compression rate, independent of the compression algorithm in use and without using a reference genome. Our tests indicate that SCALCE can improve the compression rate achieved through gzip by a factor of 4.19 when the goal is to compress the reads alone. In fact, on SCALCE reordered reads, gzip running time can improve by a factor of 15.06 on a standard PC with a single core and 6 GB memory. Interestingly even the running time of SCALCE + gzip improves that of gzip alone by a factor of 2.09. 
When compared with the recently published BEETL, which aims to sort the (inverted) reads in lexicographic order for improving bzip2, SCALCE + gzip
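The "boosting" idea, reorder reads so a generic compressor can exploit their similarity, can be demonstrated with plain sorting (this is NOT SCALCE's Locally Consistent Parsing; it is a toy stand-in on synthetic reads). When duplicated reads are shuffled across more data than gzip's 32 KB window, the compressor misses them; grouping them adjacently recovers the redundancy.

```python
import random
import zlib

# Toy demonstration of reorder-then-compress (plain sorting, NOT SCALCE's
# Locally Consistent Parsing): placing identical/similar reads next to each
# other lets a generic LZ compressor such as gzip/zlib exploit them, because
# its match window (32 KB) is far smaller than the whole data set (~200 KB).
random.seed(42)
distinct = ["".join(random.choice("ACGT") for _ in range(100)) for _ in range(400)]
reads = distinct * 5            # every read occurs 5 times
random.shuffle(reads)           # duplicates land far apart, outside the window

shuffled_size = len(zlib.compress("".join(reads).encode(), 6))
sorted_size = len(zlib.compress("".join(sorted(reads)).encode(), 6))
print(sorted_size < shuffled_size)  # reordering alone improves the rate
```

SCALCE's contribution is choosing an ordering (via shared "core" substrings) that groups merely similar, not just identical, reads, without needing a reference genome.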

  11. Design of Annular Linear Induction Pump for High Temperature Liquid Lead Transportation

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Jae Sik; Kim, Hee Reyoung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2014-05-15

An EM (electromagnetic) pump is divided into two parts: the primary, consisting of the electromagnetic core and exciting coils, and the secondary, consisting of the liquid lead flow. The main geometrical variables of the pump include core length, inner diameter, and flow gap, while the electromagnetic ones cover pole pitch, turns of coil, number of pole pairs, input current, and input frequency. The characteristics of the design variables are analyzed by the electrical equivalent circuit method, taking into account the hydraulic head loss in the narrow annular channel of the ALIP. A design program, written in MATLAB, was developed to derive the pump design variables from these analyses according to the input requirements of flow rate, developed pressure, and operating temperature. The analysis of the ALIP design for high temperature liquid lead transportation was carried out to produce this MATLAB-based ALIP design program. Using the program, the geometrical relationships between components need not be handled manually during the detailed design process, because the code calculates them automatically, and the outputs of a candidate design can be predicted easily before manufacturing. Running the code also allows the changes in outputs caused by changes in pump factors to be observed and analyzed, which will be helpful for research on optimizing pump outputs.
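Two of the design variables listed above, pole pitch and input frequency, fix the speed of the travelling magnetic field through the standard linear-induction-machine relation v_s = 2·τ·f; the pump develops thrust only when the fluid slips behind that field. The numeric values below are illustrative assumptions, not the paper's design values.

```python
# Standard linear induction machine relations used when sizing an ALIP
# (illustrative numbers, not the paper's design values):
#   v_s = 2 * tau * f      travelling-field (synchronous) speed
#   s   = (v_s - v) / v_s  slip of the fluid behind the field

def synchronous_speed(pole_pitch_m: float, freq_hz: float) -> float:
    return 2.0 * pole_pitch_m * freq_hz          # m/s

def slip(v_sync: float, v_fluid: float) -> float:
    return (v_sync - v_fluid) / v_sync           # dimensionless, 0..1

v_s = synchronous_speed(0.05, 60.0)   # assumed 50 mm pole pitch at 60 Hz
print(v_s, slip(v_s, 4.5))            # 6.0 m/s field speed, slip 0.25
```

In an equivalent-circuit analysis like the one described, the slip determines the secondary (fluid) branch impedance and hence the developed pressure for a given input current.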

  12. Consistent Regulation of Infrastructure Businesses: Some Economic Issues

    OpenAIRE

    Flavio M. Menezes

    2008-01-01

    This paper examines some important economic aspects associated with the notion that consistency in the regulation of infrastructure businesses is a desirable feature. It makes two important points. First, it is not easy to measure consistency. In particular, one cannot simply point to different regulatory parameters as evidence of inconsistent regulatory policy. Second, even if one does observe consistency emerging from decisions made by different regulators, it does not necessarily mean that...

  13. Consistency considerations in the use of point kinetics for BWR application

    International Nuclear Information System (INIS)

    Holzer, J.M.; Habert, R.; Pilat, E.E.

    1981-01-01

The basic question of producing point reactivity parameters for use in RETRAN analyses is addressed. The technique used in establishing a methodology consists of a stepwise reduction of resolution, in space and time, so as to identify possible areas in which error may be induced and to establish procedures that retain consistency and accuracy. The calculational flow plan resulting from this analysis will ultimately be used at Yankee Atomic Electric for design application.
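The point reactivity parameters in question feed the standard point-kinetics equations. The sketch below integrates them with a single delayed-neutron group and illustrative parameter values (these are generic textbook kinetics equations and assumed constants, not Yankee Atomic's RETRAN model or data).

```python
# One-delayed-group point kinetics (illustrative parameters, not the
# plant model from the paper):
#   dn/dt = ((rho - beta) / Lambda) * n + lam * C
#   dC/dt = (beta / Lambda) * n - lam * C
beta, Lambda, lam = 0.0065, 1.0e-4, 0.08   # assumed kinetics parameters
n, C = 1.0, beta / (Lambda * lam)          # start at equilibrium (dn/dt = 0)
rho = 0.001                                # small positive reactivity step

dt = 1.0e-5
for _ in range(10000):                     # 0.1 s of forward Euler
    dn = ((rho - beta) / Lambda) * n + lam * C
    dC = (beta / Lambda) * n - lam * C
    n, C = n + dt * dn, C + dt * dC
print(n > 1.0)   # power rises after the positive reactivity insertion
```

With rho well below beta, the power makes a prompt jump to roughly (beta - 0)/(beta - rho) of its initial value and then rises slowly on the delayed-neutron time scale, which is the behavior the lumped reactivity parameters must reproduce.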

  14. How consistent are beliefs about the causes and solutions to illness? An experimental study.

    OpenAIRE

    Ogden, J; Jubb, A

    2008-01-01

    Objectives: Research illustrates that people hold beliefs about the causes and solutions to illness. This study aimed to assess the consistency in these beliefs in terms of their variation according to type of problem and whether they are consistent with each other. Further, the study aimed to assess whether they are open to change and whether changing beliefs about cause resulted in a subsequent shift in beliefs about solutions. Design: Experimental factorial 3 (problem) × 2 (manipulated cau...

  15. Seismic structural response analysis using consistent mass matrices having dynamic coupling

    International Nuclear Information System (INIS)

    Shaw, D.E.

    1977-01-01

    The basis for the theoretical development of this paper is the linear matrix equations of motion for an unconstrained structure subject to support excitation. The equations are formulated in terms of absolute displacement, velocity and acceleration vectors. By means of a transformation of the absolute response vectors into displacements, velocities and accelerations relative to the support motions, the homogeneous equations become non-homogeneous and the non-homogeneous boundary conditions become homogeneous, with relative displacements, velocities and accelerations being zero at support points. The forcing function or inertial loading vector is shown to consist of two parts. The first part is comprised of the mass matrix times the support acceleration function times a vector of structural displacements resulting from a unit vector of support displacements in the direction of excitation. This inertial loading corresponds to the classical seismic loading vector and is indeed the only loading vector for lumped-mass systems. The second part of the inertial loading vector consists of the mass matrix times the support acceleration function times a vector of structural accelerations resulting from unit support accelerations in the direction of excitation. This term is not present in classical seismic analysis formulations and results from the presence of off-diagonal terms in the mass matrices, which give rise to dynamic coupling through the mass matrix. Thus, for lumped-mass models, the classical formulation of the inertial loading vector is correct. However, if dynamic coupling terms are included through off-diagonal terms in the mass matrix, an additional inertia loading vector must be considered.
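
    The two-part inertial loading described above can be written compactly (a sketch in assumed notation: u_r are displacements relative to the supports, ι is the vector of structural displacements due to a unit static support displacement, and a is the vector of structural accelerations due to unit support accelerations):

```latex
M\,\ddot{u}_r + C\,\dot{u}_r + K\,u_r
  \;=\; \underbrace{-\,M\,\iota\,\ddot{u}_g(t)}_{\text{classical seismic loading}}
  \;\underbrace{-\,M\,a\,\ddot{u}_g(t)}_{\text{mass-coupling term}}
```

    For a lumped (diagonal) mass matrix the coupling vector a vanishes and only the classical loading remains, consistent with the abstract's conclusion.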

  16. Personality consistency in dogs: a meta-analysis.

    Science.gov (United States)

    Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.

  17. Personality consistency in dogs: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jamie L Fratkin

    Full Text Available Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.

  18. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787

  19. Rock Slope Design Criteria

    Science.gov (United States)

    2010-06-01

    Based on the stratigraphy and the type of slope stability problems, the flat-lying, Paleozoic-age sedimentary rocks of Ohio were divided into three design units: 1) competent rock design unit consisting of sandstones, limestones, and siltstones ...

  20. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily is inconsistent. It is also important that disconnected locations can operate in a meaningful way in so-called disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has...

  1. Student Consistency and Implications for Feedback in Online Assessment Systems

    Science.gov (United States)

    Madhyastha, Tara M.; Tanimoto, Steven

    2009-01-01

    Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…

  2. 26 CFR 301.6224(c)-3 - Consistent settlements.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Consistent settlements. 301.6224(c)-3 Section... settlements. (a) In general. If the Internal Revenue Service enters into a settlement agreement with any..., settlement terms consistent with those contained in the settlement agreement entered into. (b) Requirements...

  3. Self-consistent calculation of atomic structure for mixture

    International Nuclear Information System (INIS)

    Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping

    2000-01-01

    Based on the relativistic Hartree-Fock-Slater self-consistent average atom model, the atomic structure of a mixture is studied by summing the component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed
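
    The self-consistent solution described above is, at its core, a fixed-point iteration: solve for new quantities from the current ones, mix, and repeat until nothing changes. A minimal, generic sketch (the `update` callable and all names here are hypothetical stand-ins for one solve of the coupled equations, e.g. recomputing the potential from the current density):

```python
import math

def scf(update, x0, mix=0.5, tol=1e-10, max_iter=500):
    """Fixed-point iteration with linear mixing: x <- (1-mix)*x + mix*update(x).

    `update` maps the current iterate to a new one; iteration stops when
    successive iterates agree to within `tol` (self-consistency).
    """
    x = x0
    for _ in range(max_iter):
        x_new = (1.0 - mix) * x + mix * update(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("self-consistency not reached")

# Toy illustration: the fixed point of cos(x) (the Dottie number, ~0.739085).
print(round(scf(math.cos, x0=1.0), 6))
```

    In real average-atom codes the iterate is a vector (density or potential on a radial grid) and the mixing parameter is tuned for stability, but the loop structure is the same.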

  4. A Preliminary Study toward Consistent Soil Moisture from AMSR2

    NARCIS (Netherlands)

    Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.

    2015-01-01

    A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has providedEarth scientists with a consistent and continuous global

  5. Consistency and Inconsistency in PhD Thesis Examination

    Science.gov (United States)

    Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy

    2008-01-01

    This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…

  6. Delimiting Coefficient α from Internal Consistency and Unidimensionality

    Science.gov (United States)

    Sijtsma, Klaas

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…

  7. Risk aversion vs. the Omega ratio : Consistency results

    NARCIS (Netherlands)

    Balder, Sven; Schweizer, Nikolaus

    This paper clarifies when the Omega ratio and related performance measures are consistent with second order stochastic dominance and when they are not. To avoid consistency problems, the threshold parameter in the ratio should be chosen as the expected return of some benchmark – as is commonly done.

  8. Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.

    Science.gov (United States)

    Edwards, H. P.; And Others

    1982-01-01

    Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)

  9. Policy consistency and the achievement of Nigeria's foreign policy ...

    African Journals Online (AJOL)

    This study is an attempt to investigate the policy consistency of Nigeria's foreign policy and to understand the basis for this consistency; and also to see whether peacekeeping/peace-enforcement is a key instrument in the achievement of Nigeria's foreign policy goals. The objective of the study was to examine whether the ...

  10. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which

  11. Consistency of a system of equations: What does that mean?

    NARCIS (Netherlands)

    Still, Georg J.; Kern, Walter; Koelewijn, Jaap; Bomhoff, M.J.

    2010-01-01

    The concept of (structural) consistency also called structural solvability is an important basic tool for analyzing the structure of systems of equations. Our aim is to provide a sound and practically relevant meaning to this concept. The implications of consistency are expressed in terms of

  12. Quasi-Particle Self-Consistent GW for Molecules.

    Science.gov (United States)

    Kaplan, F; Harding, M E; Seiler, C; Weigend, F; Evers, F; van Setten, M J

    2016-06-14

    We present the formalism and implementation of quasi-particle self-consistent GW (qsGW) and eigenvalue only quasi-particle self-consistent GW (evGW) adapted to standard quantum chemistry packages. Our implementation is benchmarked against high-level quantum chemistry computations (coupled-cluster theory) and experimental results using a representative set of molecules. Furthermore, we compare the qsGW approach for five molecules relevant for organic photovoltaics to self-consistent GW results (scGW) and analyze the effects of the self-consistency on the ground state density by comparing calculated dipole moments to their experimental values. We show that qsGW makes a significant improvement over conventional G0W0 and that partially self-consistent flavors (in particular evGW) can be excellent alternatives.

  13. Consistency of hand preference: predictions to intelligence and school achievement.

    Science.gov (United States)

    Kee, D W; Gottfried, A; Bathurst, K

    1991-05-01

    Gottfried and Bathurst (1983) reported that hand preference consistency measured over time during infancy and early childhood predicts intellectual precocity for females, but not for males. In the present study longitudinal assessments of children previously classified by Gottfried and Bathurst as consistent or nonconsistent in cross-time hand preference were conducted during middle childhood (ages 5 to 9). Findings show that (a) early measurement of hand preference consistency for females predicts school-age intellectual precocity, (b) the locus of the difference between consistent vs. nonconsistent females is in verbal intelligence, and (c) the precocity of the consistent females was also revealed on tests of school achievement, particularly tests of reading and mathematics.

  14. Putting humans in ecology: consistency in science and management.

    Science.gov (United States)

    Hobbs, Larry; Fowler, Charles W

    2008-03-01

    Normal and abnormal levels of human participation in ecosystems can be revealed through the use of macro-ecological patterns. Such patterns also provide consistent and objective guidance that will lead to achieving and maintaining ecosystem health and sustainability. This paper focuses on the consistency of this type of guidance and management. Such management, in sharp contrast to current management practices, ensures that our actions as individuals, institutions, political groups, societies, and as a species are applied consistently across all temporal, spatial, and organizational scales. This approach supplants management of today, where inconsistency results from debate, politics, and legal and religious polarity. Consistency is achieved when human endeavors are guided by natural patterns. Pattern-based management meets long-standing demands for enlightened management that requires humans to participate in complex systems in consistent and sustainable ways.

  15. Personality and Situation Predictors of Consistent Eating Patterns.

    Science.gov (United States)

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K

    2015-01-01

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  16. Personality and Situation Predictors of Consistent Eating Patterns.

    Directory of Open Access Journals (Sweden)

    Uku Vainik

    Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  17. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Science.gov (United States)

    Bernabe-Ortiz, Antonio; Carcamo, Cesar P; Scott, John D; Hughes, James P; Garcia, Patricia J; Holmes, King K

    2011-01-01

    Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using a multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to coastal region); and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highlands or jungle cities is associated with higher anti-HBc prevalences, whereas increasing age at sexual debut was associated with lower prevalences. Consistent condom use was associated with decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention programs (vaccination), especially in the jungle
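
    As a numerical illustration of the reported inverse association (a hedged sketch, not taken from the paper's data tables): the adjusted odds ratio of 0.90 is per additional year of age at sexual debut, and under the fitted logistic model per-unit odds ratios compound multiplicatively across years.

```python
# Adjusted OR for anti-HBc positivity: 0.90 per additional year of age at
# sexual debut (95%CI 0.85-0.97, from the abstract). Under a logistic model
# a debut k years later corresponds to an odds ratio of 0.90**k.
or_per_year = 0.90

for years in (1, 3, 5):
    print(years, round(or_per_year ** years, 3))
```

    For example, a five-year-later debut corresponds to roughly a 41% reduction in the odds of anti-HBc positivity, assuming the per-year effect is constant (linearity on the log-odds scale).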

  18. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Directory of Open Access Journals (Sweden)

    Antonio Bernabe-Ortiz

    Full Text Available Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using a multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to coastal region); and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highlands or jungle cities is associated with higher anti-HBc prevalences, whereas increasing age at sexual debut was associated with lower prevalences. Consistent condom use was associated with decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention programs (vaccination), especially in the

  19. Safety, tolerability, pharmacokinetics, and activity of the novel long-acting antimalarial DSM265: a two-part first-in-human phase 1a/1b randomised study.

    Science.gov (United States)

    McCarthy, James S; Lotharius, Julie; Rückle, Thomas; Chalon, Stephan; Phillips, Margaret A; Elliott, Suzanne; Sekuloski, Silvana; Griffin, Paul; Ng, Caroline L; Fidock, David A; Marquart, Louise; Williams, Noelle S; Gobeau, Nathalie; Bebrevska, Lidiya; Rosario, Maria; Marsh, Kennan; Möhrle, Jörg J

    2017-06-01

    DSM265 is a novel antimalarial that inhibits plasmodial dihydroorotate dehydrogenase, an enzyme essential for pyrimidine biosynthesis. We investigated the safety, tolerability, and pharmacokinetics of DSM265, and tested its antimalarial activity. Healthy participants aged 18-55 years were enrolled in a two-part study: part 1, a single ascending dose (25-1200 mg), double-blind, randomised, placebo-controlled study, and part 2, an open-label, randomised, active-comparator controlled study, in which participants were inoculated with Plasmodium falciparum induced blood-stage malaria (IBSM) and treated with DSM265 (150 mg) or mefloquine (10 mg/kg). Primary endpoints were DSM265 safety, tolerability, and pharmacokinetics. Randomisation lists were created using a validated, automated system. Both parts were registered with the Australian New Zealand Clinical Trials Registry, number ACTRN12613000522718 (part 1) and number ACTRN12613000527763 (part 2). In part 1, 73 participants were enrolled between April 12, 2013, and July 14, 2015 (DSM265, n=55; placebo, n=18). In part 2, nine participants were enrolled between Sept 30 and Nov 25, 2013 (150 mg DSM265, n=7; 10 mg/kg mefloquine, n=2). In part 1, 117 adverse events were reported; no drug-related serious or severe events were reported. The most common drug-related adverse event was headache. The mean DSM265 peak plasma concentration (Cmax) ranged between 1310 ng/mL and 34 800 ng/mL and was reached in a median time (tmax) between 1·5 h and 4 h, with a mean elimination half-life between 86 h and 118 h. In part 2, the log10 parasite reduction ratio at 48 h in the DSM265 (150 mg) group was 1·55 (95% CI 1·42-1·67) and in the mefloquine (10 mg/kg) group was 2·34 (2·17-2·52), corresponding to a parasite clearance half-life of 9·4 h (8·7-10·2) and 6·2 h (5·7-6·7), respectively. The median minimum inhibitory concentration of DSM265 in blood was estimated as 1040 ng/mL (range 552-1500), resulting in a predicted
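
    The parasite clearance half-lives quoted above follow from the log10 parasite reduction ratio (PRR) over 48 h by a standard first-order-decay conversion; a minimal sketch (the slight discrepancy with the reported 9·4 h reflects rounding of the PRR to 1·55):

```python
import math

def clearance_half_life(log10_prr, interval_h=48.0):
    """Parasite clearance half-life (hours) from the log10 parasite reduction
    ratio over `interval_h`, assuming first-order (exponential) parasite decay."""
    return interval_h * math.log10(2) / log10_prr

print(round(clearance_half_life(1.55), 1))  # DSM265 150 mg: ~9.3 h
print(round(clearance_half_life(2.34), 1))  # mefloquine 10 mg/kg: ~6.2 h
```

    Equivalently, one half-life is the time for the parasitaemia to fall by log10(2) ≈ 0.301 decades, so a 48 h window spanning 1·55 decades contains about 5.1 half-lives.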

  20. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Science.gov (United States)

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  1. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Directory of Open Access Journals (Sweden)

    Alexander J Kirkham

    Full Text Available This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  2. Quasiparticle self-consistent GW method: a short summary

    International Nuclear Information System (INIS)

    Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios

    2007-01-01

    We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation. The method is formulated based on the idea of a self-consistent perturbation; the non-interacting Green function G0, which is the starting point for GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by GWA. After self-consistency is attained, we have G0, W (the screened Coulomb interaction) and G self-consistently. This G0 can be interpreted as the optimum non-interacting propagator for the quasiparticles. We will summarize some theoretical discussions to justify QSGW. Then we will survey results which have been obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic, and can be explained from the neglect of excitonic effects
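The self-consistency loop at the heart of QSGW, iterating the update until the input propagator reproduces itself, has the shape of a generic fixed-point iteration. The sketch below shows only that loop structure, with a scalar stand-in for the update (a toy illustration: the update function here is an assumption for demonstration, not the QSGW equations):

```python
def self_consistent_fixpoint(f, x0, tol=1e-10, max_iter=1000):
    """Generic self-consistency loop: iterate x <- f(x) until x stops changing."""
    x = x0
    for _ in range(max_iter):
        x_new = f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("self-consistency not reached")

# Toy update whose fixed point solves x = 1 / (2 + x), i.e. x = sqrt(2) - 1.
x_star = self_consistent_fixpoint(lambda x: 1.0 / (2.0 + x), x0=0.0)
```

The same skeleton applies whenever the output of one pass (here a number, in QSGW a propagator) is fed back in as the starting point of the next.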

  3. Design and fabrication of sun tracker

    International Nuclear Information System (INIS)

    Novinrooz, A. J.; Ghasemi, M. R.; Mohati, M.; Sadri, H.

    2003-01-01

    A sun tracker system consisting of two parts (opto-electronic and hydraulic) has been designed and fabricated for use in a solar thermal power plant. In this paper the various parts of the system, including optical sensors, electronic circuits, computational control and the mechanical lever, are explained and the operational mechanism of each is discussed. The parabolic mirror used in this plant has 400 cm length, 570 cm width and 170 cm focal length. Rays falling parallel to the axis of the mirror are reflected and collected at the focal point, while non-parallel rays are diverted. To determine the rate of divergence, a three-dimensional equation of the radiation path is written. Using a computational program in C language, the error is calculated from 0 to 0.5 deg, for modifying the operational error of the optical system. The optical sensors detect the beam deviation from the mirror's principal axis with a precision of 0.1 degree and transfer the necessary corrections to the active mechanical system of the hydraulic type. A three-phase electric motor of 0.7 kW power and one thousand revolutions per minute controls the mirror movement
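In the paraxial approximation, a pointing error displaces the reflected spot in the focal plane by roughly the focal length times the tangent of the error. A back-of-the-envelope sketch (not the paper's three-dimensional ray equation) of what the quoted 0.1-degree sensor precision means for the 170 cm focal length above:

```python
import math

def focal_spot_displacement(focal_length_m, pointing_error_deg):
    """Paraxial estimate: lateral spot shift = f * tan(pointing error)."""
    return focal_length_m * math.tan(math.radians(pointing_error_deg))

# A 0.1 degree pointing error with the 1.7 m focal length quoted above
# shifts the focal spot by about 3 mm:
shift_mm = 1000.0 * focal_spot_displacement(1.7, 0.1)
```

A millimetre-scale spot shift is small relative to a receiver tube, which is consistent with 0.1 degree being an adequate sensor resolution.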

  4. Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.

    Science.gov (United States)

    Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H

    2016-01-01

    To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included external supports of parental closeness and relationship with caring nonparental adult and internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.

  5. Introduction of circuit design on RFID system

    International Nuclear Information System (INIS)

    Pak, Sunho

    2007-06-01

    This book presents research from the Fujitsu company together with the design of basic electronic circuits. It is composed of two parts. The first part introduces RFID system design, covering basic knowledge of ubiquitous computing, a glossary of high-frequency terms, design of impedance matching circuits, the RFID system, types and design of filters, modulators and transmission, and RFID system design. The second part covers research and development at Fujitsu, including Fujitsu's RFID middleware RFID CONNECT, Fujitsu's sensor network, and handling techniques for the RFID system.

  6. Introduction of circuit design on RFID system

    Energy Technology Data Exchange (ETDEWEB)

    Pak, Sunho

    2007-06-15

    This book presents research from the Fujitsu company together with the design of basic electronic circuits. It is composed of two parts. The first part introduces RFID system design, covering basic knowledge of ubiquitous computing, a glossary of high-frequency terms, design of impedance matching circuits, the RFID system, types and design of filters, modulators and transmission, and RFID system design. The second part covers research and development at Fujitsu, including Fujitsu's RFID middleware RFID CONNECT, Fujitsu's sensor network, and handling techniques for the RFID system.

  7. The Consistent Preferences Approach to Deductive Reasoning in Games

    CERN Document Server

    Asheim, Geir B

    2006-01-01

    "The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif

  8. Multiphase flows of N immiscible incompressible fluids: A reduction-consistent and thermodynamically-consistent formulation and associated algorithm

    Science.gov (United States)

    Dong, S.

    2018-05-01

    We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.

  9. Organization Design

    OpenAIRE

    Milton Harris; Artur Raviv

    2002-01-01

    This paper attempts to explain organization structure based on optimal coordination of interactions among activities. The main idea is that each manager is capable of detecting and coordinating interactions only within his limited area of expertise. Only the CEO can coordinate company wide interactions. The optimal design of the organization trades off the costs and benefits of various configurations of managers. Our results consist of classifying the characteristics of activities and manager...

  10. On the consistent histories approach to quantum mechanics

    International Nuclear Information System (INIS)

    Dowker, F.; Kent, A.

    1996-01-01

    We review the consistent histories formulations of quantum mechanics developed by Griffiths, Omnes, Gell-Mann, and Hartle, and we describe the classifications of consistent sets. We illustrate some general features of consistent sets by a few lemmas and examples. We also consider various interpretations of the formalism, and we examine the new problems which arise in reconstructing the past and predicting the future. It is shown that Omnes characterization of true statements---statements that can be deduced unconditionally in his interpretation---is incorrect. We examine critically Gell-Mann and Hartle's interpretation of the formalism, and in particular, their discussions of communication, prediction, and retrodiction, and we conclude that their explanation of the apparent persistence of quasiclassicality relies on assumptions about an as-yet-unknown theory of experience. Our overall conclusion is that the consistent histories approach illustrates the need to supplement quantum mechanics by some selection principle in order to produce a fundamental theory capable of unconditional predictions

  11. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
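The estimator in question can be sketched directly: fit a broken-trend model by least squares at every candidate break date and keep the date with the smallest residual sum of squares. On a noise-free series with two same-signed trend shifts (the data-generating parameters below are invented for illustration), the single estimated break typically lands between the two true ones rather than on either, illustrating the inconsistency discussed above:

```python
import numpy as np

def fit_one_break(y):
    """Least-squares single trend-break estimator:
    y_t = a + b*t + c*max(t - T, 0), with T searched over interior dates."""
    n = len(y)
    t = np.arange(n, dtype=float)
    best_T, best_sse = None, np.inf
    for T in range(2, n - 2):
        X = np.column_stack([np.ones(n), t, np.maximum(t - T, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sse = float(resid @ resid)
        if sse < best_sse:
            best_T, best_sse = T, sse
    return best_T

# Noise-free series with two positive trend shifts at t = 30 and t = 60:
t = np.arange(100, dtype=float)
y = 1.0 + 0.1 * t + 0.5 * np.maximum(t - 30, 0) + 0.5 * np.maximum(t - 60, 0)
T_hat = fit_one_break(y)  # a single fitted break, between the true ones
```

Because the one-break model cannot represent both kinks, the best single kink compromises between them, which is exactly the behaviour the abstract describes for similar, same-signed breaks.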

  12. Liking for Evaluators: Consistency and Self-Esteem Theories

    Science.gov (United States)

    Regan, Judith Weiner

    1976-01-01

    Consistency and self-esteem theories make contrasting predictions about the relationship between a person's self-evaluation and his liking for an evaluator. Laboratory experiments confirmed predictions about these theories. (Editor/RK)

  13. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera; Kruger, Jens; Moller, Torsten; Hadwiger, Markus

    2014-01-01

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined

  14. Structures, profile consistency, and transport scaling in electrostatic convection

    DEFF Research Database (Denmark)

    Bian, N.H.; Garcia, O.E.

    2005-01-01

    Two mechanisms at the origin of profile consistency in models of electrostatic turbulence in magnetized plasmas are considered. One involves turbulent diffusion in collisionless plasmas and the subsequent turbulent equipartition of Lagrangian invariants. By the very nature of its definition...

  15. 15 CFR 930.36 - Consistency determinations for proposed activities.

    Science.gov (United States)

    2010-01-01

    ... necessity of issuing separate consistency determinations for each incremental action controlled by the major... plans), and that affect any coastal use or resource of more than one State. Many States share common...

  16. The utility of theory of planned behavior in predicting consistent ...

    African Journals Online (AJOL)

    admin

    disease. Objective: To examine the utility of theory of planned behavior in predicting consistent condom use intention of HIV .... (24-25), making subjective norms as better predictors of intention ..... Organizational Behavior and Human Decision.

  17. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for consistency analysis of regional energy consumption data. The work is based on recent studies accomplished by several cited authors and addresses Brazilian energy matrices and Brazilian regional energy balances. The results are compared and analyzed

  18. Island of Stability for Consistent Deformations of Einstein's Gravity

    DEFF Research Database (Denmark)

    Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan

    2012-01-01

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...

  19. DC Brushless Motor Control Design and Preliminary Testing for Independent 4-Wheel Drive Rev-11 Robotic Platform

    Directory of Open Access Journals (Sweden)

    Roni Permana Saputra

    2012-03-01

    Full Text Available This paper discusses the design of a control system for a brushless DC motor using the microcontroller ATMega 16, to be applied to the independent 4-wheel-drive Mobile Robot LIPI version 2 (REV-11). The control system consists of two parts: a brushless DC motor control module and a supervisory control module that coordinates the desired commands to the motor control module. To control the REV-11 platform, the supervisory control transmits reference data for speed and direction to control the speed and direction of each actuator on the REV-11 platform. From the test results it is concluded that the designed control system works properly to coordinate and control the speed and direction of motion of the actuator motors of the REV-11 platform.

  20. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  1. Consistency of the least weighted squares under heteroscedasticity

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2011-01-01

    Vol. 2011, No. 47 (2011), pp. 179-206. ISSN 0023-5954 Grant - others:GA UK(CZ) GA402/09/055 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Consistency * The least weighted squares * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf

  2. Cosmological consistency tests of gravity theory and cosmic acceleration

    Science.gov (United States)

    Ishak-Boushaki, Mustapha B.

    2017-01-01

    Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large scale structure, and the physical consistency of these probes with one another.

  3. Self-consistency corrections in effective-interaction calculations

    International Nuclear Information System (INIS)

    Starkand, Y.; Kirson, M.W.

    1975-01-01

    Large-matrix extended-shell-model calculations are used to compute self-consistency corrections to the effective interaction and to the linked-cluster effective interaction. The corrections are found to be numerically significant and to affect the rate of convergence of the corresponding perturbation series. The influence of various partial corrections is tested. It is concluded that self-consistency is an important effect in determining the effective interaction and improving the rate of convergence. (author)

  4. Parquet equations for numerical self-consistent-field theory

    International Nuclear Information System (INIS)

    Bickers, N.E.

    1991-01-01

    In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs

  5. Consistent Estimation of Pricing Kernels from Noisy Price Data

    OpenAIRE

    Vladislav Kargin

    2003-01-01

    If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.

  6. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishing and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching...... the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial condition, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers.......ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  7. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.

  8. Measuring consistency of autobiographical memory recall in depression.

    Science.gov (United States)

    Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. High-performance speech recognition using consistency modeling

    Science.gov (United States)

    Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth

    1994-12-01

    The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems, when there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.

  10. Are prescription drug insurance choices consistent with expected utility theory?

    Science.gov (United States)

    Bundorf, M Kate; Mata, Rui; Schoenbaum, Michael; Bhattacharya, Jay

    2013-09-01

    To determine the extent to which people make choices inconsistent with expected utility theory when choosing among prescription drug insurance plans and whether tabular or graphical presentation format influences the consistency of their choices. Members of an Internet-enabled panel chose between two Medicare prescription drug plans. The "low variance" plan required higher out-of-pocket payments for the drugs respondents usually took but lower out-of-pocket payments for the drugs they might need if they developed a new health condition than the "high variance" plan. The probability of a change in health varied within subjects and the presentation format (text vs. graphical) and the affective salience of the clinical condition (abstract vs. risk related to specific clinical condition) varied between subjects. Respondents were classified based on whether they consistently chose either the low or high variance plan. Logistic regression models were estimated to examine the relationship between decision outcomes and task characteristics. The majority of respondents consistently chose either the low or high variance plan, consistent with expected utility theory. Half of respondents consistently chose the low variance plan. Respondents were less likely to make discrepant choices when information was presented in graphical format. Many people, although not all, make choices consistent with expected utility theory when they have information on differences among plans in the variance of out-of-pocket spending. Medicare beneficiaries would benefit from information on the extent to which prescription drug plans provide risk protection. PsycINFO Database Record (c) 2013 APA, all rights reserved.
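The trade-off the respondents faced is easy to write down: a risk-averse expected-utility maximiser compares plans by the expected utility of wealth net of out-of-pocket costs, not by the usual payment alone. A minimal sketch (the dollar amounts, wealth level, probability, and CRRA utility form are illustrative assumptions, not the study's parameters):

```python
def expected_utility(p_new, oop_usual, oop_new, wealth=10_000.0, gamma=2.0):
    """Expected CRRA utility of wealth net of out-of-pocket (OOP) drug costs."""
    def u(w):
        return w ** (1.0 - gamma) / (1.0 - gamma)  # CRRA utility, gamma != 1
    return (1.0 - p_new) * u(wealth - oop_usual) + p_new * u(wealth - oop_new)

p = 0.2  # assumed probability of developing a new health condition
# "Low variance" plan: higher OOP for usual drugs, lower OOP for new ones.
eu_low = expected_utility(p, oop_usual=1_200.0, oop_new=1_500.0)
# "High variance" plan: lower usual OOP, much higher OOP after a new condition.
eu_high = expected_utility(p, oop_usual=800.0, oop_new=4_000.0)
# With a concave u, the low-variance plan is preferred here (eu_low > eu_high).
```

Consistently choosing one plan as the probability of a health change varies, as most respondents did, is the behaviour expected utility theory predicts for a stable risk attitude.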

  11. Validity and internal consistency of a whiplash-specific disability measure

    NARCIS (Netherlands)

    Pinfold, Melanie; Niere, Ken R.; O'Leary, Elizabeth F.; Hoving, Jan Lucas; Green, Sally; Buchbinder, Rachelle

    2004-01-01

    STUDY DESIGN: Cross-sectional study of patients with whiplash-associated disorders investigating the internal consistency, factor structure, response rates, and presence of floor and ceiling effects of the Whiplash Disability Questionnaire (WDQ). OBJECTIVES: The aim of this study was to confirm the

  12. Consistency between Constructivist Profiles and Instructional Practices of Prospective Physics Teachers

    Science.gov (United States)

    Ates, Ozlem; Unal Coban, Gul; Kaya Sengoren, Serap

    2018-01-01

    This study aims to explain the extent to which prospective physics teachers' views and practices are consistent with the constructivist framework. A case study design was employed as the research approach. The study was conducted with 11 prospective physics teachers attending a state university in Turkey. Data was collected through semi-structured…

  13. The internal consistency and validity of the Self-assessment Parkinson's Disease Disability Scale.

    NARCIS (Netherlands)

    Biemans, M.A.J.E.; Dekker, J.; Woude, L.H.V. van der

    2001-01-01

    OBJECTIVE: To test the consistency and validity of the Self-assessment Parkinson's Disease Disability Scale in patients with Parkinson's disease living at home. DESIGN: Patients with Parkinson's disease responded to a set of questionnaires. In addition, an observation of the performance of daily

  14. Internal consistency and validity of the self-assessment Parkinson's Disease disability scale. Abstract.

    NARCIS (Netherlands)

    Dekker, J.; Biemans, M.A.J.E.; Woude, L.H.V. van der

    2000-01-01

    OBJECTIVE: To test the consistency and validity of the Self-assessment Parkinson's Disease Disability Scale in patients with Parkinson's disease living at home. DESIGN: Patients with Parkinson's disease responded to a set of questionnaires. In addition, an observation of the performance of daily

  15. A proposed grading system for standardizing tumor consistency of intracranial meningiomas.

    Science.gov (United States)

    Zada, Gabriel; Yashar, Parham; Robison, Aaron; Winer, Jesse; Khalessi, Alexander; Mack, William J; Giannotta, Steven L

    2013-12-01

    Tumor consistency plays an important and underrecognized role in the surgeon's ability to resect meningiomas, especially with evolving trends toward minimally invasive and keyhole surgical approaches. Aside from descriptors such as "hard" or "soft," no objective criteria exist for grading, studying, and conveying the consistency of meningiomas. The authors designed a practical 5-point scale for intraoperative grading of meningiomas based on the surgeon's ability to internally debulk the tumor and on the subsequent resistance to folding of the tumor capsule. Tumor consistency grades and features are as follows: 1) extremely soft tumor, internal debulking with suction only; 2) soft tumor, internal debulking mostly with suction, and remaining fibrous strands resected with easily folded capsule; 3) average consistency, tumor cannot be freely suctioned and requires mechanical debulking, and the capsule then folds with relative ease; 4) firm tumor, high degree of mechanical debulking required, and capsule remains difficult to fold; and 5) extremely firm, calcified tumor, approaches density of bone, and capsule does not fold. Additional grading categories included tumor heterogeneity (with minimum and maximum consistency scores) and a 3-point vascularity score. This grading system was prospectively assessed in 50 consecutive patients undergoing craniotomy for meningioma resection by 2 surgeons in an independent fashion. Grading scores were subjected to a linear weighted kappa analysis for interuser reliability. Fifty patients (100 scores) were included in the analysis. The mean maximal tumor diameter was 4.3 cm. The distribution of overall tumor consistency scores was as follows: Grade 1, 4%; Grade 2, 9%; Grade 3, 43%; Grade 4, 44%; and Grade 5, 0%. Regions of Grade 5 consistency were reported only focally in 14% of heterogeneous tumors. Tumors were designated as homogeneous in 68% and heterogeneous in 32% of grades. The kappa analysis score for overall tumor consistency
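The interuser reliability analysis mentioned above, linear weighted kappa between the two surgeons' ordinal grades, can be computed as follows (a generic implementation of Cohen's linearly weighted kappa, not the authors' code; the example ratings are invented):

```python
from collections import Counter

def linear_weighted_kappa(rater_a, rater_b, k):
    """Cohen's weighted kappa with linear disagreement weights
    w_ij = |i - j| / (k - 1) for ordinal categories 1..k."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    obs = Counter(zip(rater_a, rater_b))
    pa, pb = Counter(rater_a), Counter(rater_b)  # marginal counts
    cats = range(1, k + 1)
    d_obs = sum(abs(i - j) / (k - 1) * obs[(i, j)] / n
                for i in cats for j in cats)
    d_exp = sum(abs(i - j) / (k - 1) * (pa[i] / n) * (pb[j] / n)
                for i in cats for j in cats)
    return 1.0 - d_obs / d_exp

# Invented example: two raters grading 10 tumors on the 5-point scale.
a = [3, 4, 3, 2, 4, 3, 4, 5, 3, 4]
b = [3, 4, 3, 2, 4, 3, 3, 4, 3, 4]  # disagrees by one grade on two tumors
kappa = linear_weighted_kappa(a, b, k=5)
```

Linear weights penalise a disagreement in proportion to its distance on the scale, which suits an ordinal consistency grade better than unweighted kappa.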

  16. Integrated communications: From one look to normative consistency

    DEFF Research Database (Denmark)

    Torp, Simon

    2009-01-01

    ambitious interpretations of the concept the integration endeavour extends from the external integration of visual design to the internal integration of the organization's culture and "soul".   Design/methodology/approach The paper is based on a critical and thematic reading of the integrated marketing...

  17. Designing Material Materialising Design

    DEFF Research Database (Denmark)

    Nicholas, Paul

    2013-01-01

    Designing Material Materialising Design documents five projects developed at the Centre for Information Technology and Architecture (CITA) at the Royal Danish Academy of Fine Arts, School of Architecture. These projects explore the idea that new designed materials might require new design methods....... Focusing on fibre reinforced composites, this book sustains an exploration into the design and making of elastically tailored architectural structures that rely on the use of computational design to predict sensitive interdependencies between geometry and behaviour. Developing novel concepts...

  18. The Bilevel Design Problem for Communication Networks on Trains: Model, Algorithm, and Verification

    Directory of Open Access Journals (Sweden)

    Yin Tian

    2014-01-01

Full Text Available This paper proposes a novel method to solve the problem of train communication network design. Firstly, we put forward a general description of such a problem. Then, taking advantage of bilevel programming theory, we created the cost-reliability-delay (CRD) model, which consists of two parts: the physical topology part aims at obtaining networks with maximum reliability under constrained cost, while the logical topology part focuses on the communication paths yielding minimum delay based on the physical topology delivered from the upper level. We also suggested a method to solve the CRD model, which combines the genetic algorithm and the Floyd-Warshall algorithm. Finally, we used a practical example to verify the accuracy and the effectiveness of the CRD model and further applied the novel method to a train with six carriages.
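The lower (logical topology) level described above computes minimum-delay communication paths over a fixed physical topology, for which Floyd-Warshall is the natural all-pairs algorithm. A minimal sketch (the matrix layout is illustrative, not the paper's data format):

```python
import math

def floyd_warshall(delay):
    """All-pairs minimum communication delay over a fixed physical
    topology. delay[i][j] is the direct link delay between nodes i and j,
    or math.inf if they are not directly connected."""
    n = len(delay)
    d = [row[:] for row in delay]           # copy so the input is untouched
    for k in range(n):                      # allow node k as an intermediate
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

INF = math.inf
# Four carriages connected in a line with per-link delays 2, 3, 1:
links = [[0, 2, INF, INF],
         [2, 0, 3, INF],
         [INF, 3, 0, 1],
         [INF, INF, 1, 0]]
paths = floyd_warshall(links)
print(paths[0][3])  # 6, i.e. 2 + 3 + 1 along the chain
```

In the bilevel setting, the upper-level genetic algorithm would propose candidate physical topologies and call a routine like this to score their end-to-end delays.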

  19. ICR studies of some anionic gas phase reactions and FTICR software design

    International Nuclear Information System (INIS)

    Noest, A.J.

    1983-01-01

This thesis consists of two parts. Part one (Chs. 1-5) reports experimental results from mostly drift-cell ICR studies of negative ion-molecule reactions; part two (Chs. 6-11) concerns the design of software for an FTICR instrument. The author discusses successively: 1. ion cyclotron resonance spectrometry; 2. the gas phase allyl anion; 3. the (M-H) and (M-H2) anions from acetone; 4. negative ion-molecule reactions of aliphatic nitrites studied by cyclotron resonance; 5. homoconjugation versus charge-dipole interaction effects in the stabilization of carbanions in the gas phase; 6. the Fourier Transform ICR method; 7. the FTICR software; 8. an efficient adaptive matched filter for fast transient signals; 9. reduction of spectral peak height errors by time-domain weighting; 10. chirp excitation; 11. compact data storage. The book concludes with a Dutch and English summary. (G.J.P.)

  20. Martial arts striking hand peak acceleration, accuracy and consistency.

    Science.gov (United States)

    Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A

    2013-01-01

The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
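The two estimators defined in the abstract (accuracy as the centroid-to-target radial distance, consistency as the RMS distance of strikes from their own centroid) reduce to a few lines of array arithmetic. A sketch under those definitions, with made-up impact coordinates:

```python
import numpy as np

def accuracy_and_consistency(strikes, target):
    """Accuracy: radial distance between the centroid of the strikes and
    the target. Consistency: square root of the strikes' mean squared
    distance from their own centroid (smaller = more consistent)."""
    strikes = np.asarray(strikes, dtype=float)   # shape (n, 2) impact points
    centroid = strikes.mean(axis=0)
    accuracy = np.linalg.norm(centroid - np.asarray(target, dtype=float))
    consistency = np.sqrt(((strikes - centroid) ** 2).sum(axis=1).mean())
    return accuracy, consistency

# Four strikes scattered symmetrically around (1, 0), aimed at the origin:
hits = [(1, 1), (1, -1), (0, 0), (2, 0)]
acc, con = accuracy_and_consistency(hits, target=(0, 0))
print(acc, con)  # 1.0 1.0
```

Note that the two measures are independent: a tight cluster far from the target is consistent but inaccurate, which is exactly the distinction the study exploits.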

  1. Self-consistent electrodynamic scattering in the symmetric Bragg case

    International Nuclear Information System (INIS)

    Campos, H.S.

    1988-01-01

We have analyzed the symmetric Bragg case, introducing a model of self-consistent scattering for two elliptically polarized beams. The crystal is taken as a set of mathematical planes, each defined by a surface density of dipoles. We treat the mesofield and the epifield differently from Ewald's theory, and we take a plane of dipoles with its associated fields as a self-consistent scattering unit. The exact analytical treatment, when applied to any two neighbouring planes, results in a general, self-consistent Bragg equation in terms of the amplitude and phase variations. The generalized solution for the set of N planes was obtained after introducing an absorption factor in the incident radiation, in two ways: (i) analytically, through a rule of field similarity, which states that incidence occurs on both faces of all the crystal planes, and also through a matricial development with Chebyshev polynomials; (ii) numerically, by calculating iteratively the reflectivity, the reflection phase, the transmissivity, the transmission phase and the energy. The results are shown through reflection and transmission curves with features characteristic of both the kinematical and dynamical theories. Conservation of energy follows from Ewald's self-consistency principle. In the absorption case, the results show that absorption is not the only cause of the asymmetric form of the reflection curves. The model contains the basic elements of a unified, microscopic, self-consistent, vectorial and exact formulation for interpreting X-ray diffraction in perfect crystals. (author)

  2. Cognitive consistency and math-gender stereotypes in Singaporean children.

    Science.gov (United States)

    Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu

    2014-01-01

    In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. A consistent response spectrum analysis including the resonance range

    International Nuclear Information System (INIS)

    Schmitz, D.; Simmchen, A.

    1983-01-01

The report provides a complete consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA), which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because today this method is preferentially applied as an important tool for the earthquake proof of components in nuclear power plants. (orig./HP)

  4. GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY

    International Nuclear Information System (INIS)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  5. Engineering Design vs. Artistic Design: Some Educational Consequences

    Science.gov (United States)

    Eder, Wolfgang Ernst

    2013-01-01

    "Design" can be a noun, or a verb. Six paths for research into engineering design (as verb) are identified, they must be coordinated for internal consistency and plausibility. Design research tries to clarify design processes and their underlying theories--for designing in general, and for particular forms, e.g., design engineering. Theories are a…

  6. Design analysis of the ITER divertor

    International Nuclear Information System (INIS)

    Samuelli, G.; Marin, A.; Roccella, M.; Lucca, F.; Merola, M.; Riccardi, B.; Petrizzi, L.; Villari, R.

    2007-01-01

The divertor is one of the most challenging components of the ITER machine. Its function is to reduce the impurity in the plasma, and it consists essentially of two parts: the plasma facing components (PFCs) and a massive support structure called the cassette body (CB). Considerable R and D effort (developed by EFDA CSU GARCHING and the ITER International Team together with the EU Associations and the EU Industries) has been spent in designing divertor components capable of withstanding the expected electromagnetic (EM) loads and taking into account the latest ITER design conditions. In support of such efforts, extensive and very detailed neutronic, thermal, EM and structural analyses have been performed. A summary of the analyses performed will be presented. One of the main results is a typical exercise of integration between the different kinds of analyses, showing the importance of keeping consistency between the different assumptions and simplifications. The models used for the numerical analyses include a detailed geometrical description of the CB, the inlet and outlet hydraulic manifolds, the CB to vacuum vessel locking system and three configurations of the PFU. The effect of electrical bridging, in both the poloidal and toroidal directions, of the PFU castellation, due to possible melting at the W mono-block or tiles occurring during plasma disruptions, has been analyzed. For all these configurations, 2 VDE scenarios including the effect of the toroidal field variation and the halo current with the related out-of-plane induced EM forces have been extensively analyzed, and a detailed poloidal and radial distribution of the nuclear heating has been used for the neutronic flux on the divertor components. The aim of this activity is to produce a comprehensive design and assessment of the ITER divertor via: -The estimation of the neutronic heat deposition and shielding capability; -The calculation of the related thermal and mechanical effects and the comparison of the

  7. Design analysis of the ITER divertor

    Energy Technology Data Exchange (ETDEWEB)

    Samuelli, G.; Marin, A.; Roccella, M.; Lucca, F. [L.T. Calcoli SaS, Merate (Lecco) (Italy); Merola, M. [ITER Team, Cadarache (France); Riccardi, B. [EFDA CSU Garching (Germany); Petrizzi, L.; Villari, R. [CRE ENEA sulla Fusione Frascati, Roma (Italy)

    2007-07-01

The divertor is one of the most challenging components of the ITER machine. Its function is to reduce the impurity in the plasma, and it consists essentially of two parts: the plasma facing components (PFCs) and a massive support structure called the cassette body (CB). Considerable R and D effort (developed by EFDA CSU GARCHING and the ITER International Team together with the EU Associations and the EU Industries) has been spent in designing divertor components capable of withstanding the expected electromagnetic (EM) loads and taking into account the latest ITER design conditions. In support of such efforts, extensive and very detailed neutronic, thermal, EM and structural analyses have been performed. A summary of the analyses performed will be presented. One of the main results is a typical exercise of integration between the different kinds of analyses, showing the importance of keeping consistency between the different assumptions and simplifications. The models used for the numerical analyses include a detailed geometrical description of the CB, the inlet and outlet hydraulic manifolds, the CB to vacuum vessel locking system and three configurations of the PFU. The effect of electrical bridging, in both the poloidal and toroidal directions, of the PFU castellation, due to possible melting at the W mono-block or tiles occurring during plasma disruptions, has been analyzed. For all these configurations, 2 VDE scenarios including the effect of the toroidal field variation and the halo current with the related out-of-plane induced EM forces have been extensively analyzed, and a detailed poloidal and radial distribution of the nuclear heating has been used for the neutronic flux on the divertor components. The aim of this activity is to produce a comprehensive design and assessment of the ITER divertor via: -The estimation of the neutronic heat deposition and shielding capability; -The calculation of the related thermal and mechanical effects and the comparison of the

  8. Context-dependent individual behavioral consistency in Daphnia

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe

    2017-01-01

    The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals......, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots, but instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time...

  9. Consistent forcing scheme in the cascaded lattice Boltzmann method

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.

  10. Application of consistent fluid added mass matrix to core seismic

    International Nuclear Information System (INIS)

    Koo, K. H.; Lee, J. H.

    2003-01-01

In this paper, an algorithm for applying a consistent fluid added mass matrix, including the coupling terms, to core seismic analysis is developed and implemented in the SAC-CORE3.0 code. As an example, we assumed a 7-hexagon system of the LMR core and carried out vibration modal analysis and nonlinear time-history seismic response analysis using SAC-CORE3.0. The consistent fluid added mass matrix is obtained using the finite element program of the FAMD (Fluid Added Mass and Damping) code. The results of the vibration modal analysis show that the core duct assemblies exhibit strongly coupled vibration modes, quite different from those of the in-air condition. From the results of the time-history seismic analysis, it was verified that the effects of the coupling terms of the consistent fluid added mass matrix are significant in the impact responses and the dynamic responses.

  11. Self-consistent approximations beyond the CPA: Part II

    International Nuclear Information System (INIS)

    Kaplan, T.; Gray, L.J.

    1982-01-01

This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented space formalism for a binary alloy is sketched, and the notation to be used derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function, and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S_T exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Expansion to short range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described

  12. Linear augmented plane wave method for self-consistent calculations

    International Nuclear Information System (INIS)

    Takeda, T.; Kuebler, J.

    1979-01-01

    O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)

  13. Self-consistency and coherent effects in nonlinear resonances

    International Nuclear Information System (INIS)

    Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.

    2003-01-01

    The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping

  14. A consistent time frame for Chaucer's Canterbury Pilgrimage

    Science.gov (United States)

    Kummerer, K. R.

    2001-08-01

A consistent time frame for the pilgrimage that Geoffrey Chaucer describes in The Canterbury Tales can be established if the seven celestial assertions related to the journey mentioned in the text can be reconciled with each other and the date of April 18 that is also mentioned. Past attempts to establish such a consistency for all seven celestial assertions have not been successful. The analysis herein, however, indicates that in The Canterbury Tales Chaucer accurately describes the celestial conditions he observed in the April sky above the London/Canterbury region of England in the latter half of the fourteenth century. All seven celestial assertions are in agreement with each other and consistent with the April 18 date. The actual words of Chaucer indicate that the Canterbury journey began during the 'seson' he defines in the General Prologue and ends under the light of the full Moon on the night of April 18, 1391.

  15. An approach to a self-consistent nuclear energy system

    International Nuclear Information System (INIS)

    Fujii-e, Yoichi; Arie, Kazuo; Endo, Hiroshi

    1992-01-01

    A nuclear energy system should provide a stable supply of energy without endangering the environment or humans. If there is fear about exhausting world energy resources, accumulating radionuclides, and nuclear reactor safety, tension is created in human society. Nuclear energy systems of the future should be able to eliminate fear from people's minds. In other words, the whole system, including the nuclear fuel cycle, should be self-consistent. This is the ultimate goal of nuclear energy. If it can be realized, public acceptance of nuclear energy will increase significantly. In a self-consistent nuclear energy system, misunderstandings between experts on nuclear energy and the public should be minimized. The way to achieve this goal is to explain using simple logic. This paper proposes specific targets for self-consistent nuclear energy systems and shows that the fast breeder reactor (FBR) lies on the route to attaining the final goal

  17. Consistency and Reconciliation Model In Regional Development Planning

    Directory of Open Access Journals (Sweden)

    Dina Suryawati

    2016-10-01

Full Text Available The aim of this study was to identify the problems and determine the conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models that were successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and with inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves both a top-down technocratic system and a bottom-up participatory system. The two must be balanced, neither overlapping nor dominating the other. Keywords: regional, development, planning, consistency, reconciliation

  18. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.

  19. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated...... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges......, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes....

  20. On Consistency Test Method of Expert Opinion in Ecological Security Assessment.

    Science.gov (United States)

    Gong, Zaiwu; Wang, Lihong

    2017-09-04

Ecological safety assessment is of great value for proactive design and initiative in human security management and safety warning. In the comprehensive evaluation of regional ecological security with the participation of experts, each expert's individual judgment level and ability, and the consistency of the experts' overall opinion, have a very important influence on the evaluation result. This paper studies consistency and consensus measures based on the multiplicative and additive consistency properties of fuzzy preference relations (FPRs). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure, computed as the distance between the original individual judgment and the optimal individual estimation, along with a consensus measure, computed as the distance between the original collective judgment and the optimal collective estimation. In the end, we make a case study on ecological security for five cities. The results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgments and the consensus degree of collective judgments.
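The distance-based consistency measure described above can be illustrated for the additive case: rebuild the closest additively consistent FPR from estimated priority weights, then score the distance to the original judgment. This is a minimal sketch, not the paper's optimization model; the row-mean weight estimate and the scaling of the index are illustrative assumptions.

```python
import numpy as np

def additive_consistent_fpr(w):
    # The additively consistent relation implied by priority weights w:
    # p_ij = 0.5 * (w_i - w_j) + 0.5
    return 0.5 * np.subtract.outer(w, w) + 0.5

def consistency_index(P):
    """Estimate zero-sum priority weights from row means (a standard
    closed form for reciprocal additive FPRs), rebuild the implied
    consistent FPR, and return 1 minus a scaled mean absolute deviation
    (illustrative scaling: 1.0 means perfectly consistent)."""
    n = P.shape[0]
    w = 2.0 * (P.sum(axis=1) / n - 0.5)
    d = np.abs(P - additive_consistent_fpr(w)).mean()
    return 1.0 - 2.0 * d

# An additively consistent FPR: p_13 = p_12 + p_23 - 0.5
P = np.array([[0.5, 0.7, 0.9],
              [0.3, 0.5, 0.7],
              [0.1, 0.3, 0.5]])
print(consistency_index(P))  # 1.0
```

A consensus measure follows the same pattern, with the distance taken between an individual FPR and the optimal collective estimation instead.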

  1. An Explicit Consistent Geometric Stiffness Matrix for the DKT Element

    Directory of Open Access Journals (Sweden)

    Eliseu Lucena Neto

Full Text Available A large number of references dealing with the geometric stiffness matrix of the DKT finite element exist in the literature, where nearly all of them adopt an inconsistent form. While such a matrix may be part of the element to treat nonlinear problems in general, it is of crucial importance for linearized buckling analysis. The present work seems to be the first to obtain an explicit expression for this matrix in a consistent way. Numerical results on linear buckling of plates assess the element performance either with the proposed explicit consistent matrix, or with the most commonly used inconsistent matrix.

  2. The cluster bootstrap consistency in generalized estimating equations

    KAUST Repository

    Cheng, Guang

    2013-03-01

    The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
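The resampling scheme described above, drawing whole clusters with replacement rather than individual observations, can be sketched in a few lines. This is a generic illustration of the cluster bootstrap, not the paper's GEE machinery; the statistic here is a simple mean and the data are made up.

```python
import numpy as np

def cluster_bootstrap(clusters, stat, n_boot=2000, seed=0):
    """Resample whole clusters with replacement and recompute the
    statistic on each resample, preserving within-cluster dependence.
    clusters: list of 1-D arrays, one per cluster/subject."""
    rng = np.random.default_rng(seed)
    m = len(clusters)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, m, size=m)              # draw cluster labels
        sample = np.concatenate([clusters[i] for i in idx])
        reps[b] = stat(sample)
    return reps

# Three subjects with two correlated measurements each:
data = [np.array([1.0, 1.2]), np.array([3.0, 2.8]), np.array([2.0, 2.1])]
reps = cluster_bootstrap(data, np.mean, n_boot=500)
lo, hi = np.percentile(reps, [2.5, 97.5])  # percentile confidence interval
print(round(lo, 2), round(hi, 2))
```

Because entire clusters are kept or dropped together, the within-subject correlation structure survives the resampling, which is the property the consistency proof relies on.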

  3. Consistency in the description of diffusion in compacted bentonite

    International Nuclear Information System (INIS)

    Lehikoinen, J.; Muurinen, A.

    2009-01-01

A macro-level diffusion model, which aims to provide a unifying framework for explaining the experimentally observed co-ion exclusion and the highly controversial counter-ion surface diffusion in a consistent fashion, is presented. It is explained in detail why a term accounting for the non-zero mobility of the counter-ion surface excess is required in the mathematical form of the macroscopic diffusion flux. The prerequisites for the consistency of the model and the problems associated with the interpretation of diffusion in such complex pore geometries as in compacted smectite clays are discussed. (author)

  4. An energetically consistent vertical mixing parameterization in CCSM4

    DEFF Research Database (Denmark)

    Nielsen, Søren Borg; Jochum, Markus; Eden, Carsten

    2018-01-01

    An energetically consistent stratification-dependent vertical mixing parameterization is implemented in the Community Climate System Model 4 and forced with energy conversion from the barotropic tides to internal waves. The structures of the resulting dissipation and diffusivity fields are compared......, however, depends greatly on the details of the vertical mixing parameterizations, where the new energetically consistent parameterization results in low thermocline diffusivities and a sharper and shallower thermocline. It is also investigated if the ocean state is more sensitive to a change in forcing...

  5. The consistency service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2011-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  6. The Consistency Service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2010-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failure, is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  7. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)]

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of ²³²Th, ²³³U, ²³⁵U, ²³⁸U and ²³⁹Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component and fairly well for the γ component, except for cooling times longer than 4000 s. (author)

  8. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....

  9. Weyl consistency conditions in non-relativistic quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)

    2016-12-05

    Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z are discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.

  10. A Van Atta reflector consisting of half-wave dipoles

    DEFF Research Database (Denmark)

    Appel-Hansen, Jørgen

    1966-01-01

    The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations...... of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...

  11. A self-consistent theory of the magnetic polaron

    International Nuclear Information System (INIS)

    Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.

    1984-10-01

    A finite temperature self-consistent theory of magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on the novel approach of the thermodynamic two-time Green function methods. This approach consists in the introduction of the "irreducible" Green functions (IGF) and derivation of the exact Dyson equation and exact self-energy operator. It is shown that the IGF method gives a unified and natural approach for a calculation of the magnetic polaron states by taking explicitly into account the damping effects and finite lifetime. (author)

  12. Diagnostic language consistency among multicultural English-speaking nurses.

    Science.gov (United States)

    Wieck, K L

    1996-01-01

    Cultural differences among nurses may influence the choice of terminology applicable to use of a nursing diagnostic statement. This study explored whether defining characteristics are consistently applied by culturally varied nurses in an English language setting. Two diagnoses, pain, and high risk for altered skin integrity, were studied within six cultures: African, Asian, Filipino, East Indian, African-American, and Anglo-American nurses. Overall, there was consistency between the cultural groups. Analysis of variance for the pain scale demonstrated differences among cultures on two characteristics of pain, restlessness and grimace. The only difference on the high risk for altered skin integrity scale was found on the descriptor, supple skin.

  13. The least weighted squares II. Consistency and asymptotic normality

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2002-01-01

    Roč. 9, č. 16 (2002), s. 1-28 ISSN 1212-074X R&D Projects: GA AV ČR KSK1019101 Grant - others:GA UK(CR) 255/2000/A EK /FSV Institutional research plan: CEZ:AV0Z1075907 Keywords : robust regression * consistency * asymptotic normality Subject RIV: BA - General Mathematics

  14. Consistency relation for the Lorentz invariant single-field inflation

    International Nuclear Information System (INIS)

    Huang, Qing-Guo

    2010-01-01

    In this paper we compute the sizes of equilateral and orthogonal shape bispectrum for the general Lorentz invariant single-field inflation. The stability of field theory implies a non-negative square of sound speed which leads to a consistency relation between the sizes of orthogonal and equilateral shape bispectrum, namely f_NL^{orth} ≤ −0.054 f_NL^{equil}. In particular, for the single-field Dirac-Born-Infeld (DBI) inflation, the consistency relation becomes f_NL^{orth} = 0.070 f_NL^{equil} ≤ 0. These consistency relations are also valid in the mixed scenario where the quantum fluctuations of some other light scalar fields contribute to a part of total curvature perturbation on the super-horizon scale and may generate a local form bispectrum. A distinguishing prediction of the mixed scenario is τ_NL^{loc} > ((6/5) f_NL^{loc})². Comparing these consistency relations to WMAP 7yr data, there is still a big room for the Lorentz invariant inflation, but DBI inflation has been disfavored at more than 68% CL

  15. Short-Cut Estimators of Criterion-Referenced Test Consistency.

    Science.gov (United States)

    Brown, James Dean

    1990-01-01

    Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the phi(lambda) dependability approach, and (3) the domain score…
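    The threshold-loss agreement approach in (1) reduces to a short computation once each examinee is classified as master or non-master on two administrations: raw agreement plus a chance-corrected kappa. The classification vectors below are invented for illustration.

```python
def agreement_and_kappa(class1, class2):
    """Threshold-loss agreement between two master(1)/non-master(0)
    classifications: raw agreement p_o and chance-corrected kappa."""
    n = len(class1)
    p_o = sum(a == b for a, b in zip(class1, class2)) / n
    p1 = sum(class1) / n                  # proportion of masters, first form
    p2 = sum(class2) / n                  # proportion of masters, second form
    p_e = p1 * p2 + (1 - p1) * (1 - p2)   # agreement expected by chance
    kappa = (p_o - p_e) / (1 - p_e)
    return p_o, kappa

# two administrations of the same criterion-referenced test (invented data)
p_o, kappa = agreement_and_kappa([1, 1, 1, 0, 0, 0, 1, 0],
                                 [1, 1, 0, 0, 0, 1, 1, 0])
```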

  16. Social Comparison, Self-Consistency and the Presentation of Self.

    Science.gov (United States)

    MORSE, STANLEY J.; GERGEN, KENNETH J.

    To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) self-consistency scale and half the Coopersmith Self-Esteem Inventory, were administered to 78 undergraduate men who had answered an ad for work…

  17. Consistency of the Takens estimator for the correlation dimension

    NARCIS (Netherlands)

    Borovkova, S.; Burton, Robert; Dehling, H.

    Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We

  18. Assessing atmospheric bias correction for dynamical consistency using potential vorticity

    International Nuclear Information System (INIS)

    Rocheta, Eytan; Sharma, Ashish; Evans, Jason P

    2014-01-01

    Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases as dynamical consistency between the ‘corrected’ fields is not maintained. Use of these bias corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency—the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques—an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exist between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications. (letter)
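    As a concrete instance of the first, variable-by-variable strategy, a minimal empirical quantile-mapping correction is sketched below. Quantile mapping is a common univariate bias-correction method, shown here only to illustrate the kind of independent per-variable correction the letter argues can break dynamical consistency; it is not the authors' specific procedure, and the sample values are invented.

```python
import numpy as np

def quantile_map(model_sample, obs_sample, value):
    """Empirical quantile mapping: map a model value to the observed value
    at the same empirical quantile. Applied independently per variable,
    it ignores intervariable (dynamical) relationships."""
    rank = np.searchsorted(np.sort(model_sample), value, side="right") / len(model_sample)
    return np.quantile(obs_sample, np.clip(rank, 0.0, 1.0))

# toy example: model output spans 0..9 while observations span 10..19
corrected = quantile_map(np.arange(10.0), np.arange(10.0, 20.0), 5.0)
```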

  19. An algebraic method for constructing stable and consistent autoregressive filters

    International Nuclear Information System (INIS)

    Harlim, John; Hong, Hoon; Robbins, Jacob L.

    2015-01-01

    In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR-models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern
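    The classical stability condition referred to above is that every root of the AR characteristic polynomial lies strictly inside the unit circle. The check below is only that textbook criterion, not the authors' algebraic construction; the coefficient sets in the usage note are invented examples.

```python
import numpy as np

def is_stable_ar(coeffs):
    """An AR(p) model x_t = c1*x_{t-1} + ... + cp*x_{t-p} + noise is stable
    iff every root of z^p - c1*z^{p-1} - ... - cp lies inside the unit circle."""
    char_poly = np.concatenate(([1.0], -np.asarray(coeffs, dtype=float)))
    return bool(np.all(np.abs(np.roots(char_poly)) < 1.0))
```

    For example, `is_stable_ar([1.5, -0.56])` holds (roots 0.8 and 0.7), while `is_stable_ar([1.1])` fails because the single root 1.1 lies outside the unit circle.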

  20. Delimiting coefficient alpha from internal consistency and unidimensionality

    NARCIS (Netherlands)

    Sijtsma, K.

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and

  1. Challenges of Predictability and Consistency in the First ...

    African Journals Online (AJOL)

    This article aims to investigate some features of Endemann's (1911) Wörterbuch der Sotho-Sprache (Dictionary of the Sotho language) with the focus on challenges of predictability and consistency in the lemmatization approach, the access alphabet, cross references and article treatments. The dictionary has hitherto ...

  2. The Impact of Orthographic Consistency on German Spoken Word Identification

    Science.gov (United States)

    Beyermann, Sandra; Penke, Martina

    2014-01-01

    An auditory lexical decision experiment was conducted to find out whether sound-to-spelling consistency has an impact on German spoken word processing, and whether such an impact is different at different stages of reading development. Four groups of readers (school children in the second, third and fifth grades, and university students)…

  3. Final Report Fermionic Symmetries and Self consistent Shell Model

    International Nuclear Information System (INIS)

    Zamick, Larry

    2008-01-01

    In this final report in the field of theoretical nuclear physics we note important accomplishments. We were confronted with 'anomalous' magnetic moments by the experimentalists and were able to explain them. We found unexpected partial dynamical symmetries, completely unknown before, and were able to a large extent to explain them. The importance of a self-consistent shell model was emphasized.

  4. Using the Perceptron Algorithm to Find Consistent Hypotheses

    OpenAIRE

    Anthony, M.; Shawe-Taylor, J.

    1993-01-01

    The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.

  5. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
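    The nearest-neighbor probability machine is simple enough to illustrate without any learning library: the estimate of P(Y=1|x) is the fraction of positive labels among the k nearest training points. The data below are invented; the forest-based variant plays the same role via, e.g., scikit-learn's `RandomForestClassifier.predict_proba`.

```python
import numpy as np

def knn_probability(X_train, y_train, x, k=5):
    """Nonparametric probability estimate: the fraction of positive
    labels among the k nearest neighbors of x."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(y_train[nearest].mean())

# two well-separated groups: label 0 near x=0, label 1 near x=5
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
y = np.array([0, 0, 0, 1, 1, 1])
```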

  6. Consistent seasonal snow cover depth and duration variability over ...

    Indian Academy of Sciences (India)

    Decline in consistent seasonal snow cover depth, duration and changing snow cover build- up pattern over the WH in recent decades indicate that WH has undergone considerable climate change and winter weather patterns are changing in the WH. 1. Introduction. Mountainous regions around the globe are storehouses.

  7. Is There a Future for Education Consistent with Agenda 21?

    Science.gov (United States)

    Smyth, John

    1999-01-01

    Discusses recent experiences in developing and implementing strategies for education consistent with the concept of sustainable development at two different levels: (1) the international level characterized by Agenda 21 along with the efforts of the United Nations Commission on Sustainable Development to foster its progress; and (2) the national…

  8. Diagnosing a Strong-Fault Model by Conflict and Consistency

    Directory of Open Access Journals (Sweden)

    Wenfeng Zhang

    2018-03-01

    Full Text Available The diagnosis method for a weak-fault model with only normal behaviors of each component has evolved over decades. However, many systems now demand strong-fault models, the fault modes of which have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model’s prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded by Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidate efficiency based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strength and weakness. Finally, the proposed approaches are demonstrated by applying them to a real-world domain—the heat control unit of a spacecraft—where the proposed methods are significantly better than best-first and conflict-directed A* search methods.
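    In classical consistency-based diagnosis, the minimal candidate diagnoses are exactly the minimal hitting sets of the minimal conflict sets. The brute-force sketch below illustrates that relationship for small systems; it is not the paper's LTMS or its search strategies, and the component names are invented.

```python
from itertools import combinations

def minimal_diagnoses(components, conflicts):
    """Enumerate minimal hitting sets of the conflict sets, smallest first.
    Each hitting set is a minimal candidate diagnosis."""
    found = []
    for size in range(len(components) + 1):
        for cand in combinations(components, size):
            s = set(cand)
            if any(set(d) <= s for d in found):
                continue  # contains a smaller diagnosis: not minimal
            if all(s & c for c in conflicts):
                found.append(cand)  # hits every conflict
    return found

# two minimal conflicts over three components
diagnoses = minimal_diagnoses(["A", "B", "C"], [{"A", "B"}, {"B", "C"}])
```

    Here `{"A", "B"}` means "A and B cannot all be healthy"; the minimal diagnoses are "B alone is faulty" or "A and C are both faulty".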

  9. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    The research survey of consistent dynamical and statistical description of fission is briefly introduced. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).

  10. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…

  11. Consistency of the Self-Schema in Depression.

    Science.gov (United States)

    Ross, Michael J.; Mueller, John H.

    Depressed individuals may filter or distort environmental information in direct relationship to their self perceptions. To investigate the degree of uncertainty about oneself and others, as measured by consistent/inconsistent responses, 72 college students (32 depressed and 40 nondepressed) rated selected adjectives from the Derry and Kuiper…

  12. Composition consisting of a dendrimer and an active substance

    NARCIS (Netherlands)

    1995-01-01

    The invention relates to a composition consisting of a dendrimer provided with blocking agents and an active substance occluded in the dendrimer. According to the invention a blocking agent is a compound which is sterically of sufficient size, which readily enters into a chemical bond with the

  13. Analytical relativistic self-consistent-field calculations for atoms

    International Nuclear Information System (INIS)

    Barthelat, J.C.; Pelissier, M.; Durand, P.

    1980-01-01

    A new second-order representation of the Dirac equation is presented. This representation which is exact for a hydrogen atom is applied to approximate analytical self-consistent-field calculations for atoms. Results are given for the rare-gas atoms from helium to radon and for lead. The results compare favorably with numerical Dirac-Hartree-Fock solutions

  14. A consistent analysis for the quark condensate in QCD

    International Nuclear Information System (INIS)

    Huang Zheng; Huang Tao

    1988-08-01

    The dynamical symmetry breaking in QCD is analysed based on the vacuum condensates. A self-consistent equation for the quark condensate ⟨φ̄φ⟩ is derived. A nontrivial solution for ⟨φ̄φ⟩ ≠ 0 is given in terms of the QCD scale parameter Λ

  15. The consistency assessment of topological relations in cartographic generalization

    Science.gov (United States)

    Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu

    2006-10-01

    The field of research in generalization assessment has been less studied than the generalization process itself, and it is very important to keep topological relations consistent to meet generalization quality requirements. This paper proposes a methodology to assess the quality of a generalized map in terms of topological relation consistency. Taking roads (including railways) and residential areas as examples, some issues about topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. Statistics on inconsistent topological relations can be obtained by comparing two matrices: one holds the topological relations in the generalized map; the other is the theoretical matrix of topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper demonstrates the feasibility of the method through an example evaluating the local topological relations between simple roads and a residential area.
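    The matrix comparison described above reduces to an element-wise check of the generalized map's relations against the theoretical ones. The relation labels below are invented placeholders for whatever relation encoding the evaluation uses.

```python
def topological_inconsistencies(theoretical, generalized):
    """Compare the theoretical relation matrix with the one measured on the
    generalized map; return (i, j, expected, observed) for each mismatch."""
    mismatches = []
    for i, row in enumerate(theoretical):
        for j, expected in enumerate(row):
            observed = generalized[i][j]
            if observed != expected:
                mismatches.append((i, j, expected, observed))
    return mismatches

theoretical = [["equal", "disjoint"], ["disjoint", "equal"]]
generalized = [["equal", "meets"],    ["disjoint", "equal"]]
bad = topological_inconsistencies(theoretical, generalized)
```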

  16. Numerical consistency check between two approaches to radiative ...

    Indian Academy of Sciences (India)

    approaches for a consistency check on numerical accuracy, and find out the stabil- ... ln(MR/1 GeV) to top-quark mass scale t0(= ln(mt/1 GeV)) where t0 ≤ t ≤ tR, we ..... It is in general to tone down the solar mixing angle through further fine.

  17. Consistency or Discrepancy? Rethinking Schools from Organizational Hypocrisy to Integrity

    Science.gov (United States)

    Kiliçoglu, Gökhan

    2017-01-01

    Consistency in statements, decisions and practices is highly important for both organization members and the image of an organization. It is expected from organizations, especially from their administrators, to "walk the talk"--in other words, to try to practise what they preach. However, in the process of gaining legitimacy and adapting…

  18. Diagnosing a Strong-Fault Model by Conflict and Consistency.

    Science.gov (United States)

    Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan

    2018-03-29

    The diagnosis method for a weak-fault model with only normal behaviors of each component has evolved over decades. However, many systems now demand strong-fault models, the fault modes of which have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded by Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidate efficiency based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strength and weakness. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods are significantly better than best-first and conflict-directed A* search methods.

  19. Consistency Check for the Bin Packing Constraint Revisited

    Science.gov (United States)

    Dupuis, Julien; Schaus, Pierre; Deville, Yves

    The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.
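    A standard heuristic for the optimization problem defined above (not the constraint-propagation consistency check the paper revisits) is first-fit decreasing, sketched here on invented item sizes:

```python
def first_fit_decreasing(items, capacity):
    """Sort items in decreasing size and put each one into the first bin
    that still has room; open a new bin when none fits. The number of
    bins used is an upper bound on the optimum."""
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])  # no existing bin fits: open a new one
    return bins

packing = first_fit_decreasing([7, 5, 4, 3, 2, 2, 1], capacity=10)
```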

  20. Matrix analysis for associated consistency in cooperative game theory

    NARCIS (Netherlands)

    Xu, G.; Driessen, Theo; Sun, H.; Sun, H.

    Hamiache's recent axiomatization of the well-known Shapley value for TU games states that the Shapley value is the unique solution verifying the following three axioms: the inessential game property, continuity and associated consistency. Driessen extended Hamiache's axiomatization to the enlarged

  1. Matrix analysis for associated consistency in cooperative game theory

    NARCIS (Netherlands)

    Xu Genjiu, G.; Driessen, Theo; Sun, H.; Sun, H.

    Hamiache axiomatized the Shapley value as the unique solution verifying the inessential game property, continuity and associated consistency. Driessen extended Hamiache’s axiomatization to the enlarged class of efficient, symmetric, and linear values. In this paper, we introduce the notion of row

  2. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  3. Structural covariance networks across healthy young adults and their consistency.

    Science.gov (United States)

    Guo, Xiaojuan; Wang, Yan; Guo, Taomei; Chen, Kewei; Zhang, Jiacai; Li, Ke; Jin, Zhen; Yao, Li

    2015-08-01

    To investigate structural covariance networks (SCNs) as measured by regional gray matter volumes with structural magnetic resonance imaging (MRI) from healthy young adults, and to examine their consistency and stability. Two independent cohorts were included in this study: Group 1 (82 healthy subjects aged 18-28 years) and Group 2 (109 healthy subjects aged 20-28 years). Structural MRI data were acquired at 3.0T and 1.5T using a magnetization prepared rapid-acquisition gradient echo sequence for these two groups, respectively. We applied independent component analysis (ICA) to construct SCNs and further applied the spatial overlap ratio and correlation coefficient to evaluate the spatial consistency of the SCNs between these two datasets. Seven and six independent components were identified for Group 1 and Group 2, respectively. Moreover, six SCNs including the posterior default mode network, the visual and auditory networks consistently existed across the two datasets. The overlap ratios and correlation coefficients of the visual network reached the maximums of 72% and 0.71. This study demonstrates the existence of consistent SCNs corresponding to general functional networks. These structural covariance findings may provide insight into the underlying organizational principles of brain anatomy. © 2014 Wiley Periodicals, Inc.
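    The spatial overlap ratio used above to match networks across the two cohorts can be computed on binarized component maps. The abstract does not give its exact formula, so the Dice coefficient below is an assumption (one common choice), and the arrays are invented miniature "maps".

```python
import numpy as np

def overlap_ratio(map_a, map_b):
    """Dice overlap between two binarized network maps:
    2*|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a = np.asarray(map_a, dtype=bool)
    b = np.asarray(map_b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
```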

  4. Consistency in behavior of the CEO regarding corporate social responsibility

    NARCIS (Netherlands)

    Elving, W.J.L.; Kartal, D.

    2012-01-01

    Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was

  5. Self-consistent description of the isospin mixing

    International Nuclear Information System (INIS)

    Gabrakov, S.I.; Pyatov, N.I.; Baznat, M.I.; Salamov, D.I.

    1978-03-01

    The properties of collective 0 + states built of unlike particle-hole excitations in spherical nuclei have been investigated in a self-consistent microscopic approach. These states arise when the broken isospin symmetry of the nuclear shell model Hamiltonian is restored. The numerical calculations were performed with Woods-Saxon wave functions

  6. Potential application of the consistency approach for vaccine potency testing.

    Science.gov (United States)

    Arciniega, J; Sirota, L A

    2012-01-01

    The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, or the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination challenge test in mice that gives a quantitative, although highly variable estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows if the vaccine meets the specification has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
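For the normally distributed case the abstract refers to, the classical capability indices can be computed as below (a minimal sketch; the abstract's point is precisely that these formulas do not carry over to a pass/fail single-dilution test):

```python
import statistics

def capability_indices(values, lsl, usl):
    # Classical Cp/Cpk for a process with lower/upper specification
    # limits; both assume the output (here: potency values) follows
    # a normal distribution.
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk
```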

  7. Video Design Games

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte; Christensen, Kasper Skov; Iversen, Ole Sejer

We introduce Video Design Games to train educators in teaching design. The Video Design Game is a workshop format consisting of three rounds in which participants observe, reflect and generalize based on video snippets from their own practice. The paper reports on a Video Design Game workshop in which 25 educators, as part of a digital fabrication and design program, were able to critically reflect on their teaching practice.

  8. Performance and consistency of indicator groups in two biodiversity hotspots.

    Directory of Open Access Journals (Sweden)

    Joaquim Trindade-Filho

In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans, however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.

  9. Performance and consistency of indicator groups in two biodiversity hotspots.

    Science.gov (United States)

    Trindade-Filho, Joaquim; Loyola, Rafael Dias

    2011-01-01

    In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans, however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
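Finding "the best sets of sites able to maximize the representation of each indicator group" is typically approached with a complementarity heuristic; a minimal greedy sketch (illustrative only, not the study's actual site-selection software):

```python
def greedy_site_selection(sites, target_species):
    # sites: dict mapping site_id -> set of species recorded there.
    # Iteratively pick the site that adds the most still-unrepresented
    # target species (a simple complementarity heuristic; it is not
    # guaranteed to find the smallest possible set).
    remaining = set(target_species)
    chosen = []
    while remaining:
        best = max(sites, key=lambda s: len(sites[s] & remaining))
        gain = sites[best] & remaining
        if not gain:
            break  # some targets occur in no site at all
        chosen.append(best)
        remaining -= gain
    return chosen, remaining
```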

  10. Conformal consistency relations for single-field inflation

    International Nuclear Information System (INIS)

    Creminelli, Paolo; Noreña, Jorge; Simonović, Marko

    2012-01-01

We generalize the single-field consistency relations to capture not only the leading term in the squeezed limit — going as 1/q^3, where q is the small wavevector — but also the subleading one, going as 1/q^2. This term, for an (n+1)-point function, is fixed in terms of the variation of the n-point function under a special conformal transformation; this parallels the fact that the 1/q^3 term is related with the scale dependence of the n-point function. For the squeezed limit of the 3-point function, this conformal consistency relation implies that there are no terms going as 1/q^2. We verify that the squeezed limit of the 4-point function is related to the conformal variation of the 3-point function both in the case of canonical slow-roll inflation and in models with reduced speed of sound. In the second case the conformal consistency conditions capture, at the level of observables, the relation among operators induced by the non-linear realization of Lorentz invariance in the Lagrangian. These results mean that, in any single-field model, primordial correlation functions of ζ are endowed with an SO(4,1) symmetry, with dilations and special conformal transformations non-linearly realized by ζ. We also verify the conformal consistency relations for any n-point function in models with a modulation of the inflaton potential, where the scale dependence is not negligible. Finally, we generalize (some of) the consistency relations involving tensors and soft internal momenta
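The leading squeezed-limit behaviour discussed above can be written schematically as (a standard form of the single-field consistency relation, not quoted verbatim from the paper; primes denote correlators with the momentum-conserving delta function stripped):

```latex
\lim_{\vec q \to 0}
  \frac{\langle \zeta_{\vec q}\, \zeta_{\vec k_1} \cdots \zeta_{\vec k_n} \rangle'}{P_\zeta(q)}
  = -\Bigl( 3(n-1) + \sum_{i=1}^{n} \vec k_i \cdot \frac{\partial}{\partial \vec k_i} \Bigr)
    \langle \zeta_{\vec k_1} \cdots \zeta_{\vec k_n} \rangle' + \mathcal{O}(q)
```

Since P_ζ(q) ∝ 1/q^3, the right-hand side reproduces the 1/q^3 term; the O(q) correction (the 1/q^2 term of the abstract) is fixed by the variation of the n-point function under a special conformal transformation. For n = 2 the dilation operator acting on P(k) ∝ k^{n_s−4} gives back the familiar −(n_s − 1)P(k) result.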

  11. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-10-01

    evaluation design about consistent time periods to analyze).

  12. Essays on market design and strategic behaviour in energy markets

    International Nuclear Information System (INIS)

    Lorenczik, Stefan

    2017-01-01

The thesis at hand consists of four essays which are divided into two parts. In the first part, consisting of the first two essays, market design issues in electricity markets are discussed. More precisely, it deals with concerns regarding security of supply: first, concerns regarding the availability of sufficient flexibility to cope with intermittent renewable electricity generation; and second, the consequences of insufficient investment signals in energy-only markets in interconnected electricity markets. Part two deals with strategic behaviour in spatial natural resource markets. Strategic behaviour and the exertion of market power have always been a matter of concern in energy markets, especially in natural resource markets. The exertion of market power can result in deadweight losses - regulatory bodies try to address this by market regulations aiming for a welfare-maximising market outcome. The first problem is to detect collusive behaviour, as available data are frequently limited. The second question is how regulatory decisions may influence the market outcome. Both topics are investigated using the example of the international metallurgical coal market.

  13. Essays on market design and strategic behaviour in energy markets

    Energy Technology Data Exchange (ETDEWEB)

    Lorenczik, Stefan

    2017-11-13

The thesis at hand consists of four essays which are divided into two parts. In the first part, consisting of the first two essays, market design issues in electricity markets are discussed. More precisely, it deals with concerns regarding security of supply: first, concerns regarding the availability of sufficient flexibility to cope with intermittent renewable electricity generation; and second, the consequences of insufficient investment signals in energy-only markets in interconnected electricity markets. Part two deals with strategic behaviour in spatial natural resource markets. Strategic behaviour and the exertion of market power have always been a matter of concern in energy markets, especially in natural resource markets. The exertion of market power can result in deadweight losses - regulatory bodies try to address this by market regulations aiming for a welfare-maximising market outcome. The first problem is to detect collusive behaviour, as available data are frequently limited. The second question is how regulatory decisions may influence the market outcome. Both topics are investigated using the example of the international metallurgical coal market.

  14. Designing the Urban Microclimate. A framework for a design-decision support tool for the dissemination of knowledge on the urban microclimate to the urban design process

    Directory of Open Access Journals (Sweden)

    Marjolein Pijpers-van Esch

    2015-06-01

This doctoral thesis presents research on the integration and transfer of knowledge from the specialized field of urban microclimatology into the generic field of urban design. Both fields are studied in order to identify crosslinks and reveal gaps. The main research question is: How can the design of urban neighbourhoods contribute to microclimates that support physical well-being and what kind of information and form of presentation does the urban designer need in order to make design decisions regarding such urban microclimates? This question consists of two parts, which are addressed separately in the first two parts of the dissertation. Part 1 concerns an assessment of relevant knowledge on urban design by literature review, followed by a field study into the use of expert information in the urban design process. Part 2 discusses the influence of the urban environment on its microclimate and, consequently, the living quality of its inhabitants – both by means of literature review. Combined, Parts 1 and 2 serve as a basis for a framework for a design-decision support tool, which is discussed in Part 3. This tool is proposed as a means to integrate knowledge of the urban microclimate into the urban design process, bridging an observed gap. Urban design is concerned with shaping the physical environment to facilitate urban life in all its aspects. This is a complex task, which requires the integration and translation of different stakeholder interests into a proposition for the realization of physical-spatial constructs in the urban environment. Such a proposition comprises different planning elements in the following categories: spatial-functional organization, city plan, public space design and rules for architecture. During the design process, the urban designer has to deal with incomplete, often contradictory and/or changing constraints and quality demands as well as other uncertainties. He/she handles this complexity by

  15. Designing Communication Design

    DEFF Research Database (Denmark)

    Løvlie, Anders Sundnes

    2016-01-01

Innovating in the field of new media genres requires methods for producing designs that can succeed in being disseminated and used outside of design research labs. This article uses the author's experiences with the development of university courses in communication design to address the research question: How can we design courses to give students the competencies they need to work as designers of new media? Based on existing approaches from UX design and other fields, I present a model that has demonstrated its usefulness in the development of commercial products and services.

  16. Dispersion Differences and Consistency of Artificial Periodic Structures.

    Science.gov (United States)

    Cheng, Zhi-Bao; Lin, Wen-Kai; Shi, Zhi-Fei

    2017-10-01

Dispersion differences and consistency of artificial periodic structures, including phononic crystals, elastic metamaterials, and periodic structures composed of phononic crystals and elastic metamaterials, are investigated in this paper. By developing a K(ω) method, complex dispersion relations and group/phase velocity curves of both single-mechanism periodic structures and mixing-mechanism periodic structures are calculated first, from which the dispersion differences of artificial periodic structures are discussed. Then, based on a unified formulation, the dispersion consistency of artificial periodic structures is investigated. The correctness of the unified formulation is verified through a comprehensive comparison study. Mathematical derivations of the unified formulation for different artificial periodic structures are presented. Furthermore, the physical meaning of the unified formulation is discussed in the energy-state space.
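As a toy instance of solving for a complex wavenumber at a prescribed frequency, the K(ω) idea can be illustrated on a 1D monatomic mass-spring chain (an assumption for illustration; the paper treats far richer structures):

```python
import cmath

def k_of_omega(omega, m=1.0, K=1.0, a=1.0):
    # 1D monatomic mass-spring chain: omega^2 = (4K/m) sin^2(k a / 2),
    # i.e. cos(k a) = 1 - m omega^2 / (2 K). Solving for k at a given
    # frequency mirrors the K(omega) approach: inside the pass band
    # (omega < 2*sqrt(K/m)) k is real; above the cutoff it acquires an
    # imaginary part, i.e. the wave becomes evanescent.
    c = 1 - m * omega**2 / (2 * K)
    return cmath.acos(c) / a
```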

  17. Consistent Conformal Extensions of the Standard Model arXiv

    CERN Document Server

    Loebbert, Florian; Plefka, Jan

The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently been subject to renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_\text{s} \sim g^2_\text{g}$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge invariant results require the consistent scaling $\lambda_\text{s} \sim g^4_\text{g}$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.

  18. Surfactant modified clays’ consistency limits and contact angles

    Directory of Open Access Journals (Sweden)

    S Akbulut

    2012-07-01

This study aimed to prepare a surfactant modified clay (SMC) and to investigate the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants in order to modify its engineering properties. Seven surfactants (trimethylglycine, hydroxyethyl cellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethyl ammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when the zwitterionic, nonionic, and anionic surfactant percentage increased. However, cationic SMC was transformed from CH (high plasticity clay) to MH (high plasticity silt) class soils, according to the Unified Soil Classification System (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
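The CH/MH distinction mentioned above follows from a sample's position relative to the A-line on the plasticity chart; a simplified sketch (ignoring the CL-ML transition zone and other chart details):

```python
def uscs_fine_grained(ll, pl):
    # Simplified USCS classification of a fine-grained soil from its
    # liquid limit (LL) and plastic limit (PL).
    # A-line: PI = 0.73 * (LL - 20); LL >= 50 means high plasticity.
    pi = ll - pl                      # plasticity index
    a_line = 0.73 * (ll - 20)
    plasticity = 'H' if ll >= 50 else 'L'
    soil = 'C' if pi >= a_line else 'M'   # clay above A-line, silt below
    return soil + plasticity
```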

  19. Rotating D0-branes and consistent truncations of supergravity

    International Nuclear Information System (INIS)

    Anabalón, Andrés; Ortiz, Thomas; Samtleben, Henning

    2013-01-01

The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1)^4 truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S^8. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS_2 × M_8 geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.

  20. Substituting fields within the action: Consistency issues and some applications

    International Nuclear Information System (INIS)

    Pons, Josep M.

    2010-01-01

    In field theory, as well as in mechanics, the substitution of some fields in terms of other fields at the level of the action raises an issue of consistency with respect to the equations of motion. We discuss this issue and give an expression which neatly displays the difference between doing the substitution at the level of the Lagrangian or at the level of the equations of motion. Both operations do not commute in general. A very relevant exception is the case of auxiliary variables, which are discussed in detail together with some of their relevant applications. We discuss the conditions for the preservation of symmetries--Noether as well as non-Noether--under the reduction of degrees of freedom provided by the mechanism of substitution. We also examine how the gauge fixing procedures fit in our framework and give simple examples on the issue of consistency in this case.

  1. On the consistent effect histories approach to quantum mechanics

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

A formulation of the consistent histories approach to quantum mechanics in terms of generalized observables (POV measures) and effect operators is provided. The usual notion of "history" is generalized to the notion of "effect history". The space of effect histories carries the structure of a D-poset. Recent results of J. D. Maitland Wright imply that every decoherence functional defined for ordinary histories can be uniquely extended to a bi-additive decoherence functional on the space of effect histories. Omnès' logical interpretation is generalized to the present context. The result of this work considerably generalizes and simplifies the earlier formulation of the consistent effect histories approach to quantum mechanics communicated in a previous work of this author. © 1996 American Institute of Physics

  2. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality and appendices of government and non-governmental human factors standards.

  3. Quantitative verification of ab initio self-consistent laser theory.

    Science.gov (United States)

    Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E

    2008-10-13

We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly-varying envelope approximation. The theory is infinite order in the non-linear hole-burning interaction; the widely used third-order approximation is shown to fail badly.

  4. Self-consistent studies of magnetic thin film Ni (001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional formalism (LSDF) is lacking. We discuss our recent extension of the (LSDF) approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical basis set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations

  5. Self-consistent equilibria in the pulsar magnetosphere

    International Nuclear Information System (INIS)

    Endean, V.G.

    1976-01-01

For a 'collisionless' pulsar magnetosphere the self-consistent equilibrium particle distribution functions are functions of the constants of the motion only. Reasons are given for concluding that to a good approximation they will be functions of the rotating frame Hamiltonian only. This is shown to result in a rigid rotation of the plasma, which therefore becomes trapped inside the velocity of light cylinder. The self-consistent field equations are derived, and a method of solving them is illustrated. The axial component of the magnetic field decays to zero at the plasma boundary. In practice, some streaming of particles into the wind zone may occur as a second-order effect. Acceleration of such particles to very high energies is expected when they approach the velocity of light cylinder, but they cannot be accelerated to very high energies near the star. (author)
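The velocity-of-light cylinder that confines the rigidly rotating plasma has radius R_lc = c/Ω = cP/(2π); a one-line computation (hypothetical helper name, for illustration only):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def light_cylinder_radius(period_s):
    # R_lc = c / Omega = c * P / (2 * pi): beyond this radius rigid
    # corotation with the star would exceed the speed of light, which
    # is why the corotating plasma is trapped inside it.
    return C * period_s / (2 * math.pi)
```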

  6. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space

  7. Rotating D0-branes and consistent truncations of supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Anabalón, Andrés [Departamento de Ciencias, Facultad de Artes Liberales, Facultad de Ingeniería y Ciencias, Universidad Adolfo Ibáñez, Av. Padre Hurtado 750, Viña del Mar (Chile); Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France); Ortiz, Thomas; Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France)

    2013-12-18

    The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1){sup 4} truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S{sup 8}. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS{sub 2}×M{sub 8} geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.

  8. Full self-consistency versus quasiparticle self-consistency in diagrammatic approaches: exactly solvable two-site Hubbard model.

    Science.gov (United States)

    Kutepov, A L

    2015-08-12

Self-consistent solutions of Hedin's equations (HE) for the two-site Hubbard model (HM) have been studied. They have been found for three-point vertices of increasing complexity (Γ = 1 (GW approximation), Γ1 from first-order perturbation theory, and the exact vertex Γ(E)). Comparison is made between the cases when an additional quasiparticle (QP) approximation for Green's functions is applied during the self-consistent iterative solution of HE and when the QP approximation is not applied. The results obtained with the exact vertex bear directly on the present open question of which approximation is more advantageous for future implementations, GW + DMFT or QPGW + DMFT. It is shown that in a regime of strong correlations only the originally proposed GW + DMFT scheme is able to provide reliable results. Vertex corrections based on perturbation theory (PT) systematically improve the GW results when full self-consistency is applied. The application of QP self-consistency combined with PT vertex corrections shows problems similar to those of the exact vertex combined with QP self-consistency. An analysis of Ward identity violation is performed for all approximations studied in this work, and its relation to the overall accuracy of the schemes is discussed.
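The two-site Hubbard model is exactly solvable, which is what makes it a useful benchmark here; at half filling the singlet ground-state energy is E0 = (U − √(U² + 16t²))/2 (a standard textbook result, not quoted in the abstract):

```python
import math

def e0_two_site_hubbard(t, U):
    # Exact ground-state energy of the half-filled two-site Hubbard
    # model with hopping t and on-site repulsion U: the exact reference
    # against which approximate Hedin-equation solutions are judged.
    return (U - math.sqrt(U * U + 16 * t * t)) / 2
```

In the non-interacting limit this reduces to −2t (both electrons in the bonding orbital), and for large U it approaches the superexchange scale −4t²/U.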

  9. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  10. Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2012-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  11. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  12. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    Science.gov (United States)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

When designing a time management algorithm for DVEs, researchers are often distracted by having to implement the trivial but fundamental details of simulation and verification, which makes the work inefficient. A platform that already realizes these details is therefore desirable; however, to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the implementation of the platform incurs only a small overhead, and that its efficiency lets researchers focus solely on improving their algorithm designs.
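The difference between two of the ordering services compared on the platform can be caricatured in a few lines (a toy model: RO delivers in arrival order, TSO in timestamp order; real HLA time management also involves lookahead, time advance grants and causality handling):

```python
def deliver(events, service):
    # events: list of (arrival_index, timestamp) tuples.
    # 'RO'  (receive order)    delivers in arrival order;
    # 'TSO' (time stamp order) sorts by timestamp, ties broken
    # by arrival order.
    if service == 'RO':
        return sorted(events, key=lambda e: e[0])
    if service == 'TSO':
        return sorted(events, key=lambda e: (e[1], e[0]))
    raise ValueError(f'unknown service: {service}')
```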

  13. Consistency of eye movements in MOT using horizontally flipped trials

    Czech Academy of Sciences Publication Activity Database

    Děchtěrenko, F.; Lukavský, Jiří

    2013-01-01

    Roč. 42, Suppl (2013), s. 42-42 ISSN 0301-0066. [36th European Conference on Visual Perception. 25.08.2013.-29.08.2013, Brémy] R&D Projects: GA ČR GA13-28709S Institutional support: RVO:68081740 Keywords : eye movements * symmetry * consistency Subject RIV: AN - Psychology http://www.ecvp.uni-bremen.de/~ecvpprog/abstract164.html

  14. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework based purely on the sensor physics and on prior assumptions about the fused image. The contributions of this paper are twofold. Firstly, a method is proposed for ensuring 100% spectral consistency..., even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing....

  15. Monetary Poverty, Material Deprivation and Consistent Poverty in Portugal

    OpenAIRE

    Carlos Farinha Rodrigues; Isabel Andrade

    2012-01-01

    In this paper we use the Portuguese component of the European Union Statistics on Income and Living Conditions (EU-SILC) to develop a measure of consistent poverty in Portugal. It is widely agreed that being poor does not simply mean not having enough monetary resources. It also reflects a lack of access to the resources required to enjoy a minimum standard of living and participation in the society one belongs to. The coexistence of material deprivation and monetary poverty leads ...

  16. Consistency requirements on Δ contributions to the NN potential

    International Nuclear Information System (INIS)

    Rinat, A.S.

    1982-04-01

    We discuss theories leading to intermediate state NΔ and ΔΔ contributions to Vsub(NN). We focus on the customary addition of Lsub(ΔNπ)' to Lsub(πNN)' in a conventional field theory and argue that overcounting of contributions to tsub(πN) and Vsub(NN) will be the rule. We then discuss the cloudy bag model where a similar interaction naturally arises and which leads to a consistent theory. (author)

  17. Quark mean field theory and consistency with nuclear matter

    International Nuclear Information System (INIS)

    Dey, J.; Tomio, L.; Dey, M.; Frederico, T.

    1989-01-01

    The 1/N_c expansion in QCD (with N_c the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, M_N, m_σ, m_ω, are found to scale with density. The equations are solved self-consistently. (author)

  18. Self-consistent T-matrix theory of superconductivity

    Czech Academy of Sciences Publication Activity Database

    Šopík, B.; Lipavský, Pavel; Männel, M.; Morawetz, K.; Matlock, P.

    2011-01-01

    Roč. 84, č. 9 (2011), 094529/1-094529/13 ISSN 1098-0121 R&D Projects: GA ČR GAP204/10/0212; GA ČR(CZ) GAP204/11/0015 Institutional research plan: CEZ:AV0Z10100521 Keywords : superconductivity * T-matrix * superconducting gap * restricted self-consistency Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 3.691, year: 2011

  19. The consistent histories interpretation of quantum fields in curved spacetime

    International Nuclear Information System (INIS)

    Blencowe, M.

    1991-01-01

    As an initial attempt to address some of the foundational problems of quantum mechanics, the author formulates the consistent histories interpretation of quantum field theory on a globally hyperbolic curved spacetime. He then constructs quasiclassical histories for a free, massive scalar field. In the final part, he points out the shortcomings of the theory and conjectures that one must take into account the fact that gravity is quantized in order to overcome them.

  20. EVALUATION OF CONSISTENCY AND SETTING TIME OF IRANIAN DENTAL STONES

    Directory of Open Access Journals (Sweden)

    F GOL BIDI

    2000-09-01

    Full Text Available Introduction. Dental stones are widely used in dentistry, and the success or failure of many dental treatments depends on the accuracy of these gypsum products. The purpose of this study was the evaluation of Iranian dental stones and a comparison between Iranian and foreign ones. In this investigation, consistency and setting time were compared between Pars Dandan, Almas and Hinrizit stones; the latter is accepted by the ADA (American Dental Association). Consistency and setting time are 2 of the 5 properties required by both ADA specification No. 25 and Iranian Standard Organization specification No. 2569 for the evaluation of dental stones. Methods. In this study, the number and preparation of specimens and the test conditions followed ADA specification No. 25, and all measurements were made with a Vicat apparatus. Results. The results of this study showed that the standard consistency of Almas stone was obtained with 42 ml water per 100 g powder, and the setting time of this stone was 11±0.03 min, which was within the limits of the ADA specification (12±4 min). The standard consistency of Pars Dandan stone was obtained with 31 ml water per 100 g powder, but the setting time of this stone was 5±0.16 min, which was not within the limits of the ADA specification. Discussion. Comparison of the properties of the Iranian and Hinrizit stones suggests two probable problems with the Iranian stones: 1. Non-homogeneity of the Iranian stone powder, caused by uncontrolled temperature, pressure and humidity in the production process. 2. Impurities such as sodium chloride, responsible for the shortening of Pars Dandan's setting time.

  1. Modeling a Consistent Behavior of PLC-Sensors

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2014-01-01

    Full Text Available The article extends a cycle of papers dedicated to the programming and verification of PLC programs by LTL specification. This approach makes correctness analysis of PLC programs available through the model checking method. The model checking method requires the construction of a finite model of a PLC program. For successful verification of the required properties it is important to take into consideration that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact demands extra care in the construction of the PLC program model. In this paper we propose to describe the consistent behavior of sensors by three groups of LTL formulas. They will affect the program model, bringing it closer to the actual behavior of the PLC program. The idea of the LTL requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC program model, this approach to modeling the consistent behavior of PLC sensors allows one to focus on modeling precisely these reactions without extending the program model by additional structures that realize realistic sensor behavior. The consistent behavior of sensors is taken into account only at the stage of checking the conformity of the program model to the required properties, i.e. a property is proved for the constructed model under the condition that the model contains only those executions of the program that comply with the consistent behavior of the sensors.
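The restriction of a program model to consistent sensor readings can be sketched in a few lines. This is a hedged illustration with two hypothetical, mutually exclusive limit switches, not the paper's LTL formulas: the consistency constraint simply prunes impossible input combinations from the state space before any property is checked.

```python
from itertools import product

# Hypothetical example (not from the paper): two limit switches on one
# actuator, "retracted" and "extended". A consistent plant can never
# report both switches pressed at once, so states violating that
# constraint are pruned from the model before properties are checked.

def consistent(state):
    retracted, extended = state
    return not (retracted and extended)   # mutually exclusive sensors

# Full Boolean state space of the two inputs ...
all_states = list(product([False, True], repeat=2))
# ... restricted to executions that respect the sensor constraint.
model_states = [s for s in all_states if consistent(s)]

# A property need only hold on the restricted model: here, the safety
# property "never both switches active" (trivially true by construction).
holds = all(not (r and e) for r, e in model_states)
print(len(all_states), len(model_states), holds)
```

In a real verification flow the pruning predicate would be derived from the LTL formulas describing the sensors, not hand-written per state.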

  2. Designing the Object Game

    DEFF Research Database (Denmark)

    Filip, Diane; Lindegaard, Hanne

    2016-01-01

    The Object Game is an exploratory design game and an experiment in developing a tangible object that can spark dialogue and retrospection between collaborative partners and act as a boundary object. The objective of this article is to show and elaborate on the development of the Object Game..., and to provide case examples of the game in action. The Object Game has two parts, Story-building and Co-rating of objects, with the aim of stimulating a collaborative reflection on knowledge sharing with different objects. In Story-building, the participants visualize their knowledge sharing process... these facilitated knowledge transfer, knowledge exchange, knowledge generation, and knowledge integration. The participants collaboratively reflected on their use of different objects for knowledge sharing and learned which objects have been effective (and which have not been effective) in their collaborative...

  3. Temporal and contextual consistency of leadership in homing pigeon flocks.

    Directory of Open Access Journals (Sweden)

    Carlos D Santos

    Full Text Available Organized flight of homing pigeons (Columba livia was previously shown to rely on simple leadership rules between flock mates, yet the stability of this social structuring over time and across different contexts remains unclear. We quantified the repeatability of leadership-based flock structures within a flight and across multiple flights conducted with the same animals. We compared two contexts of flock composition: flocks of birds of the same age and flight experience; and, flocks of birds of different ages and flight experience. All flocks displayed consistent leadership-based structures over time, showing that individuals have stable roles in the navigational decisions of the flock. However, flocks of balanced age and flight experience exhibited reduced leadership stability, indicating that these factors promote flock structuring. Our study empirically demonstrates that leadership and followership are consistent behaviours in homing pigeon flocks, but such consistency is affected by the heterogeneity of individual flight experiences and/or age. Similar evidence from other species suggests leadership as an important mechanism for coordinated motion in small groups of animals with strong social bonds.

  4. Consistency checks in beam emission modeling for neutral beam injectors

    International Nuclear Information System (INIS)

    Punyapu, Bharathi; Vattipalle, Prahlad; Sharma, Sanjeev Kumar; Baruah, Ujjwal Kumar; Crowley, Brendan

    2015-01-01

    In positive-ion neutral beam systems, beam parameters such as ion species fractions, power fractions and beam divergence are routinely measured using the Doppler-shifted beam emission spectrum. The accuracy with which these parameters are estimated depends on the accuracy of the atomic modeling involved in these estimations. In this work, an effective procedure to check the consistency of the beam emission modeling in neutral beam injectors is proposed. As a first consistency check, at constant beam voltage and current, the intensity of the beam emission spectrum is measured while varying the pressure in the neutralizer, and the scaling with pressure of the measured un-shifted (target) and Doppler-shifted (projectile) intensities is studied. If the un-shifted component scales with pressure, then the intensity of this component is used as a second consistency check on the beam emission modeling. As a further check, the modeled beam fractions and the emission cross sections of projectile and target are used to predict the intensity of the un-shifted component, which is then compared with the measured target intensity. The agreement between the predicted and measured target intensities indicates the degree of discrepancy in the beam emission modeling. To test this methodology, a systematic analysis of Doppler shift spectroscopy data obtained on the JET neutral beam test stand was carried out.

  5. A dynamical mechanism for large volumes with consistent couplings

    Energy Technology Data Exchange (ETDEWEB)

    Abel, Steven [IPPP, Durham University,Durham, DH1 3LE (United Kingdom)

    2016-11-14

    A mechanism for addressing the “decompactification problem” is proposed, which consists of balancing the vacuum energy in Scherk-Schwarz compactified theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N=2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.

  6. Consistency of variables in PCS and JASTRO great area database

    International Nuclear Information System (INIS)

    Nishino, Tomohiro; Teshima, Teruki; Abe, Mitsuyuki

    1998-01-01

    To examine whether the Patterns of Care Study (PCS) reflects the data for the major areas in Japan, the consistency of variables in the PCS and in the major area database of the Japanese Society for Therapeutic Radiology and Oncology (JASTRO) was compared. Patients with esophageal or uterine cervical cancer were sampled from the PCS and JASTRO databases. From the JASTRO database, 147 patients with esophageal cancer and 95 patients with uterine cervical cancer were selected according to the eligibility criteria for the PCS. From the PCS, 455 esophageal and 432 uterine cervical cancer patients were surveyed. Six items for esophageal cancer and five items for uterine cervical cancer were selected for a comparative analysis of the PCS and JASTRO databases. Esophageal cancer: age (p=.0777), combination of radiation and surgery (p=.2136), and energy of the external beam (p=.6400) were consistent between PCS and JASTRO. However, the dose of the external beam for the non-surgery group showed inconsistency (p=.0467). Uterine cervical cancer: age (p=.6301) and clinical stage (p=.8555) were consistent between the two sets of data. However, the energy of the external beam (p<.0001), the dose rate of brachytherapy (p<.0001), and brachytherapy utilization by clinical stage (p<.0001) showed inconsistencies. It appears possible that the JASTRO major area database could not account for all patients' backgrounds and factors and that both surveys might have an imbalance in the stratification of institutions, including differences in equipment and staffing patterns. (author)
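The consistency judgments above rest on chi-square comparisons of categorical variables between the two databases. As a hedged sketch (the counts below are invented, not the survey data), a plain chi-square test of homogeneity for a 2x2 contingency table can be computed directly:

```python
# Chi-square statistic for a 2x2 table: sum over cells of
# (observed - expected)^2 / expected, with expected counts from the
# row and column margins. Counts are hypothetical, for illustration.

def chi_square_2x2(table):
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# e.g. external-beam dose above/below some cutoff in two hypothetical samples
stat = chi_square_2x2([[30, 70], [45, 55]])
print(round(stat, 3), stat > 3.841)   # 3.841 is the 5% critical value for df=1
```

A statistic above the critical value would flag the variable as inconsistent between the two samples at the 5% level, as with the brachytherapy items above.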

  7. Self-assessment: Strategy for higher standards, consistency, and performance

    International Nuclear Information System (INIS)

    Ide, W.E.

    1996-01-01

    In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions were weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities. In addition, their morale improved. Crew teamwork improved, as did the quality of shift briefs. There was a decrease in the noise level and in administrative functions in the control room. The use of self-assessment helped to anchor and define higher and more consistent standards. The proof of Palo Verde's success was evident when an Institute of Nuclear Power Operations finding was turned into a strength within 1 yr.

  8. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme is presented, called double topological relationship consistency (DCTR). The combination of double topological configurations includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown in which the two cameras were located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, possibly the most widely adopted method. With this method, we can obtain correspondences with high precision in wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.
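The RANSAC estimator mentioned above can be illustrated with a minimal sketch: fitting a line to synthetic 2-D points contaminated with gross outliers. All data and thresholds below are invented for illustration; recovering epipolar geometry uses the same hypothesize-and-verify loop with a fundamental-matrix model in place of the line.

```python
import random

random.seed(0)
# Synthetic data (hypothetical): points near y = 2x + 1 plus gross
# outliers, mimicking mismatches among stereo correspondences.
inliers = [(x, 2 * x + 1 + random.uniform(-0.05, 0.05)) for x in range(20)]
outliers = [(random.uniform(0, 20), random.uniform(-50, 50)) for _ in range(8)]
points = inliers + outliers

def fit(p, q):
    """Line through two points, as (slope, intercept)."""
    (x1, y1), (x2, y2) = p, q
    a = (y2 - y1) / (x2 - x1)
    return a, y1 - a * x1

best_support = []
for _ in range(200):                      # random minimal-sample hypotheses
    p, q = random.sample(points, 2)
    if p[0] == q[0]:
        continue                          # vertical line, skip hypothesis
    a, b = fit(p, q)
    support = [pt for pt in points if abs(pt[1] - (a * pt[0] + b)) < 0.2]
    if len(support) > len(best_support):
        best_support = support            # keep the largest consensus set

print(len(best_support))   # typically recovers most of the 20 true inliers
```

The consensus set left at the end plays the role of the verified matches; the outliers (mismatches) are simply never supported by a good hypothesis.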

  9. Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control

    Directory of Open Access Journals (Sweden)

    Y.A. Ahmed

    2015-09-01

    Full Text Available In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. Such consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution outputs. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After obtaining acceptable percentages of success, the trained networks are implemented in the free-running experiment system to judge their real-time response for the Esso Osaka 3-m model ship. The networks' behaviour during such experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance is also discussed for the final alignment of the ship with the actual pier.

  10. The study of consistent properties of gelatinous shampoo with minoxidil

    Directory of Open Access Journals (Sweden)

    I. V. Gnitko

    2016-04-01

    Full Text Available The aim of the work is the study of the consistency properties of a gelatinous shampoo with minoxidil 1% for the complex therapy and prevention of alopecia. This shampoo with minoxidil was selected on the basis of complex physical-chemical, biopharmaceutical and microbiological investigations. Methods and results. It has been established that the consistency properties of the gelatinous minoxidil 1% shampoo and its «mechanical stability» (1.70) characterize the formulation as an exceptionally thixotropic composition, able to recover after mechanical loads. This fact also makes it possible to predict the stability of the consistency properties during long storage. Conclusion. The dynamic flow factors for the foam detergent gel with minoxidil (Kd1=38.9%; Kd2=78.06%) quantitatively confirm a sufficient degree of distribution when the composition is spread on the skin surface of the hairy part of the head and during the technological operations of manufacturing. The insignificant difference in «mechanical stability» between the gelatinous minoxidil 1% shampoo and its base indicates the absence of interactions between the active substance and the base.

  11. Consistent Kaluza-Klein truncations via exceptional field theory

    Energy Technology Data Exchange (ETDEWEB)

    Hohm, Olaf [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States); Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS,École Normale Supérieure de Lyon, 46, allée d’Italie, F-69364 Lyon cedex 07 (France)

    2015-01-26

    We present the generalized Scherk-Schwarz reduction ansatz for the full supersymmetric exceptional field theory in terms of group valued twist matrices subject to consistency equations. With this ansatz the field equations precisely reduce to those of lower-dimensional gauged supergravity parametrized by an embedding tensor. We explicitly construct a family of twist matrices as solutions of the consistency equations. They induce gauged supergravities with gauge groups SO(p,q) and CSO(p,q,r). Geometrically, they describe compactifications on internal spaces given by spheres and (warped) hyperboloids H{sup p,q}, thus extending the applicability of generalized Scherk-Schwarz reductions beyond homogeneous spaces. Together with the dictionary that relates exceptional field theory to D=11 and IIB supergravity, respectively, the construction defines an entire new family of consistent truncations of the original theories. These include not only compactifications on spheres of different dimensions (such as AdS{sub 5}×S{sup 5}), but also various hyperboloid compactifications giving rise to a higher-dimensional embedding of supergravities with non-compact and non-semisimple gauge groups.

  12. Marginal Consistency: Upper-Bounding Partition Functions over Commutative Semirings.

    Science.gov (United States)

    Werner, Tomás

    2015-07-01

    Many inference tasks in pattern recognition and artificial intelligence lead to partition functions in which addition and multiplication are abstract binary operations forming a commutative semiring. By generalizing max-sum diffusion (one of the convergent message passing algorithms for approximate MAP inference in graphical models), we propose an iterative algorithm to upper bound such partition functions over commutative semirings. The iteration of the algorithm is remarkably simple: change any two factors of the partition function such that their product remains the same and their overlapping marginals become equal. In many commutative semirings, repeating this iteration for different pairs of factors converges to a fixed point in which the overlapping marginals of every pair of factors coincide. We call this state marginal consistency. During the iteration, an upper bound on the partition function monotonically decreases. This abstract algorithm unifies several existing algorithms, including max-sum diffusion and basic constraint propagation (or local consistency) algorithms in constraint programming. We further construct a hierarchy of marginal consistencies of increasingly higher levels and show that any such level can be enforced by adding identity factors of higher arity (order). Finally, we discuss instances of the framework for several semirings, including the distributive lattice and the max-sum and sum-product semirings.
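The iteration described above can be sketched for the smallest possible case: two factors over binary variables in the sum-product semiring (factor values invented for illustration). A single update equalizes the overlapping marginals while leaving the pointwise product, and hence the partition function, exactly unchanged:

```python
import math

# Two factors f(x,y) and g(y,z) sharing the variable y. Rescaling
# f up and g down by the same per-y factor preserves f*g pointwise.
f = [[1.0, 2.0], [3.0, 4.0]]   # f[x][y], hypothetical values
g = [[5.0, 1.0], [2.0, 2.0]]   # g[y][z], hypothetical values

def partition(f, g):
    return sum(f[x][y] * g[y][z]
               for x in range(2) for y in range(2) for z in range(2))

Z_before = partition(f, g)

# One update of the factor pair (with more factors one iterates over pairs).
for y in range(2):
    m_f = f[0][y] + f[1][y]            # marginal of f over the shared y
    m_g = g[y][0] + g[y][1]            # marginal of g over the shared y
    s = math.sqrt(m_g / m_f)
    for x in range(2):
        f[x][y] *= s                   # scale f up ...
    for z in range(2):
        g[y][z] /= s                   # ... and g down by the same factor

m_f = [f[0][y] + f[1][y] for y in range(2)]
m_g = [g[y][0] + g[y][1] for y in range(2)]
print(m_f, m_g, abs(partition(f, g) - Z_before) < 1e-9)
```

For a single pair sharing one variable the marginals coincide after one update (both become the geometric mean of the two originals); with many factors the same update is repeated over pairs until the marginal-consistency fixed point is reached.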

  13. Consistency relation in power law G-inflation

    International Nuclear Information System (INIS)

    Unnikrishnan, Sanil; Shankaranarayanan, S.

    2014-01-01

    In the standard inflationary scenario based on a minimally coupled scalar field, canonical or non-canonical, the subluminal propagation speed of scalar perturbations ensures the following consistency relation: r ≤ −8n_T, where r is the tensor-to-scalar ratio and n_T is the spectral index of tensor perturbations. However, recently it has been demonstrated that this consistency relation can be violated in Galilean inflation models even in the absence of superluminal propagation of scalar perturbations. It is therefore interesting to investigate whether the subluminal propagation of scalar field perturbations imposes any bound on the ratio r/|n_T| in G-inflation models. In this paper, we derive the consistency relation for a class of G-inflation models that lead to power law inflation. Within this class of models, it turns out that one can have r > −8n_T or r ≤ −8n_T depending on the model parameters. However, the subluminal propagation speed of scalar field perturbations, as required by causality, restricts r ≤ −(32/3) n_T
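For orientation, the standard single-field relations r = 16 ε c_s and n_T = −2ε (the textbook k-inflation result, not the modified G-inflation relation derived in the paper) show how subluminality produces the bound: r/|n_T| = 8 c_s, so c_s ≤ 1 gives r ≤ −8n_T. A quick numerical check:

```python
# Standard single-field (k-inflation) slow-roll expressions, used here
# only to illustrate the consistency relation; G-inflation modifies them.
#     r   = 16 * eps * c_s      (tensor-to-scalar ratio)
#     n_T = -2 * eps            (tensor spectral index)
# hence r / |n_T| = 8 * c_s, and c_s <= 1 implies r <= -8 * n_T.

def tensor_observables(eps, c_s):
    r = 16.0 * eps * c_s
    n_T = -2.0 * eps
    return r, n_T

for c_s in (0.1, 0.5, 1.0):
    r, n_T = tensor_observables(0.01, c_s)
    print(c_s, r, -8 * n_T, r <= -8 * n_T)
```

The point of the paper is that in G-inflation the r/|n_T| ratio is no longer tied to c_s this simply, which is why the bound can shift to −(32/3) n_T.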

  14. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling , or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  15. Self-consistent average-atom scheme for electronic structure of hot and dense plasmas of mixture

    International Nuclear Information System (INIS)

    Yuan Jianmin

    2002-01-01

    An average-atom model is proposed to treat the electronic structures of hot and dense plasmas of mixtures. It is assumed that the electron density consists of two parts. The first one is a uniform distribution with a constant value, which is equal to the electron density at the boundaries between the atoms. The second one is the total electron density minus the first constant distribution. The volume of each kind of atom is proportional to the sum of the charges of the second electron part and of the nucleus within each atomic sphere. In this way, electrical neutrality is ensured within each atomic sphere. Because the integration of the electron charge within each atom requires the size of that atom in advance, the calculation is carried out in the usual self-consistent way. The occupation numbers of electrons on the orbitals of each kind of atom are determined by the Fermi-Dirac distribution with the same chemical potential for all kinds of atoms. The wave functions and the orbital energies are calculated with the Dirac-Slater equations. As examples, the electronic structures of the mixture of Au and Cd, water (H2O), and CO2 at a few temperatures and densities are presented

  16. Self-consistent average-atom scheme for electronic structure of hot and dense plasmas of mixture.

    Science.gov (United States)

    Yuan, Jianmin

    2002-10-01

    An average-atom model is proposed to treat the electronic structures of hot and dense plasmas of mixtures. It is assumed that the electron density consists of two parts. The first one is a uniform distribution with a constant value, which is equal to the electron density at the boundaries between the atoms. The second one is the total electron density minus the first constant distribution. The volume of each kind of atom is proportional to the sum of the charges of the second electron part and of the nucleus within each atomic sphere. In this way, electrical neutrality is ensured within each atomic sphere. Because the integration of the electron charge within each atom requires the size of that atom in advance, the calculation is carried out in the usual self-consistent way. The occupation numbers of electrons on the orbitals of each kind of atom are determined by the Fermi-Dirac distribution with the same chemical potential for all kinds of atoms. The wave functions and the orbital energies are calculated with the Dirac-Slater equations. As examples, the electronic structures of the mixture of Au and Cd, water (H2O), and CO2 at a few temperatures and densities are presented.
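One ingredient of the scheme, Fermi-Dirac occupation numbers determined by a single chemical potential shared by all atom kinds, can be sketched with a bisection on μ. The orbital energies, degeneracies, temperature and electron count below are invented for illustration; a real calculation takes them from the Dirac-Slater equations.

```python
import math

kT = 1.0                                   # temperature in energy units (assumed)
orbitals = {                               # (energy, degeneracy) per atom kind,
    "A": [(-5.0, 2), (-1.0, 6)],           # hypothetical values
    "B": [(-3.0, 2), (-0.5, 10)],
}
target_electrons = 10.0                    # total electrons over both atom kinds

def occupation(mu):
    """Total Fermi-Dirac occupation for a shared chemical potential mu."""
    return sum(g / (1.0 + math.exp((e - mu) / kT))
               for orbs in orbitals.values() for e, g in orbs)

# occupation(mu) increases monotonically with mu, so bisect for the mu
# at which the summed occupations match the electron count.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if occupation(mid) < target_electrons:
        lo = mid
    else:
        hi = mid
mu = 0.5 * (lo + hi)
print(round(occupation(mu), 6))
```

In the full scheme this μ search sits inside the self-consistency loop, alternating with the recomputation of orbitals and atomic-sphere sizes.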

  17. Macro-environmental policy: Principles and design

    International Nuclear Information System (INIS)

    Huppes, G.

    1993-01-01

    The central theme of this book is how a macro-environmental policy can be developed which does not prescribe or suggest specific technologies and products but realizes the desired environmental quality by changing the general context. The publication is composed of four main parts. The framework for analysis and the normative principles for policy design and evaluation, the first two parts, form the analytic core. The framework for analysis gives a classification of instruments in terms of permutations of a limited number of defining elements. The normative principles guide choices in instrument design and, as the flexible response strategy, guide their application in specific policies. Detailing two main new instruments (the standard method for life cycle analysis and the substance deposit), and applying the instrument strategy as developed to the cases, make up the next two parts.

  18. Design, construction and testing of a radon experimental chamber

    International Nuclear Information System (INIS)

    Chavez B, A.; Balcazar G, M.

    1991-10-01

    To carry out studies of radon behavior under controlled and stable conditions, a system was designed and constructed that consists of two parts: a container of uranium-rich mineral and a radon experimentation chamber, joined to each other by a stop valve. The container holds approximately 800 g of uranium mineral with a grade of 0.28%; the radon gas emanating from the mineral is held tightly inside the container. When the valve is opened, the radon gas diffuses into the radon experimental chamber, which has 3 access ports that allow different types of detectors to be installed. The versatility of the system is exemplified with two experiments: 1. With the radon experimental chamber and an associated spectroscopic system, radon and two of its decay products are identified. 2. The design of the system allows the mineral container to be coupled to other experimental geometries; to demonstrate this, a new automatic exchanger system of passive radon detectors was coupled and tested. The results of the new automatic exchanger system, when radon is allowed to flow freely between the container and the automatic exchanger through a plastic membrane of 15 m, are shown. (Author)

  19. Sport fans: evaluating the consistency between implicit and explicit attitudes toward favorite and rival teams.

    Science.gov (United States)

    Wenger, Jay L; Brown, Roderick O

    2014-04-01

    Sport fans often foster very positive attitudes toward their favorite teams and less favorable attitudes toward opponents. The current research was designed to evaluate the consistency that might exist between implicit and explicit measures of those attitudes. College students (24 women, 16 men) performed a version of the Implicit Association Test (IAT) related to their favorite and rival teams. Participants also reported their attitudes toward these teams explicitly, via self-report instruments. When responding to the IAT, participants' responses were faster when they paired positive words with concepts related to favorite teams and negative words with rival teams, indicating implicit favorability toward favorite teams and implicit negativity toward rival teams. This pattern of implicit favorability and negativity was consistent with what participants reported explicitly via self-report. The importance of evaluating implicit attitudes and their consistency with explicit attitudes is discussed.

  20. The Bioenvironmental modeling of Bahar city based on Climate-consistent Architecture

    OpenAIRE

    Parna Kazemian

    2014-01-01

    The identification of the climate of a particular place and the analysis of the climatic needs in terms of human comfort and the use of construction materials is one of the prerequisites of a climate-consistent design. In studies on climate and weather, using illustrative reports, first a picture of the state of the climate is offered. Then, based on the obtained results, the range of changes is determined, and the cause-effect relationships at different scales are identified. Finally, by a general exam...

  1. Visual Design Principles: An Empirical Study of Design Lore

    Science.gov (United States)

    Kimball, Miles A.

    2013-01-01

    Many books, designers, and design educators talk about visual design principles such as balance, contrast, and alignment, but with little consistency. This study uses empirical methods to explore the lore surrounding design principles. The study took the form of two stages: a quantitative literature review to determine what design principles are…

  2. Dictionary-based fiber orientation estimation with improved spatial consistency.

    Science.gov (United States)

    Ye, Chuyang; Prince, Jerry L

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ1-norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that
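    The iterative alternating optimization strategy mentioned above can be illustrated on a toy problem. The sketch below is not FORNI+ (the objective, the coupling weight `lam`, and the closed-form updates are invented for illustration); it only shows the general device of alternately minimizing a coupled objective in one variable at a time:

```python
# Toy alternating minimization of f(x, y) = (x - 1)^2 + (y - 2)^2 + lam*(x - y)^2.
# Each sweep minimizes f exactly in one variable while holding the other fixed,
# mirroring how FORNI+ alternates between FOs and mixture fractions.
lam = 1.0
x, y = 0.0, 0.0
for _ in range(200):
    x = (1.0 + lam * y) / (1.0 + lam)   # argmin over x with y fixed
    y = (2.0 + lam * x) / (1.0 + lam)   # argmin over y with x fixed

# For lam = 1 the joint minimizer is (4/3, 5/3), which the iteration reaches.
```

    Because each update is an exact one-variable minimization, the objective is non-increasing at every step, which is the property that makes alternating schemes attractive for objectives like the one described in the abstract.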

  3. Nonlinear and self-consistent treatment of ECRH

    Energy Technology Data Exchange (ETDEWEB)

    Tsironis, C.; Vlahos, L.

    2005-07-01

    A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of the ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and an electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely cases where nonlinear effects, such as particle trapping in the wave field, are dominant in the particle phase space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)
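    The particle side of such a model amounts to integrating charged-particle equations of motion in electromagnetic fields. A minimal non-relativistic sketch using the standard Boris push is given below (field values, units, and step count are invented; the paper's model is relativistic and feeds the particle currents back into the wave equation, which is omitted here):

```python
import numpy as np

# Boris push for an electron gyrating in a uniform magnetic field B.
# With E = 0 the scheme is a pure rotation of v, so |v| (and hence the
# kinetic energy) is conserved exactly up to floating-point roundoff.
q_m = -1.0                       # charge-to-mass ratio, normalized units
B = np.array([0.0, 0.0, 1.0])    # uniform field along z
E = np.zeros(3)                  # no wave field in this sketch
dt = 0.05

v = np.array([1.0, 0.0, 0.0])
x = np.zeros(3)
speed0 = np.linalg.norm(v)

for _ in range(2000):
    v_minus = v + 0.5 * q_m * E * dt            # first half electric kick
    t = 0.5 * q_m * B * dt                      # rotation vector
    v_prime = v_minus + np.cross(v_minus, t)
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v = v_minus + np.cross(v_prime, s)          # magnetic rotation
    v = v + 0.5 * q_m * E * dt                  # second half electric kick
    x = x + v * dt
```

    The energy-conserving rotation is why Boris-type integrators are a common choice for long wave-particle interaction runs.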

  4. Nonlinear and self-consistent treatment of ECRH

    International Nuclear Information System (INIS)

    Tsironis, C.; Vlahos, L.

    2005-01-01

    A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of the ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and an electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely cases where nonlinear effects, such as particle trapping in the wave field, are dominant in the particle phase space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)

  5. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  6. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Directory of Open Access Journals (Sweden)

    Laura R. STEIN, Alison M. BELL

    2012-02-01

    There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45–52, 2012].
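    Consistent individual differences of the kind described above are commonly quantified as repeatability, the intraclass correlation estimated from a one-way ANOVA across individuals. A minimal sketch follows (the data matrix is invented for illustration, not the study's measurements; the estimator is the standard R = (MS_between − MS_within) / (MS_between + (k − 1) MS_within)):

```python
import numpy as np

# Repeatability (intraclass correlation) from one-way ANOVA.
# Rows = individuals, columns = repeated measures of a behaviour
# (e.g. total time fanning). Values below are invented.
data = np.array([
    [10.0, 11.0, 10.5],
    [25.0, 24.0, 26.0],
    [40.0, 41.5, 39.0],
    [15.0, 16.0, 14.5],
])
n, k = data.shape
grand = data.mean()
group_means = data.mean(axis=1)

ms_between = k * np.sum((group_means - grand) ** 2) / (n - 1)
ms_within = np.sum((data - group_means[:, None]) ** 2) / (n * (k - 1))

repeatability = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
# High repeatability: between-individual variance dominates within-individual noise.
```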

  7. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Institute of Scientific and Technical Information of China (English)

    Laura R. STEIN; Alison M. BELL

    2012-01-01

    There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45–52, 2012].

  8. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to different methodologies employed for various damage models in different countries, damage assessments cannot be directly compared with each other, obstructing also supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra
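    The depth-damage approach described above reduces to a simple lookup: interpolate a damage fraction at the flood water depth and scale it by a country-specific maximum damage value. A minimal sketch follows (the curve points and the maximum value are invented for illustration, not taken from the dataset):

```python
import numpy as np

# Depth-damage curve: damage fraction as a function of water depth,
# scaled by a country-level maximum damage value. Hypothetical values.
depths_m = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 6.0])        # water depth (m)
fractions = np.array([0.00, 0.25, 0.40, 0.60, 0.75, 1.00])  # damage fraction
max_damage_eur_m2 = 600.0   # hypothetical residential maximum (EUR per m^2)

def flood_damage(depth_m, area_m2):
    """Direct damage for a flooded asset, linear interpolation on the curve."""
    frac = np.interp(depth_m, depths_m, fractions)
    return frac * max_damage_eur_m2 * area_m2

damage = flood_damage(1.5, 100.0)   # 1.5 m of water over 100 m^2 of floor area
```

    Adjustments for local circumstances (urban vs. rural, building material) would, as the abstract notes, rescale the curve or the maximum value rather than change this computation.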

  9. Coagulation of Agglomerates Consisting of Polydisperse Primary Particles.

    Science.gov (United States)

    Goudeli, E; Eggersdorfer, M L; Pratsinis, S E

    2016-09-13

    The ballistic agglomeration of polydisperse particles is investigated by an event-driven (ED) method and compared to the coagulation of spherical particles and agglomerates consisting of monodisperse primary particles (PPs). It is shown for the first time to our knowledge that increasing the width or polydispersity of the PP size distribution initially accelerates the coagulation rate of their agglomerates but delays the attainment of their asymptotic fractal-like structure and self-preserving size distribution (SPSD) without altering them, provided that sufficiently large numbers of PPs are employed. For example, the standard asymptotic mass fractal dimension, Df, of 1.91 is attained when clusters are formed containing, on average, about 15 monodisperse PPs, consistent with fractal theory and the literature. In contrast, when polydisperse PPs with a geometric standard deviation of 3 are employed, about 500 PPs are needed to attain that Df. Even though the same asymptotic Df and mass-mobility exponent, Dfm, are attained regardless of PP polydispersity, the asymptotic prefactors or lacunarities of Df and Dfm increase with PP polydispersity. For monodisperse PPs, the average agglomerate radius of gyration, rg, becomes larger than the mobility radius, rm, when agglomerates consist of more than 15 PPs. Increasing PP polydispersity increases that number of PPs similarly to the above for the attainment of the asymptotic Df or Dfm. The agglomeration kinetics are quantified by the overall collision frequency function. When the SPSD is attained, the collision frequency is independent of PP polydispersity. Accounting for the SPSD polydispersity in the overall agglomerate collision frequency is in good agreement with that frequency from detailed ED simulations once the SPSD is reached. Most importantly, the coagulation of agglomerates is described well by a monodisperse model for agglomerate and PP sizes, whereas the detailed agglomerate size distribution can be obtained by
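    The fractal dimension discussed above enters through the scaling law N = kf (rg / rp)^Df relating the number of primary particles to the radius of gyration. A minimal sketch of recovering Df (and the prefactor kf) from a log-log fit follows, using synthetic data generated with the asymptotic Df = 1.91 quoted in the abstract (the rg values and kf are invented):

```python
import numpy as np

# Fractal scaling law for agglomerates: N = kf * (rg / rp)**Df, where
# N = number of primary particles, rg = radius of gyration, rp = primary
# particle radius, kf = prefactor (lacunarity), Df = mass fractal dimension.
rp = 1.0
kf_true, df_true = 1.3, 1.91           # illustrative prefactor; Df from abstract
rg = np.array([3.0, 5.0, 10.0, 20.0, 50.0, 100.0])
N = kf_true * (rg / rp) ** df_true      # synthetic, noise-free data

# Df and kf are the slope and exponentiated intercept of a log-log fit.
slope, intercept = np.polyfit(np.log(rg / rp), np.log(N), 1)
df_fit, kf_fit = slope, np.exp(intercept)
```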

  10. Near-resonant absorption in the time-dependent self-consistent field and multiconfigurational self-consistent field approximations

    DEFF Research Database (Denmark)

    Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa

    2001-01-01

    Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown to be convergent in the whole frequency region. This was achieved through the incorporation of phenomenological damping factors that lead to complex response function values.
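    The effect of such phenomenological damping can be seen in the standard sum-over-states form of a linear response function, where a damping factor γ keeps the response finite and complex at resonance. The sketch below uses invented excitation energies, oscillator strengths, and γ; it illustrates the generic damped-response form, not the MCSCF implementation of the paper:

```python
import numpy as np

# Damped sum-over-states linear response:
#   alpha(w) = sum_n f_n / (w_n**2 - w**2 - 1j * gamma * w)
# With gamma > 0, alpha is finite everywhere and acquires a large
# absorptive (imaginary) part near each resonance w = w_n.
w_n = np.array([0.3, 0.5, 0.9])    # excitation frequencies (a.u., invented)
f_n = np.array([0.8, 0.5, 0.2])    # oscillator strengths (invented)
gamma = 0.005                       # phenomenological damping (inverse lifetime)

def alpha(w):
    return np.sum(f_n / (w_n**2 - w**2 - 1j * gamma * w))

on_res = alpha(0.3)    # finite complex value, dominated by its imaginary part
off_res = alpha(0.1)   # nearly real far from all resonances
```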

  11. Self-consistent adjoint analysis for topology optimization of electromagnetic waves

    Science.gov (United States)

    Deng, Yongbo; Korvink, Jan G.

    2018-05-01

    In topology optimization of electromagnetic waves, the Gâteaux differentiability of the conjugate operator to the complex field variable results in the complexity of the adjoint sensitivity, which causes the originally real-valued design variable to become complex during the iterative solution procedure. The adjoint sensitivity is therefore self-inconsistent. To enforce self-consistency, the real-part operator has been used to extract the real part of the sensitivity and keep the design variable real-valued. However, this enforced self-consistency can make the derived structural topology depend unreasonably on the phase of the incident wave. To solve this problem, this article focuses on a self-consistent adjoint analysis of the topology optimization problems for electromagnetic waves. The analysis is implemented by splitting the complex variables of the wave equations into their real and imaginary parts, substituting the split variables into the wave equations, and deriving coupled equations equivalent to the original wave equations, where the infinite free space is truncated by perfectly matched layers. The topology optimization problems of electromagnetic waves are thereby transformed into forms defined on real instead of complex functional spaces; the adjoint analysis is carried out on real functional spaces, removing the variation of the conjugate operator; the self-consistent adjoint sensitivity is derived, and the phase-dependence problem is avoided for the derived structural topology. Several numerical examples demonstrate the robustness of the derived self-consistent adjoint analysis.
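    The real/imaginary splitting used above is the standard device of rewriting a complex linear problem (A_r + i A_i) z = b as an equivalent real block system. A minimal sketch follows (the matrices are random stand-ins, not a discretized wave operator):

```python
import numpy as np

# Splitting (A_r + i*A_i)(z_r + i*z_i) = b_r + i*b_i into the real system
#   [A_r  -A_i] [z_r]   [b_r]
#   [A_i   A_r] [z_i] = [b_i]
# which is the same trick the article applies to the wave equations.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

Ar, Ai = A.real, A.imag
big = np.block([[Ar, -Ai], [Ai, Ar]])        # real 2n x 2n block operator
rhs = np.concatenate([b.real, b.imag])

sol = np.linalg.solve(big, rhs)
z_split = sol[:n] + 1j * sol[n:]

z_direct = np.linalg.solve(A, b)             # reference complex solve
```

    Both routes give the same solution; the real formulation is what allows the adjoint analysis to proceed on real functional spaces without the conjugate operator.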

  12. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer are being developed in a software program that can process ADCP moving-boat discharge measurements regardless of the ADCP used to collect the data.
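    The core of any such discharge computation is integrating velocity times depth across the channel. A minimal mid-section sketch is shown below (station positions, depths, and velocities are invented; real ADCP processing adds edge-discharge estimates and invalid-data screening, exactly the steps the abstract says vary among manufacturers):

```python
# Mid-section discharge: each vertical contributes Q_i = v_i * d_i * w_i,
# where w_i spans half the distance to each neighbouring vertical.
stations = [0.0, 2.0, 4.0, 6.0, 8.0]     # distance across channel (m)
depths   = [0.5, 1.2, 1.8, 1.1, 0.4]     # water depth at each vertical (m)
speeds   = [0.2, 0.6, 0.9, 0.5, 0.1]     # mean velocity at each vertical (m/s)

discharge = 0.0
for i in range(len(stations)):
    left = stations[max(i - 1, 0)]
    right = stations[min(i + 1, len(stations) - 1)]
    width = (right - left) / 2.0          # half-distance to each neighbour
    discharge += speeds[i] * depths[i] * width   # m^3/s contribution
```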

  13. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection … is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating …

  14. The numerical multiconfiguration self-consistent field approach for atoms

    International Nuclear Information System (INIS)

    Stiehler, Johannes

    1995-12-01

    The dissertation uses the multiconfiguration self-consistent field approach to specify the electronic wave function of N-electron atoms in a static electric field. It presents numerical approaches to describe the wave functions and introduces new methods to compute the numerical Fock equations. Based on results computed with an implemented computer program, the universal applicability, flexibility, and high numerical precision of the presented approach are shown. RHF results and, for the first time, MCSCF results for polarizabilities and hyperpolarizabilities of various states of the atoms He to Kr are discussed. In addition, an application to the interpretation of a plasma spectrum of gallium is presented. (orig.)

  15. Self-consistent potential variations in magnetic wells

    International Nuclear Information System (INIS)

    Kesner, J.; Knorr, G.; Nicholson, D.R.

    1981-01-01

    Self-consistent electrostatic potential variations are considered in a spatial region of weak magnetic field, as in the proposed tandem mirror thermal barriers (with no trapped ions). For some conditions, equivalent to ion distributions with a sufficiently high net drift speed along the magnetic field, the desired potential depressions are found. When the net drift speed is not high enough, potential depressions are found only in combination with strong electric fields on the boundaries of the system. These potential depressions are not directly related to the magnetic field depression. (author)

  16. Applicability of self-consistent mean-field theory

    International Nuclear Information System (INIS)

    Guo Lu; Sakata, Fumihiko; Zhao Enguang

    2005-01-01

    Within the constrained Hartree-Fock (CHF) theory, an analytic condition is derived to estimate whether a concept of the self-consistent mean field is realized in the level repulsive region. The derived condition states that an iterative calculation of the CHF equation does not converge when the quantum fluctuations coming from two-body residual interaction and quadrupole deformation become larger than a single-particle energy difference between two avoided crossing orbits. By means of numerical calculation, it is shown that the analytic condition works well for a realistic case

  17. Island of stability for consistent deformations of Einstein's gravity.

    Science.gov (United States)

    Berkhahn, Felix; Dietrich, Dennis D; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin

    2012-03-30

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incorporate a background curvature induced self-stabilizing mechanism. Self-stabilization is essential in order to guarantee hyperbolic evolution in and unitarity of the covariantized theory, as well as the deformation's uniqueness. We show that the deformation's parameter space contains islands of absolute stability that are persistent through the entire cosmic evolution.

  18. The self-consistent dynamic pole tide in global oceans

    Science.gov (United States)

    Dickman, S. R.

    1985-01-01

    The dynamic pole tide is characterized in a self-consistent manner by means of introducing a single nondifferential matrix equation compatible with the Liouville equation, modelling the ocean as global and of uniform depth. The deviations of the theory from the realistic ocean, associated with the nonglobality of the latter, are also given consideration, with an inference that in realistic oceans long-period modes of resonances would be increasingly likely to exist. The analysis of the nature of the pole tide and its effects on the Chandler wobble indicate that departures of the pole tide from the equilibrium may indeed be minimal.

  19. Simplified models for dark matter face their consistent completions

    Energy Technology Data Exchange (ETDEWEB)

    Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel

    2017-03-01

    Simplified dark matter models have been recently advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent $SU(2)_{\mathrm{L}} \times U(1)_{\mathrm{Y}}$ gauge invariant completions. We discuss the key physics simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at the $13$ TeV LHC.

  20. Two-particle self-consistent approach to unconventional superconductivity

    Energy Technology Data Exchange (ETDEWEB)

    Otsuki, Junya [Department of Physics, Tohoku University, Sendai (Japan); Theoretische Physik III, Zentrum fuer Elektronische Korrelationen und Magnetismus, Universitaet Augsburg (Germany)

    2013-07-01

    A non-perturbative approach to unconventional superconductivity is developed based on the idea of the two-particle self-consistent (TPSC) theory. An exact sum rule which the momentum-dependent pairing susceptibility satisfies is derived. Effective pairing interactions between quasiparticles are determined so that an approximate susceptibility fulfills this sum rule, in which fluctuations belonging to different symmetries mix at finite momentum. The mixing leads to a suppression of the d_{x^2-y^2} pairing close to half-filling, resulting in a maximum of T_c away from half-filling.

  1. Correlations and self-consistency in pion scattering. II

    International Nuclear Information System (INIS)

    Johnson, M.B.; Keister, B.D.

    1978-01-01

    In an attempt to overcome certain difficulties of summing higher order processes in pion multiple scattering theories, a new, systematic expansion for the interaction of a pion in nuclear matter is derived within the context of the Foldy-Walecka theory, incorporating nucleon-nucleon correlations and an idea of self-consistency. The first two orders in the expansion are evaluated as a function of the nonlocality range; the expansion appears to be rapidly converging, in contrast to expansion schemes previously examined. (Auth.)

  2. Quark mean field theory and consistency with nuclear matter

    International Nuclear Information System (INIS)

    Dey, J.; Dey, M.; Frederico, T.; Tomio, L.

    1990-09-01

    The 1/N_c expansion in QCD (with N_c the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, M_N, m_σ, m_ω, are found to scale with density. The equations are solved self-consistently. (author). 29 refs, 2 tabs

  3. A self-consistent model of an isothermal tokamak

    Science.gov (United States)

    McNamara, Steven; Lilley, Matthew

    2014-10-01

    Continued progress in liquid lithium coating technologies has made the development of a beam-driven tokamak with minimal edge recycling a feasible possibility. Such devices are characterised by improved confinement due to their inherent stability and the suppression of thermal conduction. Particle and energy confinement become intrinsically linked and the plasma thermal energy content is governed by the injected beam. A self-consistent model of a purely beam-fuelled isothermal tokamak is presented, including calculations of the density profile, bulk species temperature ratios and the fusion output. Stability considerations constrain the operating parameters; regions of stable operation are identified and their suitability for potential reactor applications discussed.

  4. Self-consistent calculation of 208Pb spectrum

    International Nuclear Information System (INIS)

    Pal'chik, V.V.; Pyatov, N.I.; Fayans, S.A.

    1981-01-01

    The self-consistent model with exact accounting for the one-particle continuum is applied to calculate all discrete particle-hole natural-parity states of the 208Pb nucleus (up to the neutron emission threshold, 7.4 MeV). Contributions to the energy-weighted sum rules S(EL) of the first collective levels and total contributions of all discrete levels are evaluated. The collectivization is manifested most strongly for octupole states. With growing multipolarity L, the contributions of discrete levels are sharply reduced. The results are compared with other models and with the experimental data obtained in (e, e'), (p, p') and other reactions.

  5. Poisson solvers for self-consistent multi-particle simulations

    International Nuclear Information System (INIS)

    Qiang, J; Paret, S

    2014-01-01

    Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space charge effects in high-intensity beams. The Poisson equation has to be solved at each time step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of these numerical methods will be O(N log(N)) or O(N) instead of O(N^2), where N is the total number of grid points used to solve the Poisson equation.
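    The O(N log N) scaling mentioned above typically comes from FFT-based solvers, where each Fourier mode of the Poisson equation decouples. A one-dimensional periodic sketch (a generic illustration, not a method specific to this paper):

```python
import numpy as np

# FFT solve of the periodic 1D Poisson equation u'' = f on [0, 2*pi).
# In Fourier space each mode decouples: u_hat(k) = -f_hat(k) / k**2 for
# k != 0; the k = 0 mode is set to zero, fixing the mean of u to zero.
N = 64
x = 2.0 * np.pi * np.arange(N) / N
f = -np.sin(x)                       # exact solution is u(x) = sin(x)

k = np.fft.fftfreq(N, d=1.0 / N)     # integer wavenumbers 0, 1, ..., -1
f_hat = np.fft.fft(f)
u_hat = np.zeros_like(f_hat)
nonzero = k != 0
u_hat[nonzero] = -f_hat[nonzero] / k[nonzero] ** 2
u = np.fft.ifft(u_hat).real

error = np.max(np.abs(u - np.sin(x)))   # spectral accuracy: roundoff level
```

    The two FFTs dominate the cost, giving the O(N log N) complexity; the same idea extends to 2D and 3D grids used in beam dynamics codes.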

  6. Consistency of differential and integral thermonuclear neutronics data

    International Nuclear Information System (INIS)

    Reupke, W.A.

    1978-01-01

    To increase the accuracy of the neutronics analysis of nuclear reactors, physicists and engineers have employed a variety of techniques, including the adjustment of multigroup differential data to improve consistency with integral data. Of the various adjustment strategies, a generalized least-squares procedure which adjusts the combined differential and integral data can significantly improve the accuracy of neutronics calculations compared to calculations employing only differential data. This investigation analyzes 14 MeV neutron-driven integral experiments, using a more extensively developed methodology and a newly developed computer code, to extend the domain of adjustment from the energy range of fission reactors to the energy range of fusion reactors
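The generalized least-squares adjustment of combined differential and integral data can be sketched in a few lines. The update below is the standard GLS form, and every number is hypothetical, chosen only to illustrate the mechanics:

```python
import numpy as np

# Prior multigroup (differential) data and its covariance; all numbers
# here are hypothetical and purely illustrative.
x0 = np.array([1.0, 2.0, 3.0])      # prior group cross sections
P = np.diag([0.04, 0.09, 0.25])     # prior variances

# One integral experiment: sensitivity to each group, measured value,
# and experimental variance.
S = np.array([[0.5, 0.3, 0.2]])
y = np.array([1.8])
R = np.array([[0.01]])

# Generalized least-squares update: the adjusted data stay close to
# the prior while reproducing the integral measurement within errors.
K = P @ S.T @ np.linalg.inv(S @ P @ S.T + R)
x_adj = x0 + (K @ (y - S @ x0)).ravel()
P_adj = P - K @ S @ P
```

The adjusted prediction S @ x_adj moves toward the measured integral value, and the posterior variances shrink relative to the prior, which is the sense in which the adjustment "improves consistency" between the two data types.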

  7. Consistent treatment of one-body dynamics and collective fluctuations

    International Nuclear Information System (INIS)

    Pfitzner, A.

    1986-09-01

    We show how the residual coupling δV between collective and intrinsic motion induces correlations, which lead to fluctuations of the collective variables and to a redistribution of single-particle occupation numbers ρ_α. The evolution of ρ_α and of the collective fluctuations is consistently described by a coupled system of equations, which accounts for the dependence of the transport coefficients on ρ_α, and for the dependence of the transition rates in the master equation on the collective variances. (author)

  8. Mean-field theory and self-consistent dynamo modeling

    International Nuclear Information System (INIS)

    Yoshizawa, Akira; Yokoi, Nobumitsu

    2001-12-01

    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)

  9. The Consistency Of High Attorney Of Papua In Corruption Investigation

    Directory of Open Access Journals (Sweden)

    Samsul Tamher

    2015-08-01

    Full Text Available This study aimed to determine the consistency of the High Attorney of Papua in corruption investigations and its efforts to recover state financial losses. The types of study used in this paper are normative-juridical and empirical-juridical. The results showed that the corruption investigations of the High Attorney of Papua are not optimal, due to political interference in cases involving local officials, so that the High Attorney's decisions in such cases are not in accordance with the rule of law. The High Attorney of Papua seeks to recover state financial losses through the State Auction Body under civil and criminal law.

  10. Wavelets in self-consistent electronic structure calculations

    International Nuclear Information System (INIS)

    Wei, S.; Chou, M.Y.

    1996-01-01

    We report the first implementation of orthonormal wavelet bases in self-consistent electronic structure calculations within the local-density approximation. These local bases of different scales efficiently describe localized orbitals of interest. As an example, we studied two molecules, H₂ and O₂, using pseudopotentials and supercells. Considerably fewer bases are needed compared with conventional plane-wave approaches, yet calculated binding properties are similar. Our implementation employs fast wavelet and Fourier transforms, avoiding the numerical evaluation of any three-dimensional integral. copyright 1996 The American Physical Society

  11. Self-consistent electronic-structure calculations for interface geometries

    International Nuclear Information System (INIS)

    Sowa, E.C.; Gonis, A.; MacLaren, J.M.; Zhang, X.G.

    1992-01-01

    This paper describes a technique for computing self-consistent electronic structures and total energies of planar defects, such as interfaces, which are embedded in an otherwise perfect crystal. As in the Layer Korringa-Kohn-Rostoker approach, the solid is treated as a set of coupled layers of atoms, using Bloch's theorem to take advantage of the two-dimensional periodicity of the individual layers. The layers are coupled using the techniques of the Real-Space Multiple-Scattering Theory, avoiding artificial slab or supercell boundary conditions. A total-energy calculation on a Cu crystal, which has been split apart at a (111) plane, is used to illustrate the method

  12. Sensor and control for consistent seed drill coulter depth

    DEFF Research Database (Denmark)

    Kirkegaard Nielsen, Søren; Nørremark, Michael; Green, Ole

    2016-01-01

    The consistent depth placement of seeds is vital for achieving the optimum yield of agricultural crops. In state-of-the-art seeding machines, the depth of drill coulters will vary with changes in soil resistance. This paper presents the retrofitting of an angle sensor to the pivoting point...... by a sub-millimetre accurate positioning system (iGPS, Nikon Metrology NV, Belgium) mounted on the drill coulter. At a drill coulter depth of 55 mm and controlled by an ordinary fixed spring loaded down force only, the change in soil resistance decreased the mean depth by 23 mm. By dynamically controlling...

  13. SIMPLE ESTIMATOR AND CONSISTENT STRONGLY OF STABLE DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Cira E. Guevara Otiniano

    2016-06-01

    Full Text Available Stable distributions are extensively used to analyze the earnings of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational time is minimal, so it can be used to initialize computationally intensive procedures such as maximum likelihood. With random samples of size n, we tested the efficacy of these estimators by the Monte Carlo method. We also include applications to three data sets.
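The kind of Monte Carlo check described above is easy to reproduce with a simple quantile-based scale estimator. The estimator below is an illustrative stand-in for the Cauchy case (alpha = 1), where the quartiles of the distribution sit at plus/minus the scale; it is not the estimator proposed in the paper:

```python
import numpy as np

def scale_iqr_cauchy(sample):
    """Scale estimate for a symmetric stable sample in the Cauchy case
    (alpha = 1), where the quartiles sit at +/- gamma, so
    gamma = IQR / 2.  An illustrative quantile-based stand-in, not the
    estimator proposed in the paper."""
    q25, q75 = np.percentile(sample, [25, 75])
    return (q75 - q25) / 2.0

# Monte Carlo check that the estimates concentrate on the true scale.
rng = np.random.default_rng(0)
gamma_true = 2.0
estimates = [scale_iqr_cauchy(gamma_true * rng.standard_cauchy(5000))
             for _ in range(200)]
```

Quantile-based estimators like this are nearly free to compute, which is exactly the property the abstract exploits when using a cheap estimator to initialize maximum likelihood.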

  14. Tunneling in a self-consistent dynamic image potential

    International Nuclear Information System (INIS)

    Rudberg, B.G.R.; Jonson, M.

    1991-01-01

    We have calculated the self-consistent effective potential for an electron tunneling through a square barrier while interacting with surface plasmons. This potential reduces to the classical image potential in the static limit. In the opposite limit, when the ''velocity'' of the tunneling electron is large, it reduces to the unperturbed square-barrier potential. For a wide variety of parameters the dynamic effects on the transmission coefficient T = |t|² can, for instance, be related to the Buettiker-Landauer traversal time for tunneling, given by τ_BL = ℏ|d ln t/dV|

  15. On the hydrodynamic limit of self-consistent field equations

    International Nuclear Information System (INIS)

    Pauli, H.C.

    1980-01-01

    As an approximation to the nuclear many-body problem, the hydrodynamical limit of self-consistent field equations is worked out and applied to the treatment of vibrational and rotational motion. Its validity is governed by the value of a smallness parameter, behaving as 20A^(-2/3) with the number of nucleons A. For finite nuclei, this number is not small enough compared to 1, and indeed one observes a discrepancy of roughly a factor of 5 between the hydrodynamic frequencies and the relevant experimental numbers. (orig.)
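The quoted smallness parameter can be evaluated directly; the A values below are chosen only for illustration, and the result shows why the parameter is never small compared to 1 for finite nuclei:

```python
# The validity parameter behaves as 20 * A**(-2/3); evaluating it for a
# light and a heavy nucleus shows it never falls far below 1.
for A in (16, 208):
    print(A, round(20 * A ** (-2 / 3), 2))
# prints: 16 3.15  then  208 0.57
```

Even for A = 208 the parameter is about 0.57, consistent with the abstract's observation of a sizable discrepancy between hydrodynamic and experimental frequencies.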

  16. Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation

    DEFF Research Database (Denmark)

    Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans

    1995-01-01

    electronic structure whereas the inertial polarization vector is not necessarily in equilibrium with the actual electronic structure. The electronic structure of the compound is described by a correlated electronic wave function - a multiconfigurational self-consistent field (MCSCF) wave function. This wave......, open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromatic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity...

  17. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...... on the underlying asset. The model has the convenient feature of decoupling the vanilla skews from spot/volatility correlations and allowing for different conditional correlations in large and small spot/volatility moves. We show that our model can simultaneously fit prices of European options on S&P 500 across...

  18. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Cont, Rama; Kokholm, Thomas

    2013-01-01

    to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...... on the underlying asset. The model has the convenient feature of decoupling the vanilla skews from spot/volatility correlations and allowing for different conditional correlations in large and small spot/volatility moves. We show that our model can simultaneously fit prices of European options on S&P 500 across...

  19. Cetuximab in combination with irinotecan/5-fluorouracil/folinic acid (FOLFIRI) in the initial treatment of metastatic colorectal cancer: a multicentre two-part phase I/II study

    International Nuclear Information System (INIS)

    Raoul, Jean-Luc; Van Laethem, Jean-Luc; Peeters, Marc; Brezault, Catherine; Husseini, Fares; Cals, Laurent; Nippgen, Johannes; Loos, Anja-Helena; Rougier, Philippe

    2009-01-01

    This study was designed to investigate the efficacy and safety of the epidermal growth factor receptor (EGFR) inhibitor cetuximab combined with irinotecan, folinic acid (FA) and two different doses of infusional 5-fluorouracil (5-FU) in the first-line treatment of EGFR-detectable metastatic colorectal cancer. The 5-FU dose was selected on the basis of dose-limiting toxicities (DLTs) during part I of the study. Patients received cetuximab (400 mg/m² initial dose and 250 mg/m²/week thereafter) and, every 2 weeks, irinotecan (180 mg/m²), FA (400 mg/m²) and 5-FU (either low dose [LD], 300 mg/m² bolus plus 2,000 mg/m² 46-hour infusion, n = 7; or high dose [HD], 400 mg/m² bolus plus 2,400 mg/m²; n = 45). Only two DLTs occurred in the HD group, and HD 5-FU was selected for use in part II. Apart from rash, commonly observed grade 3/4 adverse events such as leucopenia, diarrhoea, vomiting and asthenia occurred within the expected range for FOLFIRI. Among 52 patients, the overall response rate was 48%. Median progression-free survival (PFS) was 8.6 months (counting all reported progressions) and the median overall survival was 22.4 months. Treatment facilitated the resection of initially unresectable metastases in fourteen patients (27%): of these, 10 patients (71%) had no residual tumour after surgery, and these resections hindered the estimation of PFS. The combination of cetuximab and FOLFIRI was active and well tolerated in this setting. Initially unresectable metastases became resectable in one-quarter of patients, with a high number of complete resections, and these promising results formed the basis for the investigation of FOLFIRI with and without cetuximab in the phase III CRYSTAL trial

  20. Cetuximab in combination with irinotecan/5-fluorouracil/folinic acid (FOLFIRI) in the initial treatment of metastatic colorectal cancer: a multicentre two-part phase I/II study

    Directory of Open Access Journals (Sweden)

    Cals Laurent

    2009-04-01

    Full Text Available Abstract Background This study was designed to investigate the efficacy and safety of the epidermal growth factor receptor (EGFR) inhibitor cetuximab combined with irinotecan, folinic acid (FA) and two different doses of infusional 5-fluorouracil (5-FU) in the first-line treatment of EGFR-detectable metastatic colorectal cancer. Methods The 5-FU dose was selected on the basis of dose-limiting toxicities (DLTs) during part I of the study. Patients received cetuximab (400 mg/m2 initial dose and 250 mg/m2/week thereafter) and, every 2 weeks, irinotecan (180 mg/m2), FA (400 mg/m2) and 5-FU (either low dose [LD], 300 mg/m2 bolus plus 2,000 mg/m2 46-hour infusion, n = 7; or high dose [HD], 400 mg/m2 bolus plus 2,400 mg/m2; n = 45). Results Only two DLTs occurred in the HD group, and HD 5-FU was selected for use in part II. Apart from rash, commonly observed grade 3/4 adverse events such as leucopenia, diarrhoea, vomiting and asthenia occurred within the expected range for FOLFIRI. Among 52 patients, the overall response rate was 48%. Median progression-free survival (PFS) was 8.6 months (counting all reported progressions) and the median overall survival was 22.4 months. Treatment facilitated the resection of initially unresectable metastases in fourteen patients (27%): of these, 10 patients (71%) had no residual tumour after surgery, and these resections hindered the estimation of PFS. Conclusion The combination of cetuximab and FOLFIRI was active and well tolerated in this setting. Initially unresectable metastases became resectable in one-quarter of patients, with a high number of complete resections, and these promising results formed the basis for the investigation of FOLFIRI with and without cetuximab in the phase III CRYSTAL trial.

  1. Self-consistent viscous heating of rapidly compressed turbulence

    Science.gov (United States)

    Campos, Alejandro; Morgan, Brandon

    2017-11-01

    Given turbulence subjected to infinitely rapid deformations, linear terms representing interactions between the mean flow and the turbulence dictate the evolution of the flow, whereas non-linear terms corresponding to turbulence-turbulence interactions are safely ignored. For rapidly deformed flows where the turbulence Reynolds number is not sufficiently large, viscous effects cannot be neglected and tend to play a prominent role, as shown in the study of Davidovits & Fisch (2016). For such a case, the rapid increase of viscosity in a plasma (as compared to the weaker scaling of viscosity in a fluid) leads to the sudden viscous dissipation of turbulent kinetic energy. As shown in Davidovits & Fisch, increases in temperature caused by the direct compression of the plasma drive sufficiently large values of viscosity. We report on numerical simulations of turbulence where the increase in temperature is the result of both the direct compression (an inviscid mechanism) and the self-consistent viscous transfer of energy from the turbulent scales towards the thermal energy. A comparison of implicit large-eddy simulations against well-resolved direct numerical simulations is included to assess the effect of the numerical and subgrid-scale dissipation on the self-consistent viscous dissipation. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  2. Parton Distributions based on a Maximally Consistent Dataset

    Science.gov (United States)

    Rojo, Juan

    2016-04-01

    The choice of data that enters a global QCD analysis can have a substantial impact on the resulting parton distributions and their predictions for collider observables. One of the main reasons for this has to do with the possible presence of inconsistencies, either internal within an experiment or external between different experiments. In order to assess the robustness of the global fit, different definitions of a conservative PDF set, that is, a PDF set based on a maximally consistent dataset, have been introduced. However, these approaches are typically affected by theory biases in the selection of the dataset. In this contribution, after a brief overview of recent NNPDF developments, we propose a new, fully objective, definition of a conservative PDF set, based on the Bayesian reweighting approach. Using the new NNPDF3.0 framework, we produce various conservative sets, which turn out to be mutually in agreement within the respective PDF uncertainties, as well as with the global fit. We explore some of their implications for LHC phenomenology, finding also good consistency with the global fit result. These results provide a non-trivial validation test of the new NNPDF3.0 fitting methodology, and indicate that possible inconsistencies in the fitted dataset do not affect substantially the global fit PDFs.
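The Bayesian reweighting idea can be illustrated with a toy replica ensemble. The weights below use a plain Gaussian likelihood rather than the full NNPDF reweighting formula, and every number is invented for the sketch:

```python
import numpy as np

# Toy ensemble of "replica" predictions for one observable, and a new
# measurement to reweight against (all numbers are illustrative only).
rng = np.random.default_rng(1)
replicas = rng.normal(1.0, 0.2, size=1000)   # prior replica predictions
y_meas, sigma = 1.1, 0.05                    # new datum and uncertainty

# Gaussian-likelihood weights (a simplification of the published
# reweighting formula), normalized to sum to the replica count.
chi2 = ((replicas - y_meas) / sigma) ** 2
w = np.exp(-0.5 * chi2)
w *= len(w) / w.sum()

# Reweighted mean moves toward the measurement; the effective number
# of replicas quantifies how much information the datum carried.
mean_rw = np.average(replicas, weights=w)
n_eff = np.exp(np.sum(w[w > 0] * np.log(len(w) / w[w > 0])) / len(w))
```

A small effective replica number signals that the new datum is in tension with (or far more precise than) the prior ensemble, which is the diagnostic used when assembling a maximally consistent dataset.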

  3. Self-consistent modeling of electron cyclotron resonance ion sources

    International Nuclear Information System (INIS)

    Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lecot, C.

    2004-01-01

    In order to predict the performance of an electron cyclotron resonance ion source (ECRIS), it is necessary to accurately model the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at extraction, influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally

  4. Self-consistent modeling of electron cyclotron resonance ion sources

    Science.gov (United States)

    Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lécot, C.

    2004-05-01

    In order to predict the performance of an electron cyclotron resonance ion source (ECRIS), it is necessary to accurately model the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at extraction, influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.

  5. Consistent Quantum Histories: Towards a Universal Language of Physics

    International Nuclear Information System (INIS)

    Grygiel, W.P.

    2007-01-01

    The consistent histories interpretation of quantum mechanics is a reformulation of the standard Copenhagen interpretation that aims at incorporating quantum probabilities as part of the axiomatic foundations of the theory. It is not only supposed to equip quantum mechanics with clear criteria of its own experimental verification but, first and foremost, to alleviate one of the stumbling blocks of the theory - the measurement problem. Since the consistent histories interpretation operates with a series of quantum events integrated into one quantum history, the measurement problem is naturally absorbed as one of the events that build up a history. The interpretation rests upon the two following assumptions, proposed already by J. von Neumann: (1) both the microscopic and macroscopic regimes are subject to the same set of quantum laws, and (2) a projector operator assigned to each event within a history makes it possible to transcribe the history into a set of propositions that relate the entire course of quantum events. Based on this, a universal language of physics is expected to emerge that will bring the quantum apparatus back to common-sense propositional logic. The basic philosophical issue raised by this study is whether one should justify quantum mechanics by means of what emerges from it, that is, the properties of the macroscopic world, or use the axioms of quantum mechanics to demonstrate the mechanisms by which the macroscopic world comes about from the quantum regime. (author)

  6. Feeling Expression Using Avatars and Its Consistency for Subjective Annotation

    Science.gov (United States)

    Ito, Fuyuko; Sasaki, Yasunari; Hiroyasu, Tomoyuki; Miki, Mitsunori

    Consumer Generated Media (CGM) is growing rapidly and the amount of content is increasing. However, it is often difficult for users to extract important contents, and the existence of contents recording their experiences can easily be forgotten. As there are no methods or systems to indicate the subjective value of contents or ways to reuse them, subjective annotation appending subjectivity, such as feelings and intentions, to contents is needed. Representation of subjectivity depends not only on verbal expression, but also on nonverbal expression. Linguistically expressed annotation, typified by collaborative tagging in social bookmarking systems, has come into widespread use, but there is no system of nonverbally expressed annotation on the web. We propose the use of controllable avatars as a means of nonverbal expression of subjectivity, and confirmed the consistency of feelings elicited by avatars over time for an individual and within a group. In addition, we compared the expressiveness and ease of subjective annotation between collaborative tagging and controllable avatars. The results indicate that the feelings evoked by avatars are consistent in both cases, and that using controllable avatars is easier than collaborative tagging for representing feelings elicited by contents that do not express meaning, such as photos.

  7. Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.

    Science.gov (United States)

    Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P

    2015-10-01

    The human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and on fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and from physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with a spatially distributed elastic property. Simulation is performed on a 3D lung geometry reconstructed from a four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data, which collectively provide a consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
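The Tikhonov-regularized fusion of two displacement estimates can be sketched in one dimension. This is a generic sketch of the idea (assumed cost weights lam and mu, a first-difference roughness penalty), not the paper's 3-D formulation:

```python
import numpy as np

def tikhonov_fuse(u_dir, u_cfd, lam=1.0, mu=0.1):
    """Fuse a registration-derived displacement (u_dir) with a
    physics-based one (u_cfd) by minimizing
        ||u - u_dir||^2 + lam*||u - u_cfd||^2 + mu*||D u||^2,
    where D is a first-difference operator penalizing roughness.
    A 1-D sketch of the Tikhonov-regularization idea only."""
    n = len(u_dir)
    D = np.diff(np.eye(n), axis=0)              # (n-1) x n differences
    A = (1 + lam) * np.eye(n) + mu * D.T @ D    # normal equations
    b = u_dir + lam * u_cfd
    return np.linalg.solve(A, b)

# Noisy "registration" field vs. smooth "model" field on a 1-D line.
x = np.linspace(0.0, 1.0, 50)                   # ground-truth motion
u_dir = x + np.random.default_rng(2).normal(0.0, 0.05, 50)
u_cfd = x.copy()
u = tikhonov_fuse(u_dir, u_cfd)
```

The fused field lies between the two inputs and suppresses the registration noise, which is the role the TR step plays in coupling the CFD and DIR results.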

  8. Detection and quantification of flow consistency in business process models.

    Science.gov (United States)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
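One way to make "consistency of flow direction" computable is to take the fraction of edges that point in the dominant compass direction of the layout. This is a generic sketch of such a metric, not one of the paper's three proposals:

```python
from collections import Counter

def flow_consistency(edges, pos):
    """Fraction of edges pointing in the dominant compass direction.

    edges: iterable of (source, target) node ids
    pos:   dict mapping node id -> (x, y) layout coordinates
    A simple formalization of 'consistency of flow direction'.
    """
    def direction(a, b):
        (x1, y1), (x2, y2) = pos[a], pos[b]
        dx, dy = x2 - x1, y2 - y1
        if abs(dx) >= abs(dy):
            return "right" if dx >= 0 else "left"
        return "down" if dy > 0 else "up"

    dirs = Counter(direction(a, b) for a, b in edges)
    return max(dirs.values()) / sum(dirs.values())

# A mostly left-to-right model with one back edge:
pos = {"a": (0, 0), "b": (1, 0), "c": (2, 0), "d": (3, 0)}
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
print(flow_consistency(edges, pos))  # 0.75
```

Even this naive metric already exposes the design questions the paper discusses, such as how to bin edge angles and whether back edges should count against consistency.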

  9. Effect of irradiation on Brazilian honeys' consistency and their acceptability

    International Nuclear Information System (INIS)

    Matsuda, A.H.; Sabato, S.F.

    2004-01-01

    Contamination of bee products may occur during packing or even during the process of collection. Gamma irradiation was found to decrease the number of bacteria and fungi. However, little information is available on the effects of gamma irradiation on viscosity, which is an important property of honey. In this work, the viscosity of two varieties of Brazilian honey was measured after irradiation at 5 and 10 kGy. The viscosity was measured at four temperatures (25 deg. C, 30 deg. C, 35 deg. C and 40 deg. C) for both samples and compared with the control and between the doses. The sensory evaluation was carried out for the parameters color, odor, taste and consistency, using a 9-point hedonic scale. All the data were treated with a statistical tool (Statistica 5.1, StatSoft, 1998). The viscosity was not significantly affected by gamma irradiation at doses of 5 and 10 kGy (p<0.05). The effect of gamma irradiation on the sensorial characteristics (odor, color, taste and consistency) is presented. The taste for the Parana type differed significantly among irradiation doses (p<0.05), but the highest value was for the 5 kGy dose, demonstrating acceptability in this case. For the Organic honey, the taste parameter at 10 kGy was significantly lower than the control mean, but it did not differ significantly from the 5 kGy value

  10. ACHIEVING CONSISTENT DOPPLER MEASUREMENTS FROM SDO /HMI VECTOR FIELD INVERSIONS

    International Nuclear Information System (INIS)

    Schuck, Peter W.; Antiochos, S. K.; Leka, K. D.; Barnes, Graham

    2016-01-01

    NASA's Solar Dynamics Observatory is delivering vector magnetic field observations of the full solar disk with unprecedented temporal and spatial resolution; however, the satellite is in a highly inclined geosynchronous orbit. The relative spacecraft-Sun velocity varies by ±3 km s⁻¹ over a day, which introduces major orbital artifacts in the Helioseismic and Magnetic Imager (HMI) data. We demonstrate that the orbital artifacts contaminate all spatial and temporal scales in the data. We describe a newly developed three-stage procedure for mitigating these artifacts in the Doppler data obtained from the Milne-Eddington inversions in the HMI pipeline. The procedure ultimately uses 32 velocity-dependent coefficients to adjust 10 million pixels, a remarkably sparse correction model given the complexity of the orbital artifacts. This procedure was applied to full-disk images of AR 11084 to produce consistent Dopplergrams. The data adjustments reduce the power in the orbital artifacts by 31 dB. Furthermore, we analyze the corrected images in detail and show that our procedure greatly improves the temporal and spectral properties of the data without adding any new artifacts. We conclude that this new procedure makes a dramatic improvement in the consistency of the HMI data and in its usefulness for precision scientific studies.
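The core of a velocity-dependent correction of this kind can be illustrated on synthetic data: fit the measured signal as a low-order polynomial in the spacecraft radial velocity and subtract the fit. This is a hypothetical, minimal analogue of the 32-coefficient HMI model, with all numbers invented:

```python
import numpy as np

# Synthetic illustration of removing a velocity-dependent artifact.
rng = np.random.default_rng(3)
v_sc = 3.0 * np.sin(np.linspace(0, 2 * np.pi, 500))  # +/- 3 km/s orbit
true_signal = 0.1 * rng.normal(size=500)             # solar signal
bias = 0.05 * v_sc + 0.02 * v_sc ** 2                # orbital artifact
measured = true_signal + bias

# Fit the artifact as a polynomial in v_sc and subtract it.
coeffs = np.polyfit(v_sc, measured, deg=2)
corrected = measured - np.polyval(coeffs, v_sc)
```

Because the artifact is a smooth function of orbital velocity while the solar signal is not, a handful of coefficients suffices, which mirrors the sparsity the abstract highlights.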

  11. Consistent three-equation model for thin films

    Science.gov (United States)

    Richard, Gael; Gisclon, Marguerite; Ruyer-Quil, Christian; Vila, Jean-Paul

    2017-11-01

    Numerical simulations of thin films of Newtonian fluids down an inclined plane use reduced models for reasons of computational cost. These models are usually derived by averaging the physical equations of fluid mechanics over the fluid depth with an asymptotic method in the long-wave limit. Two-equation models are based on the mass conservation equation and either the momentum balance equation or the work-energy theorem. We show that there is no two-equation model that is both consistent and theoretically coherent, and that a third variable and a three-equation model are required to resolve all theoretical contradictions. The linear and nonlinear properties of two- and three-equation models are tested on various practical problems. We present a new consistent three-equation model with a simple mathematical structure which allows an easy and reliable numerical resolution. The numerical calculations agree fairly well with experimental measurements or with direct numerical resolutions for neutral stability curves, speed of kinematic waves and of solitary waves, and depth profiles of wavy films. The model can also predict the flow reversal at the first capillary trough ahead of the main wave hump.

  12. A model for cytoplasmic rheology consistent with magnetic twisting cytometry.

    Science.gov (United States)

    Butler, J P; Kelly, S M

    1998-01-01

    Magnetic twisting cytometry is gaining wide applicability as a tool for the investigation of the rheological properties of cells and the mechanical properties of receptor-cytoskeletal interactions. Current technology involves the application and release of magnetically induced torques on small magnetic particles bound to or inside cells, with measurements of the resulting angular rotation of the particles. The properties of purely elastic or purely viscous materials can be determined by the angular strain and strain rate, respectively. However, the cytoskeleton and its linkage to cell surface receptors display elastic, viscous, and even plastic deformation, and the simultaneous characterization of these properties using only elastic or viscous models is internally inconsistent. Data interpretation is complicated by the fact that in current technology, the applied torques are not constant in time, but decrease as the particles rotate. This paper describes an internally consistent model consisting of a parallel viscoelastic element in series with a parallel viscoelastic element, and one approach to quantitative parameter evaluation. The unified model reproduces all essential features seen in data obtained from a wide variety of cell populations, and contains the pure elastic, viscoelastic, and viscous cases as subsets.
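A series arrangement of two Kelvin (spring-dashpot in parallel) bodies, as the abstract describes, has a closed-form creep response under constant torque. The sketch below uses illustrative parameter values, not values fitted to cell data:

```python
import numpy as np

# Creep response of two Kelvin bodies in series under a constant
# torque T0: each body i obeys  k_i*x_i + c_i*dx_i/dt = T0,
# and the measured rotation is x = x1 + x2 (parameters illustrative).
k1, c1, k2, c2, T0 = 1.0, 0.5, 2.0, 4.0, 1.0
t = np.linspace(0.0, 10.0, 200)
x = (T0 / k1) * (1 - np.exp(-k1 * t / c1)) \
    + (T0 / k2) * (1 - np.exp(-k2 * t / c2))
# x rises from 0 toward the elastic limit T0/k1 + T0/k2 = 1.5
```

The two time constants c1/k1 and c2/k2 produce the fast initial rotation followed by slow creep that such twisting experiments report, while a purely elastic or purely viscous model can reproduce only one of the two regimes.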

  13. Temporal consistent depth map upscaling for 3DTV

    Science.gov (United States)

    Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger

    2014-03-01

    The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibility of a convincing 3D experience at home, such as three-dimensional television (3DTV), has generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight camera (ToF), can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.
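    As an illustration only (the article's actual method is an edge-weighted error-energy minimization, not reproduced here), the effect of a temporal consistency constraint can be mimicked by exponentially blending each upscaled depth map with the previous output, which suppresses frame-to-frame depth flicker:

```python
import numpy as np

def upscale_nearest(depth_lr, factor):
    """Nearest-neighbour upscaling stand-in for the edge-weighted
    upscaler described in the text."""
    return np.kron(depth_lr, np.ones((factor, factor)))

def temporally_consistent(frames_lr, factor, alpha=0.3):
    """Blend each upscaled depth map with the previous output
    (exponential smoothing) to reduce depth jumps over time."""
    out, prev = [], None
    for f in frames_lr:
        hr = upscale_nearest(f, factor)
        prev = hr if prev is None else alpha * hr + (1 - alpha) * prev
        out.append(prev)
    return out

frames = [np.ones((2, 2)) for _ in range(4)]
maps = temporally_consistent(frames, factor=2)
# a static scene yields identical, flicker-free high-resolution maps
```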

  14. Self-consistent chaos in the beam-plasma instability

    International Nuclear Information System (INIS)

    Tennyson, J.L.; Meiss, J.D.

    1993-01-01

    The effect of self-consistency on Hamiltonian systems with a large number of degrees-of-freedom is investigated for the beam-plasma instability using the single-wave model of O'Neil, Winfrey, and Malmberg. The single-wave model is reviewed and then rederived within the Hamiltonian context, which leads naturally to canonical action-angle variables. Simulations are performed with a large (10⁴) number of beam particles interacting with the single wave. It is observed that the system relaxes into a time-asymptotic periodic state where only a few collective degrees are active; namely, a clump of trapped particles oscillating in a modulated wave, within a uniform chaotic sea with oscillating phase space boundaries. Thus self-consistency is seen to effectively reduce the number of degrees-of-freedom. A simple low degree-of-freedom model is derived that treats the clump as a single macroparticle, interacting with the wave and chaotic sea. The uniform chaotic sea is modeled by a fluid waterbag, where the waterbag boundaries correspond approximately to invariant tori. This low degree-of-freedom model is seen to compare well with the simulation.
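    At lowest order, a particle trapped in the wave behaves like a pendulum in the wave frame. A minimal sketch of that trapped-particle oscillation (not the authors' self-consistent model, which also evolves the wave amplitude and the chaotic sea):

```python
import numpy as np

def trapped_particle(theta0, v0=0.0, dt=0.01, steps=5000):
    """Leapfrog integration of a particle trapped in a wave trough:
    theta'' = -sin(theta), the pendulum equation in the wave frame."""
    th, v = theta0, v0
    v += 0.5 * dt * (-np.sin(th))       # initial half kick
    traj = []
    for _ in range(steps):
        th += dt * v                     # drift
        v += dt * (-np.sin(th))          # kick
        traj.append(th)
    return np.array(traj)

orbit = trapped_particle(0.5)
# below the separatrix energy, the orbit stays bounded (|theta| < pi)
```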

  15. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera

    2014-12-31

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
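    The key operation, applying a transfer function to a pdf rather than to a single low-pass-filtered value, can be sketched as a 1D expectation against a Gaussian-mixture pdf (illustrative only; the paper's implementation applies transfer functions via fast convolutions on the GPU):

```python
import numpy as np

def apply_tf_to_pdf(weights, means, sigmas, tf, grid):
    """Per-voxel shading from an intensity pdf stored as a 1D
    Gaussian mixture: integrate the transfer function tf against
    the pdf numerically on a uniform grid."""
    pdf = np.zeros_like(grid)
    for w, m, s in zip(weights, means, sigmas):
        pdf += w * np.exp(-0.5 * ((grid - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    dx = grid[1] - grid[0]
    return float(np.sum(tf(grid) * pdf) * dx)   # Riemann sum

grid = np.linspace(-10.0, 10.0, 4001)
val = apply_tf_to_pdf([1.0], [2.0], [0.5], lambda x: x, grid)
# with tf(x) = x this is E[x] under N(2, 0.5^2), i.e. 2
```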

  16. Self-consistent electron transport in collisional plasmas

    International Nuclear Information System (INIS)

    Mason, R.J.

    1982-01-01

    A self-consistent scheme has been developed to model electron transport in evolving plasmas of arbitrary classical collisionality. The electrons and ions are treated as either multiple donor-cell fluids, or collisional particles-in-cell. Particle suprathermal electrons scatter off ions, and drag against fluid background thermal electrons. The background electrons undergo ion friction, thermal coupling, and bremsstrahlung. The components move in self-consistent advanced E-fields, obtained by the Implicit Moment Method, which permits Δt ≫ ω_p⁻¹ and Δx ≫ λ_D, offering a 10²-10³-fold speed-up over older explicit techniques. The fluid description for the background plasma components permits the modeling of transport in systems spanning more than a 10⁷-fold change in density, and encompassing contiguous collisional and collisionless regions. Results are presented from application of the scheme to the modeling of CO₂ laser-generated suprathermal electron transport in expanding thin foils, and in multi-foil target configurations.
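    Why implicit field solves permit Δt ≫ ω_p⁻¹ can be seen on a toy plasma oscillation: explicit Euler amplifies the oscillation at any step size, while implicit Euler remains stable (and numerically damped) even for steps far above the plasma period. This is a schematic sketch of that stability contrast, not the Implicit Moment Method itself:

```python
def step_explicit(x, v, w, dt):
    """Explicit Euler for x'' = -w^2 x; amplifies by sqrt(1+(dt*w)^2)."""
    return x + dt * v, v - dt * w**2 * x

def step_implicit(x, v, w, dt):
    """Implicit (backward) Euler for the same oscillator; the linear
    solve damps the mode, so any dt is stable."""
    v_new = (v - dt * w**2 * x) / (1 + (dt * w) ** 2)
    return x + dt * v_new, v_new

w, dt = 1.0, 10.0            # time step 10x the oscillation timescale
xe, ve = 1.0, 0.0            # explicit state
xi, vi = 1.0, 0.0            # implicit state
for _ in range(100):
    xe, ve = step_explicit(xe, ve, w, dt)
    xi, vi = step_implicit(xi, vi, w, dt)
# the explicit amplitude diverges; the implicit one stays bounded
```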

  17. Efficient self-consistency for magnetic tight binding

    Science.gov (United States)

    Soin, Preetma; Horsfield, A. P.; Nguyen-Manh, D.

    2011-06-01

    Tight binding can be extended to magnetic systems by including an exchange interaction on an atomic site that favours net spin polarisation. We have used a published model, extended to include long-ranged Coulomb interactions, to study defects in iron. We have found that achieving self-consistency using conventional techniques was either unstable or very slow. By formulating the problem of achieving charge and spin self-consistency as a search for stationary points of a Harris-Foulkes functional, extended to include spin, we have derived a much more efficient scheme based on a Newton-Raphson procedure. We demonstrate the capabilities of our method by looking at vacancies and self-interstitials in iron. Self-consistency can indeed be achieved in a more efficient and stable manner, but care needs to be taken to manage this. The algorithm is implemented in the code PLATO.
    Program summary
    Program title: PLATO
    Catalogue identifier: AEFC_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 228 747
    No. of bytes in distributed program, including test data, etc.: 1 880 369
    Distribution format: tar.gz
    Programming language: C and PERL
    Computer: Apple Macintosh, PC, Unix machines
    Operating system: Unix, Linux, Mac OS X, Windows XP
    Has the code been vectorised or parallelised?: Yes. Up to 256 processors tested
    RAM: Up to 2 Gbytes per processor
    Classification: 7.3
    External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW
    Catalogue identifier of previous version: AEFC_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2616
    Does the new version supersede the previous version?: Yes
    Nature of problem: Achieving charge and spin self-consistency in magnetic tight binding can be very
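    The core idea, treating self-consistency as root-finding on the residual of the fixed-point map rather than plain mixing, can be sketched in scalar form (illustrative only; PLATO's actual scheme works on the full charge and spin degrees of freedom of the Harris-Foulkes functional):

```python
import math

def self_consistent(F, q0, tol=1e-10, max_iter=50, h=1e-6):
    """Newton-Raphson on the residual R(q) = F(q) - q, the fixed-point
    condition of a self-consistency loop (scalar illustration)."""
    q = q0
    for i in range(max_iter):
        r = F(q) - q
        if abs(r) < tol:
            return q, i
        dr = (F(q + h) - F(q)) / h - 1.0   # finite-difference dR/dq
        q -= r / dr                        # Newton update
    return q, max_iter

# classic contraction map: the fixed point of q = cos(q)
q_star, iters = self_consistent(math.cos, 0.5)
```

    Newton steps converge quadratically near the solution, which is why they outperform simple linear mixing when the fixed-point map is slowly contracting or unstable.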

  18. Psychometrics and the neuroscience of individual differences: Internal consistency limits between-subjects effects.

    Science.gov (United States)

    Hajcak, Greg; Meyer, Alexandria; Kotov, Roman

    2017-08-01

    In the clinical neuroscience literature, between-subjects differences in neural activity are presumed to reflect reliable measures, even though the psychometric properties of neural measures are almost never reported. The current article focuses on the critical importance of assessing and reporting internal consistency reliability: the homogeneity of "items" that comprise a neural "score." We demonstrate how variability in the internal consistency of neural measures limits between-subjects (i.e., individual differences) effects. To this end, we utilize error-related brain activity (i.e., the error-related negativity or ERN) in both healthy and generalized anxiety disorder (GAD) participants to demonstrate options for psychometric analyses of neural measures; we examine between-groups differences in internal consistency, between-groups effect sizes, and between-groups discriminability (i.e., ROC analyses), all as a function of increasing items (i.e., number of trials). Overall, internal consistency should be used to inform experimental design and the choice of neural measures in individual differences research. The internal consistency of neural measures is necessary for interpreting results and guiding progress in clinical neuroscience, and should be routinely reported in all individual differences studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
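    The dependence of internal consistency on trial count can be reproduced with a small simulation: Cronbach's alpha computed over a subjects-by-trials score matrix rises as trials are added (simulated data only; the noise and sample-size values are arbitrary, not taken from the study):

```python
import numpy as np

def cronbach_alpha(trials):
    """Internal consistency of a (subjects x trials) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = trials.shape[1]
    item_var = trials.var(axis=0, ddof=1).sum()
    total_var = trials.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
true_ern = rng.normal(0, 1, 200)          # subject-level "true" scores

def simulate(n_trials, noise=2.0):
    """Each trial = true score + independent trial-level noise."""
    return true_ern[:, None] + rng.normal(0, noise, (200, n_trials))

alpha_8 = cronbach_alpha(simulate(8))
alpha_64 = cronbach_alpha(simulate(64))
# reliability grows with the number of trials averaged into the score
```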

  19. Consistency in performance evaluation reports and medical records.

    Science.gov (United States)

    Lu, Mingshan; Ma, Ching-to Albert

    2002-12-01

    In the health care market, managed care has become the latest innovation for the delivery of services. For efficient implementation, the managed care organization relies on accurate information, so clinicians are often asked to report on patients before referrals are approved, treatments authorized, or insurance claims processed. What are clinicians' responses to solicitations for information by managed care organizations? The existing health literature has already pointed out the importance of provider gaming, sincere reporting, nudging, and dodging the rules. We assess the consistency of clinicians' reports on clients across administrative data and clinical records. For about 1,000 alcohol abuse treatment episodes, we compare clinicians' reports across two data sets. The first one, the Maine Addiction Treatment System (MATS), was an administrative data set; the state government used it for program performance monitoring and evaluation. The second was a set of medical record abstracts, taken directly from the clinical records of treatment episodes. A clinician's reporting practice exhibits an inconsistency if the information reported in MATS differs from the information reported in the medical record in a statistically significant way. We look for evidence of inconsistencies in five categories: admission alcohol use frequency, discharge alcohol use frequency, termination status, admission employment status, and discharge employment status. Chi-square tests, Kappa statistics, and sensitivity and specificity tests are used for hypothesis testing. Multiple imputation methods are employed to address the problem of missing values in the record abstract data set. For admission and discharge alcohol use frequency measures, we find, respectively, strong and supporting evidence for inconsistencies. We find equally strong evidence for consistency in reports of admission and discharge employment status, and mixed evidence on report consistency on termination status. 
Patterns of
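    The Kappa statistic used in the study corrects raw agreement between two reports for agreement expected by chance. A minimal sketch with illustrative data (not taken from the study):

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two categorical ratings,
    e.g. an administrative record vs. a medical-record abstract."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                           # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)     # chance agreement
             for c in np.union1d(a, b))
    return (po - pe) / (1 - pe)

admin  = [1, 1, 0, 1, 0, 1, 1, 0]   # hypothetical administrative codes
record = [1, 1, 0, 1, 1, 1, 0, 0]   # hypothetical record-abstract codes
kappa = cohens_kappa(admin, record)
# observed agreement 0.75; chance agreement 0.53125; kappa = 7/15
```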

  20. Method used to test the imaging consistency of binocular camera's left-right optical system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing the overall imaging consistency. Conventional optical system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained based on the multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and the imaging consistency can be evaluated through the standard deviation σ of the grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras. When the 3σ spread of the imaging gray difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, the design requirements are considered to have been achieved. This method is effective and paves the way for the imaging consistency testing of binocular cameras.
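    The acceptance criterion, 3σ of the left-right grayscale difference staying below 5%, can be sketched as follows (illustrative only; this assumes gray levels are normalized by the full 8-bit scale of 255, and uses synthetic images rather than measured ones):

```python
import numpy as np

def imaging_consistency_ok(left, right, threshold=0.05):
    """Evaluate left/right imaging consistency: 3*sigma of the
    normalized grayscale difference D(x, y) must stay below 5%."""
    d = (left.astype(float) - right.astype(float)) / 255.0
    spread = 3.0 * d.std()
    return spread <= threshold, spread

rng = np.random.default_rng(1)
base = rng.uniform(80, 170, (64, 64))        # synthetic "left" image
right = base + rng.normal(0, 1.0, (64, 64))  # "right" image, small noise
ok, spread = imaging_consistency_ok(base, right)
# a ~1 gray-level mismatch gives 3*sigma well under the 5% threshold
```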