WorldWideScience

Sample records for two-part designation consisting

  1. Consistent Design of Dependable Control Systems

    DEFF Research Database (Denmark)

    Blanke, M.

    1996-01-01

    Design of fault handling in control systems is discussed, and a method for consistent design is presented. …

  2. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  3. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder's design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint. …

  4. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current velocity, and water level is presented. The stochastic model includes statistical uncertainty and dependency between the four stochastic variables. Further, a new stochastic model for annual maximum directional significant wave heights is presented. The model includes dependency between the maximum wave heights from neighboring directional sectors. Numerical examples are presented where the models are calibrated using the Maximum Likelihood method to data from the central part of the North Sea. The calibration of the directional distributions is made such that the stochastic model for the omnidirectional …

  5. Designing apps for success developing consistent app design practices

    CERN Document Server

    David, Matthew

    2014-01-01

    In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: Apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development, in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently …

  6. A Consistent Design Methodology for Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Sauzon G

    2005-01-01

    The complexity demand of modern communication systems, particularly in the wireless domain, grows at an astounding rate, a rate so high that the available complexity and, even worse, the design productivity required to convert algorithms into silicon are left far behind. This effect is commonly referred to as the design productivity crisis, or simply the design gap. Since the design gap is predicted to widen every year, it is of utmost importance to look closer at the design flow of such communication systems in order to find improvements. While various ideas for speeding up designs have been proposed, very few have found their path into existing EDA products. This paper presents requirements for such tools and shows how an open design environment offers a solution for integrating existing EDA tools, allowing for a consistent design flow and considerably speeding up design times.

  7. Use of two-part regression calibration model to correct for measurement error in episodically consumed foods in a single-replicate study design: EPIC case study.

    Directory of Open Access Journals (Sweden)

    George O Agogo

    In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates in the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in about a threefold increase in the strength of association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model. Moreover, the extent of adjustment for error is influenced by the number and forms of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
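
    To make the two-step structure concrete, here is a minimal sketch of a two-part calibration fit on synthetic data (the variable names, model forms, and lognormal retransformation are illustrative assumptions, not the EPIC implementation):

```python
# Part 1: probability of any consumption on the reference (24-h recall) day.
# Part 2: amount consumed, modeled on the log scale among consumers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
q = rng.gamma(2.0, 50.0, n)                        # error-prone FFQ intake
consumed = rng.random(n) < 1 / (1 + np.exp(-(0.01 * q - 1)))
r = np.where(consumed, np.exp(0.8 * np.log(q + 1) + rng.normal(0, 0.5, n)), 0.0)

X = sm.add_constant(np.log(q + 1))

part1 = sm.Logit((r > 0).astype(int), X).fit(disp=0)   # excess-zero part
p_hat = part1.predict(X)

pos = r > 0
part2 = sm.OLS(np.log(r[pos]), X[pos]).fit()           # amount part
sigma2 = part2.mse_resid

# Calibrated intake: P(consume | Q) * E[amount | consume, Q], using a
# lognormal retransformation (assumes homoskedastic log-scale errors).
calibrated = p_hat * np.exp(part2.predict(X) + sigma2 / 2)
print(calibrated[:5])
```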

  8. Dynamico, an Icosahedral Dynamical Core Designed for Consistency and Versatility

    Science.gov (United States)

    Dubos, T.

    2014-12-01

    … "DYNAMICO, an icosahedral hydrostatic dynamical core designed for consistency and versatility", in preparation.

  9. Two-part set systems

    CERN Document Server

    Gerbner, Dániel; Lemons, Nathan; Mubayi, Dhruv; Palmer, Cory; Patkós, Balázs

    2011-01-01

    The two-part Sperner theorem of Katona and Kleitman states that if $X$ is an $n$-element set with partition $X_1 \cup X_2$, and $\mathcal{F}$ is a family of subsets of $X$ such that no two sets $A, B \in \mathcal{F}$ satisfy $A \subset B$ (or $B \subset A$) and $A \cap X_i = B \cap X_i$ for some $i$, then $|\mathcal{F}| \le \binom{n}{\lfloor n/2 \rfloor}$. We consider variations of this problem by replacing the Sperner property with the intersection property and considering families that satisfy various combinations of these properties on one or both parts $X_1$, $X_2$. Along the way, we prove the following new result which may be of independent interest: let $\mathcal{F}, \mathcal{G}$ be families of subsets of an $n$-element set such that $\mathcal{F}$ and $\mathcal{G}$ are both intersecting and cross-Sperner, meaning that if $A \in \mathcal{F}$ and $B \in \mathcal{G}$ then $A \not\subset B$ and $B \not\subset A$ …

  10. Design of a Turbulence Generator of Medium Consistency Pulp Pumps

    OpenAIRE

    Hong Li; Haifei Zhuang; Weihao Geng

    2012-01-01

    The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. Structure sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with the research on the flow inside and shearing characteristics of the MC pulp, a simple mathematical model at the flow section of the shearing chamber is built …

  11. Design of a Turbulence Generator of Medium Consistency Pulp Pumps

    Directory of Open Access Journals (Sweden)

    Hong Li

    2012-01-01

    The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. Structure sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with the research on the flow inside and shearing characteristics of the MC pulp, a simple mathematical model at the flow section of the shearing chamber is built, and the formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a similar shape to the turbulence generator, and CFD simulation is applied to study the flow fields obtained with different blade laying angles. The recommended blade laying angle of the turbulence generator is found to be between 60° and 75°.

  12. Design and application of a power selling price system with a two-part load factor tariff

    Institute of Scientific and Technical Information of China (English)

    Sun, Sumiao (孙素苗)

    2015-01-01

    China's current electricity selling prices suffer from a complex classification, difficulties in execution, severe cross subsidies, and a structure that is not conducive to the rational allocation of power resources. Based on an in-depth analysis of the selling price classification framework, this paper designs a merger scheme that uses a two-part load factor tariff to combine the selling prices for large industry and for general industry and commerce. The results show that the merged scheme reflects customers' supply costs more rationally, reduces cross subsidies in the selling price, and promotes the rational allocation of power resources.

  13. Measuring consistency of web page design and its effects on performance and satisfaction.

    Science.gov (United States)

    Ozok, A A; Salvendy, G

    2000-04-01

    This study examines the methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of the world-wide web (WWW) user. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study has tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects of which the features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency, physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contribute to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.

  14. Establishing Interpretive Consistency When Mixing Approaches: Role of Sampling Designs in Evaluations

    Science.gov (United States)

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.

    2013-01-01

    The goal of this chapter is to recommend quality criteria to guide evaluators' selections of sampling designs when mixing approaches. First, we contextualize our discussion of quality criteria and sampling designs by discussing the concept of interpretive consistency and how it impacts sampling decisions. Embedded in this discussion are…

  15. New geometric design consistency model based on operating speed profiles for road safety evaluation.

    Science.gov (United States)

    Camacho-Torregrosa, Francisco J; Pérez-Zuriaga, Ana M; Campoy-Ungría, J Manuel; García-García, Alfredo

    2013-12-01

    To assist in the ongoing effort to reduce road fatalities as much as possible, this paper presents a new methodology to evaluate road safety in both the design and redesign stages of two-lane rural highways. This methodology is based on the analysis of road geometric design consistency, a value which serves as a surrogate measure of the safety level of the two-lane rural road segment. The consistency model presented in this paper is based on the consideration of continuous operating speed profiles. The models used for their construction were obtained by using an innovative GPS-data collection method based on continuous operating speed profiles recorded from individual drivers. This new methodology allowed the researchers to observe the actual behavior of drivers and to develop more accurate operating speed models than was previously possible with spot-speed data collection, thereby enabling a more accurate approximation to the real phenomenon and thus a better consistency measurement. Operating speed profiles were built for 33 Spanish two-lane rural road segments, and several consistency measurements based on the global and local operating speed were checked. The final consistency model takes into account not only the global dispersion of the operating speed, but also indexes that consider both local speed decelerations and speeds over posted speeds. For the development of the consistency model, the crash frequency for each study site was considered, which allowed the number of crashes on a road segment to be estimated from its geometric design consistency. Consequently, the presented consistency evaluation method is a promising innovative tool that can be used as a surrogate measure to estimate the safety of a road segment.

  16. A time-consistent video segmentation algorithm designed for real-time implementation

    OpenAIRE

    Elhassani, Mohammed; Rivasseau, Delphine; Jehan-Besson, Stéphanie; Revenu, Marinette; Tschumperlé, David; Brun, Luc; Duranton, Marc

    2006-01-01

    In this paper, we propose a time-consistent video segmentation algorithm designed for real-time implementation. Our segmentation algorithm is based on a region merging process that combines both spatial and motion information. The spatial segmentation takes benefit of an adaptive decision rule and a specific order of merging. Our method has proven to be efficient for the segmentation of natural images (flat or textured regions) with few parameters to be set. Temporal consistency …

  17. A Time-Consistent Video Segmentation Algorithm Designed for Real-Time Implementation

    Directory of Open Access Journals (Sweden)

    M. El Hassani

    2008-01-01

    Temporal consistency of the segmentation is ensured by incorporating motion information through the use of an improved change-detection mask. This mask is designed using both illumination differences between frames and region segmentation of the previous frame. By considering both pixel and region levels, we obtain a particularly efficient algorithm at a low computational cost, allowing its implementation in real-time on the TriMedia processor for CIF image sequences.
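
    As a rough illustration of the mask-construction idea (the thresholds, the fusion rule, and all names below are assumptions made for the sketch, not the paper's algorithm):

```python
# Combine a thresholded inter-frame illumination difference with the
# region segmentation of the previous frame: a whole previous region is
# flagged as changed when enough of its pixels moved.
import numpy as np

def change_detection_mask(prev_frame, cur_frame, prev_labels,
                          diff_thresh=15, region_frac=0.5):
    moving = np.abs(cur_frame.astype(int) - prev_frame.astype(int)) > diff_thresh
    mask = np.zeros_like(moving)
    for lbl in np.unique(prev_labels):
        region = prev_labels == lbl
        if moving[region].mean() > region_frac:
            mask |= region          # pixel evidence promoted to region level
    return mask
```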

  18. A Simple Instrument Designed to Provide Consistent Digital Facial Images in Dermatology

    OpenAIRE

    Balakrishnan Nirmal; Pai, Sathish B.; Handattu Sripathi

    2013-01-01

    Photography has proven to be a valuable tool in the field of dermatology. The major reason for poor photographs is the inability to produce comparable images in subsequent follow-ups. Combining digital photography with image processing software analysis brings consistency to the tracking of serial images. Digital photographs were taken with the aid of an instrument which we designed in our workshop to ensure that photographs were taken with identical patient positioning, camera angles and distances …

  19. Holistic and Consistent Design Process for Hollow Structures Based on Braided Textiles and RTM

    Science.gov (United States)

    Gnädinger, Florian; Karcher, Michael; Henning, Frank; Middendorf, Peter

    2014-06-01

    The present paper elaborates a holistic and consistent design process for 2D braided composites in conjunction with Resin Transfer Moulding (RTM). These technologies allow a cost-effective production of composites due to their high degree of automation. Literature can be found that deals with specific tasks of the respective technologies, but there is no work available that embraces the complete process chain. Therefore, an overall design process is developed within the present paper. It is based on a correlated conduction of sub-design processes for the braided preform, RTM injection, mandrel plus mould, and manufacturing. For each sub-process, both the individual tasks and reasonable methods to accomplish them are presented. The information flow within the design process is specified and interdependences are illustrated. Composite designers will be equipped with an efficient set of tools because the respective methods take the complexity of the part into account. The design process is applied to a demonstrator in a case study. The individual sub-design processes are accomplished exemplarily to judge the feasibility of the presented work. For validation, predicted braiding angles and fibre volume fractions are compared with measured ones, and a filling and curing simulation based on PAM-RTM is checked against mould filling studies. Tool concepts for an RTM mould and for mandrels that realise undercuts are tested. The individual process parameters for manufacturing are derived from previous design steps. Furthermore, the compatibility of the chosen fibre and matrix system is investigated based on scanning electron microscope (SEM) images. The annual production volume of the demonstrator part is estimated based on these findings.

  20. Hazard consistent structural demands and in-structure design response spectra

    Energy Technology Data Exchange (ETDEWEB)

    Houston, Thomas W [Los Alamos National Laboratory; Costantino, Michael C [Los Alamos National Laboratory; Costantino, Carl J [Los Alamos National Laboratory

    2009-01-01

    Current analysis methodology for the Soil Structure Interaction (SSI) analysis of nuclear facilities is specified in ASCE Standard 4. This methodology is based on the use of deterministic procedures, with the intention that enough conservatism is included in the specified procedures to achieve an 80% probability of non-exceedance in the computed response of a Structure, System, or Component for a given mean seismic design input. Recently developed standards are aimed at achieving performance-based, risk-consistent seismic designs that meet specified target performance goals. These design approaches rely upon accurately characterizing the probability (hazard) level of system demands due to seismic loads, consistent with Probabilistic Seismic Hazard Analyses. This paper examines the adequacy of the deterministic SSI procedures described in ASCE 4-98 to achieve an 80th percentile Non-Exceedance Probability (NEP) in structural demand, given a mean seismic input motion. The study demonstrates that the deterministic procedures provide computed in-structure response spectra that are near or greater than the target 80th percentile NEP for site profiles other than those resulting in high levels of radiation damping. The deterministic procedures do not appear to be as robust in predicting peak accelerations, which correlate to structural demands within the structure.

  1. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach which depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, important application issues are summarized, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  2. Consistency and bicharacteristic analysis of integral porosity shallow water models. Explaining model oversensitivity to mesh design

    Science.gov (United States)

    Guinot, Vincent

    2017-09-01

    The Integral Porosity and Dual Integral Porosity two-dimensional shallow water models have been proposed recently as efficient upscaled models for urban floods. Very little is known so far about their consistency and wave propagation properties. Simple numerical experiments show that both models are unusually sensitive to the computational grid. In the present paper, a two-dimensional consistency and characteristic analysis is carried out for these two models. The following results are obtained: (i) the models are almost insensitive to grid design when the porosity is isotropic, (ii) anisotropic porosity fields induce an artificial polarization of the mass/momentum fluxes along preferential directions when triangular meshes are used and (iii) extra first-order derivatives appear in the governing equations when regular, quadrangular cells are used. The hyperbolic system is thus mesh-dependent, and with it the wave propagation properties of the model solutions. Criteria are derived to make the solution less mesh-dependent, but it is not certain that these criteria can be satisfied at all computational points when real-world situations are dealt with.

  3. Approximation of the two-part MDL code

    NARCIS (Netherlands)

    Adriaans, P.; Vitányi, P.M.B.

    2009-01-01

    Approximation of the optimal two-part minimum description length (MDL) code for given data, through successive monotonically length-decreasing two-part MDL codes, has the following properties: (i) computation of each step may take arbitrarily long; (ii) we may not know when we reach the optimum, or …
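
    To fix ideas, here is a toy two-part code length in Python, illustrating the object being minimized (a restricted Bernoulli hypothesis class chosen for the example; this is not the paper's approximation scheme):

```python
# Two-part MDL: total bits = bits to state the hypothesis (a Bernoulli
# parameter p on a finite grid) + Shannon code length of the data under p.
# The MDL choice is the p minimizing the sum.
from math import log2

def two_part_code_length(bits, grid=100):
    n, k = len(bits), sum(bits)
    best = float("inf")
    for j in range(1, grid):            # candidate p = j/grid, excluding 0 and 1
        p = j / grid
        model_cost = log2(grid - 1)     # which grid point was used
        data_cost = -(k * log2(p) + (n - k) * log2(1 - p))
        best = min(best, model_cost + data_cost)
    return best

print(f"{two_part_code_length([1, 0, 1, 1, 0, 1, 1, 1]):.2f} bits")
```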

  4. Approximation of the two-part MDL code

    NARCIS (Netherlands)

    P. Adriaans; P.M.B. Vitányi (Paul)

    2009-01-01

    Approximation of the optimal two-part minimum description length (MDL) code for given data, through successive monotonically length-decreasing two-part MDL codes, has the following properties: (i) computation of each step may take arbitrarily long; (ii) we may not know when we reach the …

  5. Improving consistency in large laboratory courses: a design for a standardized practical exam.

    Science.gov (United States)

    Chen, Xinnian; Graesser, Donnasue; Sah, Megha

    2015-06-01

    Laboratory courses serve as important gateways to science, technology, engineering, and mathematics education. One of the challenges in assessing laboratory learning is to conduct meaningful and standardized practical exams, especially for large multisection laboratory courses. Laboratory practical exams in life sciences courses are frequently administered by asking students to move from station to station to answer questions, apply knowledge gained during laboratory experiments, interpret data, and identify various tissues and organs using various microscopic and gross specimens. This approach puts a stringent time limit on all questions regardless of the level of difficulty and also invariably increases the potential risk of cheating. To avoid potential cheating in laboratory courses with multiple sections, the setup for practical exams is often changed in some way between sections. In laboratory courses with multiple instructors or teaching assistants, practical exams may be handled inconsistently among different laboratory sections, due to differences in background knowledge, perceptions of the laboratory goals, or prior teaching experience. In this article, we describe a design for a laboratory practical exam that aims to align the assessment questions with well-defined laboratory learning objectives and improve the consistency among all laboratory sections.

  6. Cognitive Consistency Analysis in Adaptive Biometric Authentication System Design

    Directory of Open Access Journals (Sweden)

    Gahangir Hossain

    2015-07-01

    Cognitive consistency analysis aims to continuously monitor one's perception equilibrium towards the successful accomplishment of a cognitive task. In contrast to cognitive flexibility analysis, cognitive consistency analysis identifies monotonicity of perception towards a successful interaction process (e.g., biometric authentication) and is useful for generating decision support to assist users in need. This study considers fingertip dynamics (e.g., keystroke, tapping, and clicking) to gain insight into instantaneous cognitive states and their effects on monotonic advancement towards a successful authentication process. Keystroke dynamics and tapping dynamics are analyzed based on response time data. Finally, cognitive consistency and confusion (inconsistency) are computed with the Maximal Information Coefficient (MIC) and the Maximal Asymmetry Score (MAS), respectively. Our preliminary study indicates that a balance between cognitive consistency and flexibility is needed in a successful authentication process. Moreover, an adaptive, cognitive interaction system requires in-depth analysis of the user's cognitive consistency to provide robust and useful assistance.
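
    For readers who want to reproduce the two statistics, a minimal sketch assuming the minepy package (which implements the MINE family of statistics); the response-time series is synthetic and the parameter values are illustrative:

```python
import numpy as np
from minepy import MINE

rng = np.random.default_rng(1)
trial = np.arange(50, dtype=float)                      # successive attempts
response_time = 2.0 - 0.02 * trial + rng.normal(0, 0.1, 50)

mine = MINE(alpha=0.6, c=15)
mine.compute_score(trial, response_time)
print("MIC (consistency):  ", mine.mic())   # strength of the relationship
print("MAS (inconsistency):", mine.mas())   # departure from monotonicity
```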

  7. Teacher collaborative curriculum design in technical vocational colleges: a strategy for maintaining curriculum consistency?

    NARCIS (Netherlands)

    Albashiry, N.M.; Voogt, J.M.; Pieters, J.M.

    2015-01-01

    The Technical Vocational Education and Training (TVET) curriculum requires continuous renewal and constant involvement of stakeholders in the redesign process. Due to a lack of curriculum design expertise, TVET institutions in developing contexts encounter challenges maintaining and advancing the …

  8. Teacher collaborative curriculum design in technical vocational colleges: a strategy for maintaining curriculum consistency?

    NARCIS (Netherlands)

    Albashiry, N.M.; Voogt, J.M.; Pieters, J.M.

    2015-01-01

    The Technical Vocational Education and Training (TVET) curriculum requires continuous renewal and constant involvement of stakeholders in the redesign process. Due to a lack of curriculum design expertise, TVET institutions in developing contexts encounter challenges maintaining and advancing the quality and relevance of their programmes to the …

  9. Teacher Collaborative Curriculum Design in Technical Vocational Colleges: A Strategy for Maintaining Curriculum Consistency?

    Science.gov (United States)

    Albashiry, Nabeel M.; Voogt, Joke M.; Pieters, Jules M.

    2015-01-01

    The Technical Vocational Education and Training (TVET) curriculum requires continuous renewal and constant involvement of stakeholders in the redesign process. Due to a lack of curriculum design expertise, TVET institutions in developing contexts encounter challenges maintaining and advancing the quality and relevance of their programmes to the…

  10. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
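
    As a loose software analogy to the simplest form of check described (the interface description and all names here are assumptions for illustration, not the paper's formalism):

```python
# Syntactic interface check: verify that separately developed components
# agree on the declared parameter types of a shared interface.
from inspect import signature

def conforms(fn, param_types):
    sig = signature(fn)
    return [p.annotation for p in sig.parameters.values()] == param_types

def component_a(x: int, y: float) -> float: ...
def component_b(x: int, y: float) -> float: ...

IFACE = [int, float]
print(conforms(component_a, IFACE), conforms(component_b, IFACE))  # True True
```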

  11. Why do we not have a Consistent Design Method for Rubble Mound Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, Hans F.

    … of the art and the design tools are not satisfactory compared to those available in other branches of civil engineering, such as structural engineering. I shall try to explain the difficulties we are facing in breakwater engineering, especially for rubble mound breakwaters, by summarizing some … probability density functions of the involved parameters, supplied with statistical information on the related persistence. The following presentation is not in accordance with this, since each parameter is treated separately. This is done for the sake of simplicity and also because it will still serve the main …

  12. DYNAMICO, an icosahedral hydrostatic dynamical core designed for consistency and versatility

    Directory of Open Access Journals (Sweden)

    T. Dubos

    2015-02-01

    The design of the icosahedral dynamical core DYNAMICO is presented. DYNAMICO solves the multi-layer rotating shallow-water equations, a compressible variant of the same equivalent to a discretization of the hydrostatic primitive equations in a Lagrangian vertical coordinate, and the primitive equations in a hybrid mass-based vertical coordinate. The common Hamiltonian structure of these sets of equations is exploited to formulate energy-conserving spatial discretizations in a unified way. The horizontal mesh is a quasi-uniform icosahedral C-grid obtained by subdivision of a regular icosahedron. Control volumes for mass, tracers and entropy/potential temperature are the hexagonal cells of the Voronoi mesh, to avoid the fast numerical modes of the triangular C-grid. The horizontal discretization is that of Ringler et al. (2010), whose discrete quasi-Hamiltonian structure is identified. The prognostic variables are arranged vertically on a Lorenz grid with all thermodynamical variables collocated with mass. The vertical discretization is obtained from the three-dimensional Hamiltonian formulation. Tracers are transported using a second-order finite volume scheme with slope limiting for positivity. Explicit Runge-Kutta time integration is used for dynamics, and forward-in-time integration with horizontal/vertical splitting is used for tracers. Most of the model code is common to the three sets of equations solved, making it easier to develop and validate each piece of the model separately. Representative three-dimensional test cases are run and analyzed, showing correctness of the model. The design permits several extensions to be considered in the near future, from higher-order transport to more general dynamics, especially deep-atmosphere and non-hydrostatic equations.

  13. Development by design in Colombia: making mitigation decisions consistent with conservation outcomes.

    Science.gov (United States)

    Saenz, Shirley; Walschburger, Tomas; González, Juan Carlos; León, Jorge; McKenney, Bruce; Kiesecker, Joseph

    2013-01-01

    Mitigation policy and regulatory frameworks are consistent in their strong support for the mitigation hierarchy of: (1) avoiding impacts, (2) minimizing impacts, and then (3) offsetting/compensating for residual impacts. While mitigation frameworks require developers to avoid, minimize and restore biodiversity on-site before considering an offset for residual impacts, there is a lack of quantitative guidance for this decision-making process. What are the criteria for requiring impacts be avoided altogether? Here we examine how conservation planning can guide the application of the mitigation hierarchy to address this issue. In support of the Colombian government's aim to improve siting and mitigation practices for planned development, we examined five pilot projects in landscapes expected to experience significant increases in mining, petroleum and/or infrastructure development. By blending landscape-level conservation planning with application of the mitigation hierarchy, we can proactively identify where proposed development and conservation priorities would be in conflict and where impacts should be avoided. The approach we outline here has been adopted by the Colombian Ministry of Environment and Sustainable Development to guide licensing decisions, avoid piecemeal licensing, and promote mitigation decisions that maintain landscape condition.

  14. Development by design in Colombia: making mitigation decisions consistent with conservation outcomes.

    Directory of Open Access Journals (Sweden)

    Shirley Saenz

    Mitigation policy and regulatory frameworks are consistent in their strong support for the mitigation hierarchy of: (1) avoiding impacts, (2) minimizing impacts, and then (3) offsetting/compensating for residual impacts. While mitigation frameworks require developers to avoid, minimize and restore biodiversity on-site before considering an offset for residual impacts, there is a lack of quantitative guidance for this decision-making process. What are the criteria for requiring impacts be avoided altogether? Here we examine how conservation planning can guide the application of the mitigation hierarchy to address this issue. In support of the Colombian government's aim to improve siting and mitigation practices for planned development, we examined five pilot projects in landscapes expected to experience significant increases in mining, petroleum and/or infrastructure development. By blending landscape-level conservation planning with application of the mitigation hierarchy, we can proactively identify where proposed development and conservation priorities would be in conflict and where impacts should be avoided. The approach we outline here has been adopted by the Colombian Ministry of Environment and Sustainable Development to guide licensing decisions, avoid piecemeal licensing, and promote mitigation decisions that maintain landscape condition.

  15. A Cost-Effective Two-Part Experiment for Teaching Introductory Organic Chemistry Techniques

    Science.gov (United States)

    Sadek, Christopher M.; Brown, Brenna A.; Wan, Hayley

    2011-01-01

    This two-part laboratory experiment is designed to be a cost-effective method for teaching basic organic laboratory techniques (recrystallization, thin-layer chromatography, column chromatography, vacuum filtration, and melting point determination) to large classes of introductory organic chemistry students. Students are exposed to different…

  16. Architectural design of the pelvic floor is consistent with muscle functional subspecialization.

    Science.gov (United States)

    Tuttle, Lori J; Nguyen, Olivia T; Cook, Mark S; Alperin, Marianna; Shah, Sameer B; Ward, Samuel R; Lieber, Richard L

    2014-02-01

    Skeletal muscle architecture is the strongest predictor of a muscle's functional capacity. The purpose of this study was to define the architectural properties of the deep muscles of the female pelvic floor (PFMs) to elucidate their structure-function relationships. PFMs coccygeus (C), iliococcygeus (IC), and pubovisceral (PV) were harvested en bloc from ten fixed human cadavers (mean age 85 years, range 55-102). Fundamental architectural parameters of skeletal muscles [physiological cross-sectional area (PCSA), normalized fiber length, and sarcomere length (L(s))] were determined using validated methods. PCSA predicts muscle-force production, and normalized fiber length is related to muscle excursion. These parameters were compared using repeated measures analysis of variance (ANOVA) with post hoc t tests, as appropriate. Significance was set to α = 0.05. PFMs were thinner than expected based on data reported from imaging studies and in vivo palpation. Significant differences in fiber length were observed across PFMs: C = 5.29 ± 0.32 cm, IC = 7.55 ± 0.46 cm, PV = 10.45 ± 0.67 cm (p …). This design shows individual muscles demonstrating differential architecture, corresponding to specialized function in the pelvic floor.

  17. A two-part model for censored medical cost data.

    Science.gov (United States)

    Tian, Lu; Huang, Jie

    2007-10-15

    The two-part model is often used to analyse medical cost data which contain a large proportion of zero cost and are highly skewed with some large costs. The total medical costs over a period of time are often censored due to incomplete follow-up, making the analysis difficult as the censoring can be informative. We propose to apply the inverse probability weighting method on a two-part model to analyse right-censored cumulative medical costs with informative censoring. We also introduce a set of simple functionals based on the intermediate cost history to be applied with the efficiency augmentation technique. In addition, we propose a practical model-checking technique based on the cumulative residuals. Simulation studies are conducted to evaluate the finite sample performance of the proposed method. We use a data set on the cardiovascular disease (CVD)-related Medicare costs to illustrate our proposed method.
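
    The following sketch illustrates the inverse-probability-weighting idea on a two-part cost estimate (synthetic data; the Kaplan-Meier weighting and all names are assumptions made for the illustration, not the paper's estimator, which also adds efficiency augmentation and model checking):

```python
# IPW two-part estimate of mean cumulative cost under right censoring:
# complete observations are up-weighted by 1/G(T), where G is the
# Kaplan-Meier survival function of the censoring time.
import numpy as np

rng = np.random.default_rng(2)
n = 500
follow_up = rng.uniform(1, 10, n)               # time to full cost accrual
censor = rng.uniform(0, 12, n)                  # censoring time
t_obs = np.minimum(follow_up, censor)
complete = follow_up <= censor                  # cost fully observed?
cost = np.where(rng.random(n) < 0.3, 0.0, rng.lognormal(7, 1, n))

def censoring_survival(times, complete_flags, t):
    # Kaplan-Meier for the censoring distribution: a censoring event is
    # the "event"; a complete cost observation counts as censored here.
    s = 1.0
    for u in np.sort(np.unique(times[~complete_flags])):
        if u > t:
            break
        s *= 1 - np.sum((times == u) & ~complete_flags) / np.sum(times >= u)
    return s

w = np.array([1 / censoring_survival(t_obs, complete, t) if c else 0.0
              for t, c in zip(t_obs, complete)])

p_nonzero = np.sum(w * (cost > 0)) / np.sum(w)                     # part 1
mean_pos = np.sum(w * cost * (cost > 0)) / np.sum(w * (cost > 0))  # part 2
print("estimated mean cost:", p_nonzero * mean_pos)
```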

  18. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Directory of Open Access Journals (Sweden)

    Healey, Sean P.

    2012-10-01

    Background: Lidar height data collected by the Geoscience Laser Altimeter System (GLAS) from 2002 to 2008 have the potential to form the basis of a globally consistent, sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full-waveform "shots," which have been shown to be strongly correlated with aboveground forest biomass. Relationships observed at spatially coincident field plots may be used to model biomass at all GLAS shots, and well-established methods of model-based inference may then be used to estimate biomass and variance for specific spatial domains. However, the spatial pattern of GLAS acquisition is neither random across the surface of the earth nor identifiable with any particular systematic design. Undefined sample properties therefore hinder the use of GLAS in global forest sampling.

    Results: We propose a method of identifying a subset of the GLAS data which can justifiably be treated as a simple random sample in model-based biomass estimation. The relatively uniform spatial distribution and locally arbitrary positioning of the resulting sample is similar to the design used by the US national forest inventory (NFI). We demonstrated model-based estimation using a sample of GLAS data in the US state of California, where our estimate of biomass (211 Mg/hectare) was within the 1.4% standard error of the design-based estimate supplied by the US NFI. The standard error of the GLAS-based estimate was significantly higher than that of the NFI estimate, although the cost of the GLAS estimate (excluding costs for the satellite itself) was almost nothing, compared to at least US$ 10.5 million for the NFI estimate.

    Conclusions: Global application of model-based estimation using GLAS, while demanding significant consolidation of training data, would improve inter-comparability of international biomass estimates by imposing consistent methods and a globally coherent sample frame. The …

  19. Much ado about two: reconsidering retransformation and the two-part model in health econometrics.

    Science.gov (United States)

    Mullahy, J

    1998-06-01

    In health economics applications involving outcomes (y) and covariates (x), it is often the case that the central inferential problems of interest involve E[y|x] and its associated partial effects or elasticities. Many such outcomes have two fundamental statistical properties: y ≥ 0, and the outcome y = 0 is observed with sufficient frequency that the zeros cannot be ignored econometrically. This paper (1) describes circumstances where the standard two-part model with homoskedastic retransformation will fail to provide consistent inferences about important policy parameters; and (2) demonstrates some alternative approaches that are likely to prove helpful in applications.
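
    For context, the quantity at issue in a standard two-part model, written out with a homoskedastic lognormal retransformation (the specific functional forms are a common textbook choice, shown here for illustration):

```latex
\[
  E[y \mid x] \;=\; \Pr(y > 0 \mid x)\, E[y \mid y > 0,\, x],
\]
% if, among users, \ln y \mid x \sim N(x\beta, \sigma^2) with \sigma^2 constant in x:
\[
  E[y \mid x] \;=\; \Pr(y > 0 \mid x)\, \exp\!\left(x\beta + \tfrac{1}{2}\sigma^2\right).
\]
```

    If σ² actually varies with x (heteroskedasticity), the retransformation factor exp(σ²/2) is no longer constant, and the homoskedastic formula above yields inconsistent estimates of E[y|x] and its elasticities, which is the failure mode the paper describes.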

  20. Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design.

    Science.gov (United States)

    Das, Sanjoy Kumar; Khanam, Jasmina; Nanda, Arunabha

    2016-12-01

    In the present investigation, a simplex lattice mixture design was applied for the formulation development and optimization of a controlled release dosage form of ketoprofen microspheres consisting of polymer blends of ethylcellulose and Eudragit(®)RL 100, formed by an oil-in-oil emulsion solvent evaporation method. The investigation examined the effects of polymer amount, stirring speed and emulsifier concentration (% w/w) on percentage yield, average particle size, drug entrapment efficiency and in vitro drug release over 8 h from the microspheres. Analysis of variance (ANOVA) was used to estimate the significance of the models. Numerical optimization was carried out based on the desirability function approach. The optimized formulation (KTF-O) showed a close match between actual and predicted responses, with a desirability factor of 0.811. No adverse reaction between drug and polymers was observed on the basis of Fourier transform infrared (FTIR) spectroscopy and differential scanning calorimetry (DSC) analysis. Scanning electron microscopy (SEM) was carried out to show the discreteness of the microspheres (149.2 ± 1.25 μm) and their surface conditions during pre- and post-dissolution operations. The drug release pattern from KTF-O was best explained by the Korsmeyer-Peppas and Higuchi models. The batch of optimized microspheres was found to have maximum entrapment (~90%), minimum loss (~10%) and prolonged drug release over 8 h (91.25%), which may be considered favourable criteria for a controlled release dosage form.
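
    For readers unfamiliar with the design, a short sketch of how the candidate blends of a simplex lattice are generated (the {3, 2} lattice below is an illustration; the study's actual factor levels are not reproduced here):

```python
# A {q, m} simplex-lattice design: all q-component mixtures whose
# proportions are multiples of 1/m and sum to 1.
from itertools import product
from fractions import Fraction

def simplex_lattice(q, m):
    levels = [Fraction(i, m) for i in range(m + 1)]
    return [pt for pt in product(levels, repeat=q) if sum(pt) == 1]

for point in simplex_lattice(q=3, m=2):
    print([str(x) for x in point])
# 6 runs: the 3 pure blends plus the 3 binary 50:50 blends
```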

  1. An Analytic Creativity Assessment Scale for Digital Game Story Design: Construct Validity, Internal Consistency and Interrater Reliability

    Science.gov (United States)

    Chuang, Tsung-Yen; Huang, Yun-Hsuan

    2015-01-01

    Mobile technology has rapidly made digital games a popular form of entertainment for the digital generation, and thus digital game design has received considerable attention in both the game industry and design education. Digital game design involves diverse dimensions, among which digital game story design (DGSD) particularly attracts our interest, as the…

  2. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Science.gov (United States)

    Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman

    2012-01-01

    Lidar height data collected by the Geosciences Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...

  3. A Bayesian two part model applied to analyze risk factors of adult mortality with application to data from Namibia.

    Directory of Open Access Journals (Sweden)

    Lawrence N Kazembe

    Despite remarkable gains in life expectancy and declining mortality in the 21st century, in many places, mostly in developing countries, adult mortality has increased, in part due to HIV/AIDS or continued abject poverty levels. Moreover, many factors, including behavioural, socio-economic and demographic variables, work simultaneously to impact the risk of mortality. Understanding the risk factors of adult mortality is crucial for designing appropriate public health interventions. In this paper we propose a structured additive two-part random effects regression model for adult mortality data. Our proposal assumes two processes: (i) whether death occurred in the household (prevalence part), and (ii) the number of reported deaths, if death did occur (severity part). The proposed model specification therefore consists of two generalized linear mixed models (GLMMs) with correlated random effects that permit structured and unstructured spatial components at the regional level. Specifically, the first part assumes a GLMM with a logistic link and the second part explores a count model following either a Poisson or negative binomial distribution. The model was used to analyse adult mortality data of 25,793 individuals from the 2006/2007 Namibian DHS data. Inference is based on the Bayesian framework, with appropriate priors discussed.
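
    A stripped-down version of the two-part (hurdle) likelihood being described, without the random and spatial effects and written as a plain maximum-likelihood sketch rather than the paper's Bayesian formulation (all names are illustrative):

```python
# Part 1: logistic model for any death in the household.
# Part 2: zero-truncated Poisson for the number of deaths, given at least one.
import numpy as np
from scipy.special import expit, gammaln

def hurdle_loglik(params, X, y):
    k = X.shape[1]
    gamma, beta = params[:k], params[k:]       # coefficients of the two parts
    p = expit(X @ gamma)                       # P(y > 0 | x)
    lam = np.exp(X @ beta)                     # Poisson rate among positives
    ll = np.log1p(-p)[y == 0].sum()            # households with no deaths
    pos = y > 0
    ll += (np.log(p[pos])                      # death occurred, and ...
           + y[pos] * np.log(lam[pos]) - lam[pos] - gammaln(y[pos] + 1)
           - np.log1p(-np.exp(-lam[pos]))      # ... zero-truncation term
          ).sum()
    return ll
```

    Minimizing the negative of this with, e.g., scipy.optimize.minimize recovers both coefficient vectors jointly; the paper instead places priors on them and adds structured/unstructured regional random effects.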

  4. Solid consistency

    Science.gov (United States)

    Bordin, Lorenzo; Creminelli, Paolo; Mirbabayi, Mehrdad; Noreña, Jorge

    2017-03-01

    We argue that isotropic scalar fluctuations in solid inflation are adiabatic in the super-horizon limit. During the solid phase this adiabatic mode has peculiar features: constant energy-density slices and comoving slices do not coincide, and their curvatures, parameterized respectively by ζ and ℛ, both evolve in time. The existence of this adiabatic mode implies that Maldacena's squeezed-limit consistency relation holds after angular average over the long mode. The correlation functions of a long-wavelength spherical scalar mode with several short scalar or tensor modes are fixed by the scaling behavior of the correlators of the short modes, independently of the solid inflation action or the dynamics of reheating.

  5. A Comparison of Dimensional Accuracy of Addition Silicone of Different Consistencies with Two Different Spacer Designs - In-vitro Study

    Science.gov (United States)

    Eswaran, B; Eswaran, MA; Prabhu, R; Geetha, KR; Krishna, GP; Jagadeshwari

    2014-01-01

    Introduction: Dimensional accuracy of impression materials is crucial for the production of working casts in fixed prosthodontics. The accurate replication of tooth preparations and their arch position requires impression materials that exhibit limited distortion. Methods: This study comparatively evaluated the dimensional accuracy of addition silicones under two different techniques and spacer designs by measuring the linear changes in interpreparation distance. The impressions were made from a stainless steel master die simulating a three-unit bridge. A total of 80 die stone (type IV, Ultrarock) models were obtained from impressions made under two different parameters: the impression technique (multimix versus monophasic) and the spacer design. Result: The interpreparation distance of the abutments in the casts was measured using a travelling microscope. Each sample was measured thrice and the mean value was calculated. The results obtained were statistically analysed, and the values fall within the clinically acceptable range. Conclusion: The most accurate combination is the multimix technique with the spacer design that uses less bulk of impression material. PMID:25177635

  6. Designing Functionalized Nanoparticles for Controlled Assembly in Polymer Matrix: Self consistent PRISM Theory and Monte Carlo simulation Study

    Science.gov (United States)

    Jayaraman, Arthi; Nair, Nitish

    2011-03-01

    Significant interest has grown around the ability to create hybrid materials with controlled spatial arrangement of nanoparticles mediated by a polymer matrix. By functionalizing or grafting polymers onto nanoparticle surfaces and systematically tuning the composition, chemistry, molecular weight and grafting density of the grafted polymers, one can tailor the inter-particle interactions and control the assembly/dispersion of the particles in the polymer matrix. In our recent work using self-consistent Polymer Reference Interaction Site Model (PRISM) theory combined with Monte Carlo simulations, we have shown that tailoring the monomer sequences in the grafted copolymers provides a novel route to tuning the effective inter-particle interactions between functionalized nanoparticles in a polymer matrix. In this talk I will present how the monomer sequence and molecular weights (with and without polydispersity) of the grafted polymers, the compatibility of the graft and matrix polymers, and the nanoparticle size affect the chain conformations of the grafted polymers and the potential of mean force between the grafted nanoparticles in the matrix.

  7. Self-Consistent Field Theory for the Design of Thermoplastic Elastomers from Miktoarm Block Copolymer - Homopolymer Blends

    Science.gov (United States)

    Hamilton, Andrew Lawrence

    We have used self-consistent field theory to study the morphological characteristics of blends of miktoarm block copolymers and homopolymers. More specifically, we have studied the effects of segregation strength, miktoarm block copolymer composition, and homopolymer size and volume fraction on the phase diagrams of these systems. A15 domains with discrete A-monomer spherical domains were found to be stable with A-monomer loading fractions at least as high as 52%. Hexagonally-packed cylindrical domains were found to be stable at A-monomer loadings at least as high as 72%. These findings represent a significant improvement over the loading fractions of 43% and 60% reported by Lynd et al. for spherical and cylindrical domains in neat miktoarm block copolymers, respectively. It is also quite possible that even greater loading fractions are achievable in systems too large for our simulations. These results predict exciting new materials for next-generation thermoplastic elastomers, since the ideal TPE has a large loading of A monomers in discrete, crystalline or glassy domains, surrounded by a continuous matrix of elastomeric B domains. Additionally, we have performed SCFT simulations modelled after experimental blends of polystyrene and polyisoprene-based miktoarm block copolymers and homopolymers. Certain experimental samples showed fascinating new "bricks and mortar" phases and swollen asymmetric lamellar phases. In both cases, the A domains are highly swollen with homopolymer, forcing the miktoarm block copolymer to segregate near the interface and adopt the role of a surfactant. The resulting structures maintain separate A and B domains, but lack long-range order. While it is not possible to study these mesophases using SCFT, since they lack long-range order and therefore well-defined symmetry, our SCFT results show the onset of macrophase separation at similar homopolymer loadings for both the bricks-and-mortar phases and the highly swollen lamellae. …

  8. Calibrating Reaction Enthalpies: Use of Density Functional Theory and the Correlation Consistent Composite Approach in the Design of Photochromic Materials.

    Science.gov (United States)

    Letterman, Roger G; DeYonker, Nathan J; Burkey, Theodore J; Webster, Charles Edwin

    2016-12-22

    Acquisition of highly accurate energetic data for chromium-containing molecules and various chromium carbonyl complexes is a major step toward calibrating bond energies and thermal isomerization energies from mechanisms for Cr-centered photochromic materials being developed in our laboratories. The performance of six density functionals in conjunction with seven basis sets, utilizing Gaussian-type orbitals, has been evaluated for the calculation of gas-phase enthalpies of formation and enthalpies of reaction at 298.15 K for various chromium-containing systems. Nineteen molecules were examined: Cr(CO)6, Cr(CO)5, Cr(CO)5(C2H4), Cr(CO)5(C2ClH3), Cr(CO)5(cis-(C2Cl2H2)), Cr(CO)5(gem-(C2Cl2H2)), Cr(CO)5(trans-(C2Cl2H2)), Cr(CO)5(C2Cl3H), Cr(CO)5(C2Cl4), CrO2, CrF2, CrCl2, CrCl4, CrBr2, CrBr4, CrOCl2, CrO2Cl2, CrOF2, and CrO2F2. The performance of 69 density functionals in conjunction with a single basis set utilizing Slater-type orbitals (STO) and a zeroth-order relativistic approximation was also evaluated for the same test set. Values derived from density functional theory were compared to experimental values where available, or to values derived from the correlation consistent composite approach (ccCA). When all reactions were considered, the functionals that exhibited the smallest mean absolute deviations (MADs, in kcal mol⁻¹) from ccCA-derived values were B97-1 (6.9), VS98 (9.0), and KCIS (9.4) in conjunction with quadruple-ζ STO basis sets, and B97-1 (9.3) in conjunction with cc-pVTZ basis sets. When considering only the set of gas-phase reaction enthalpies (ΔrH°gas), the functionals that exhibited the smallest MADs from ccCA-derived values were B97-1 in conjunction with cc-pVTZ basis sets (9.1) and PBEPBE in conjunction with a polarized valence triple-ζ basis set/effective core potential combination for Cr and augmented, multiply polarized triple-ζ Pople-style basis sets (9.5). Also of interest, certainly because of known cancellation of errors, PBEPBE with the …

  9. Consistency in Distributed Systems

    OpenAIRE

    Kemme, Bettina; Ramalingam, Ganesan; Schiper, André; Shapiro, Marc; Vaswani, Kapil

    2013-01-01

    In distributed systems, there exists a fundamental trade-off between data consistency, availability, and the ability to tolerate failures. This trade-off has significant implications on the design of the entire distributed computing infrastructure, such as storage systems, compilers and runtimes, application development frameworks and programming languages. Unfortunately, it also has significant, and poorly understood, implications for the designers and developers of en…

  10. Sample selection versus two-part models revisited: the case of female smoking and drinking.

    Science.gov (United States)

    Madden, David

    2008-03-01

    There is a well-established debate between Heckman sample selection and two-part models in health econometrics, particularly when no obvious exclusion restrictions are available. Most of this debate has focussed on the application of these models to health care expenditure. This paper revisits the debate in the context of female smoking and drinking, and evaluates the two approaches on three grounds: theoretical, practical and statistical. The two-part model is generally favoured but it is stressed that this comparison should be carried out on a case-by-case basis.
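
    Since the debate is about estimator choice, a compact synthetic-data sketch of the two competing estimators may help (the functional forms, names, and data are illustrative assumptions, not the paper's specification):

```python
# Two-part model: P(participation) model plus OLS on participants only.
# Heckman two-step: the same OLS augmented with an inverse Mills ratio.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 3000
x = rng.normal(size=n)
X = sm.add_constant(x)
u, e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], n).T
participate = (0.5 + 0.8 * x + u) > 0              # any smoking/drinking
y = np.where(participate, 1.0 + 0.5 * x + e, 0.0)  # observed amount

probit = sm.Probit(participate.astype(int), X).fit(disp=0)
ols_two_part = sm.OLS(y[participate], X[participate]).fit()

imr = norm.pdf(X @ probit.params) / norm.cdf(X @ probit.params)
X_heck = np.column_stack([X[participate], imr[participate]])
ols_heckman = sm.OLS(y[participate], X_heck).fit()

print("two-part:", ols_two_part.params)
print("heckman: ", ols_heckman.params)             # last entry: IMR coefficient
```

    With correlated errors, as simulated here, the Heckman step shifts the slope estimate; with no exclusion restriction, identification rests entirely on the nonlinearity of the inverse Mills ratio, which is one practical ground on which the paper weighs the two approaches.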

  11. A Study on Digital Analysis of Bach’s “Two-Part Inventions”

    Directory of Open Access Journals (Sweden)

    Xiao-Yi Song

    2015-01-01

    In the field of music composition, creating polyphony is one of the most difficult tasks; the basis of multi-voice polyphonic composition is two-part counterpoint. The main purpose of this paper is to conduct, through computer technology, a series of studies on the "Two-Part Inventions" of Bach, a master of Baroque polyphony. Based on digitalization, visualization and mathematical methods, a data mining algorithm has been applied to identify the two-part characteristics and rules of contrapuntal polyphony. We hope that the conclusions drawn in the article can be applied to the digital creation of polyphony.

  12. Development of the two-part pattern during regeneration of the head in hydra

    DEFF Research Database (Denmark)

    Bode, Matthias; Awad, T A; Koizumi, O;

    1988-01-01

    The head of a hydra is composed of two parts, a domed hypostome with a mouth at the top and a ring of tentacles below. When animals are decapitated a new head regenerates. During the process of regeneration the apical tip passes through a transient stage in which it exhibits tentacle...... began evaginating in a ring, both the TS-19 antigen and RLI+ ganglion cells gradually disappeared from the presumptive hypostome area and RLI+ sensory cells appeared at the apex. By tracking tissue movements during morphogenesis it became clear that the apical cap, in which these changes took place, did...... not undergo tissue turnover. The implications of this tentacle-like stage for patterning the two-part head are discussed....

  13. The Corynebacterium xerosis composite transposon Tn5432 consists of two identical insertion sequences, designated IS1249, flanking the erythromycin resistance gene ermCX.

    Science.gov (United States)

    Tauch, A; Kassing, F; Kalinowski, J; Pühler, A

    1995-09-01

    Analysis of the 50-kb R-plasmid pTP10 from the clinical isolate Corynebacterium xerosis M82B revealed that the erythromycin resistance gene, ermCX, is located on a 4524-bp composite transposable element, Tn5432. The ends of Tn5432 are identical, direct repeats of an insertion sequence, designated IS1249, encoding a putative transposase of the IS256 family. IS1249 consists of 1385 bp with 45/42 imperfect terminal inverted repeats. The nucleotide sequence of the 1754-bp Tn5432 central region is 99% identical to the previously sequenced erythromycin resistance region of the Corynebacterium diphtheriae plasmid pNG2. It encodes the erythromycin resistance gene, ermCX, and an ORF homologous to the amino-terminal end of the transposase of IS31831 from Corynebacterium glutamicum. Transposons with regions flanking the insertion sites were recovered from the C. glutamicum chromosome by a plasmid rescue technique. Insertion of Tn5432 created 8-bp target site duplications. A Tn5432-induced isoleucine/valine-auxotrophic mutant was found to carry the transposon in the 5' region of the ilvBNC cluster; in pTP10 the transposon is inserted in a region similar to replication and partitioning functions of the Enterococcus faecalis plasmid pAD1 and the Agrobacterium tumefaciens plasmid pTAR.

  14. Total System Performance Assessment-License Application Design Selection (LADS) Phase 1 Analysis of Surface Modification Consisting of Addition of Alluvium (Feature 23a)

    Energy Technology Data Exchange (ETDEWEB)

    N. Erb

    1999-06-11

    The objective of this report is to document the analysis that was conducted to evaluate the effect of a potential change to the TSPA-VA base case design that could improve long-term repository performance. The design feature evaluated in this report is a modification of the topographic surface of Yucca Mountain. The modification consists of covering the land surface immediately above the repository footprint with a thick layer of unconsolidated material, utilizing rip-rap and plants to mitigate erosion. This surface modification is designated as Feature 23a, or simply abbreviated as F23a. The fundamental aim of F23a is to reduce the net infiltration into the unsaturated zone by enhancing the potential for evapotranspiration at the surface; such a change would, in turn, reduce the seepage flux and the rate of radionuclide releases from the repository. Field and modeling studies of water movement in the unsaturated zone have indicated that shallow infiltration at the surface is almost negligible in locations where the bedrock is covered by a sufficiently thick soil layer. In addition to providing storage for meteoric water, a thick soil layer would slow the downward movement of soil moisture to such an extent that evaporation and transpiration could easily transfer most of the soil water back to the atmosphere. Generic requirements for the effectiveness of this design feature are twofold. First, the soil layer above the repository footprint must be thick enough to provide sufficient storage of meteoric water (from episodic precipitation events) and accommodate plant roots. Second, the added soil layer must be engineered so as to mitigate thinning by erosional processes and have sufficient thickness to accommodate the roots of common desert plants. Under these two conditions, it is reasonable to expect that the modification would be effective for a significant time period and that the net infiltration and deep percolation flux would be reduced by orders of magnitude.
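
    The storage-and-evapotranspiration mechanism described above can be caricatured with a one-bucket water balance: storms fill the soil store, evapotranspiration drains it, and only overflow becomes deep percolation. All depths and rates below are illustrative assumptions, not Yucca Mountain parameters.

```python
# One-bucket water balance sketch of why a thick alluvial cover cuts net
# infiltration: storms fill soil storage, evapotranspiration (ET) empties it,
# and only overflow percolates. Depths in mm; all values illustrative.
capacity_mm = 300.0   # storage of the added alluvium layer
et_mm_day = 2.0       # daily evapotranspiration draw

def simulate(rain_series_mm):
    store, percolation = 0.0, 0.0
    for rain in rain_series_mm:
        store += rain
        if store > capacity_mm:              # storage exceeded -> deep flux
            percolation += store - capacity_mm
            store = capacity_mm
        store = max(store - et_mm_day, 0.0)  # ET then removes stored water
    return percolation

dry_year = [0.0] * 360 + [40.0] * 5          # a few episodic storms
print(simulate(dry_year))                    # ~0: the cover absorbs the pulses
```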

  15. Physical activity versus cardiorespiratory fitness: two (partly) distinct components of cardiovascular health?

    Science.gov (United States)

    DeFina, Laura F; Haskell, William L; Willis, Benjamin L; Barlow, Carolyn E; Finley, Carrie E; Levine, Benjamin D; Cooper, Kenneth H

    2015-01-01

    Physical activity (PA) and cardiorespiratory fitness (CRF) both have inverse relationships to cardiovascular (CV) morbidity and mortality. Recent position papers and guidelines have identified the important role of both of these factors in CV health. The benefits of PA and CRF in the prevention of CV disease and risk factors are reviewed. In addition, assessment methodology and utilization in the research and clinical arenas are discussed. Finally, the benefits, methodology, and utilization are compared and contrasted to better understand the two (partly) distinct components and their impact on CV health.

  16. Two-part silicone mold. A new tool for flexible ureteroscopy surgical training

    Directory of Open Access Journals (Sweden)

    Bruno Marroig

    Full Text Available ABSTRACT Introduction and objectives: Flexible ureteroscopy is a common procedure nowadays. Most training programs use virtual reality simulators. The aim of this study was to standardize the building of a three-dimensional silicone mold (cavity) of the collecting system, on the basis of polyester resin endocasts, which can be used in surgical training programs. Materials and Methods: A yellow polyester resin was injected into the ureter to fill the collecting system of 24 fresh cadaveric human kidneys. After the resin had set, the kidneys were immersed in hydrochloric acid until total corrosion of the organic matter was achieved and the collecting system endocasts were obtained. The endocasts were used to prepare white two-part silicone molds which, after the endocasts were withdrawn, enabled insertion of a ureteroscope into the collecting system molds (cavities). Also, the minor calices were painted with different colors in order to map the access to the different caliceal groups. The cost of the materials used in the molds is $30.00 and two days are needed to build them. Results: A flexible ureteroscope could be inserted into all molds and the entire collecting system could be examined. Since some anatomical features, such as infundibular length, acute angles, and perpendicular minor calices, may hinder access to some minor calices, especially in the lower caliceal group, surgical training in models leads to better surgical results. Conclusions: The two-part silicone mold is feasible, cheap and suitable for flexible ureteroscopy surgical training.

  17. Do case-only designs yield consistent results between them and across different databases (DB)? Hip fractures associated with Benzodiazepines (BZD) as a case study

    NARCIS (Netherlands)

    Requena, Gema; Logie, John; Martin, Elisa; Huerta, Consuelo; González-González, Rocio; Boudiaf, Nada; Álvarez, Arturo; Bate, Andrew; García-Rodríguez, Luis A.; Reynolds, Robert; Schlienger, Raymond G.; De Groot, Mark C.H.; Klungel, Olaf H.; De Abajo, Francisco J.; Douglas, Ian

    2014-01-01

    Background: The case crossover (CXO) and self-controlled case series (SCCS) designs are increasingly used in pharmacoepidemiology. In both designs relative risk estimates are obtained within persons rather than between persons, thus implicitly controlling for fixed confounding variables. Objectives: T

  18. Research on Estimates of Xi’an City Life Garbage Pay-As-You-Throw Based on Two-part Tariff method

    Science.gov (United States)

    Yaobo, Shi; Xinxin, Zhao; Fuli, Zheng

    2017-05-01

    Domestic waste is a quasi-public good whose pricing cannot be separated from the pricing methods of public economics. Based on the two-part tariff method used for urban public utilities, this paper designs a pricing model to match the charging method and estimates a pay-as-you-throw standard using data from the past five years in Xi’an. Finally, the paper summarizes the main results and proposes corresponding policy recommendations.
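
    A minimal sketch of the pricing structure the paper builds on: under a two-part tariff, each household pays a fixed access fee that recovers fixed costs, plus a volumetric charge per unit thrown. All figures are invented placeholders, not the paper's Xi’an estimates.

```python
# A two-part tariff splits the charge into a fixed access fee that recovers
# fixed costs and a volumetric fee set at marginal cost. All numbers are
# illustrative placeholders, not Xi'an data.
fixed_cost_total = 1.2e8   # yuan/year: collection infrastructure
households = 2.5e6
marginal_cost = 0.35       # yuan per kg of garbage disposed

def annual_charge(kg_thrown: float) -> float:
    """Pay-as-you-throw bill under a two-part tariff."""
    access_fee = fixed_cost_total / households      # fixed part
    return access_fee + marginal_cost * kg_thrown   # plus usage part

print(f"{annual_charge(400):.2f} yuan")  # a household throwing 400 kg/year
```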

  19. New Classification Methods for Hiding Information into Two Parts: Multimedia Files and Non Multimedia Files

    CERN Document Server

    Alanazi, Hamdan O; Zaidan, B B; Jalab, Hamid A; AL-Ani, Zaidoon Kh

    2010-01-01

    With the rapid development of various multimedia technologies, more and more multimedia data are generated and transmitted in the medical, commercial, and military fields, which may include sensitive information that should not be accessed by, or can only be partially exposed to, general users. Security and privacy have therefore become important concerns. Another problem with digital documents and video is that undetectable modifications can be made with very simple and widely available equipment, which puts digital material intended for evidential purposes under question. With the large flood of information and the development of digital formats, information hiding is considered one of the techniques used to protect important information. The main goal of this paper is to provide a general overview of the new classification methods for hiding information into two parts: multimedia files and non-multimedia files.

  20. Chip Multithreaded Consistency Model

    Institute of Scientific and Technical Information of China (English)

    Zu-Song Li; Dan-Dan Huan; Wei-Wu Hu; Zhi-Min Tang

    2008-01-01

    Multithreading is the developing trend of high-performance processors, and the memory consistency model is essential to the correctness, performance and complexity of a multithreaded processor. A chip multithreaded consistency model adapted to multithreaded processors is proposed in this paper. The restriction imposed on memory event ordering by chip multithreaded consistency is presented and formalized. Using the idea of the critical cycle built by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the correctness criterion of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving high performance compared with the sequential consistency model and ensures software compatibility, in that the execution result on a multithreaded processor is the same as the execution result on a uniprocessor. The implementation strategy of the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed: Godson-2 supports the model through an exception scheme based on the sequential memory access queue of each thread.

  1. Two Views of Islam: Ceramic Tile Design and Miniatures.

    Science.gov (United States)

    Macaulay, Sara Grove

    2001-01-01

    Describes an art project focusing on Islamic art that consists of two parts: (1) ceramic tile design; and (2) Islamic miniatures. Provides background information on Islamic art and step-by-step instructions for designing the Islamic tile and miniature. Includes learning objectives and resources on Islamic tile miniatures. (CMK)

  2. The U. S. transportation sector in the year 2030: results of a two-part Delphi survey.

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, G.; Stephens, T.S. (Energy Systems); (Univ. of California at Davis); (ES)

    2011-10-11

    A two-part Delphi survey was given to transportation experts attending the Asilomar Conference on Transportation and Energy in August 2011. The survey asked respondents about trends in the US transportation sector in 2030. Topics included: alternative vehicles, high speed rail construction, rail freight transportation, average vehicle miles traveled, truck versus passenger car shares, vehicle fuel economy, and biofuels in different modes. The survey consisted of two rounds, both asking the same set of seven questions. In the first round, respondents were given a short introductory paragraph about the topic and asked to use their own judgment in their responses. In the second round, the respondents were asked the same questions, but were also given results from the first round as guidance. The survey was sponsored by Argonne National Laboratory (ANL) and the National Renewable Energy Laboratory (NREL), and implemented by the University of California at Davis, Institute of Transportation Studies. The survey was part of the larger Transportation Energy Futures (TEF) project run by the Department of Energy, Office of Energy Efficiency and Renewable Energy. Of the 206 invitation letters sent, 94 recipients answered all questions in the first round (105 answered at least one question), and 23 of those answered all questions in the second round. Ten of the 23 second-round responses were collected at a discussion section at Asilomar, while the remainder were submitted online. Means and standard deviations of responses from Rounds One and Two are given in Table 1 below. One main purpose of Delphi surveys is to reduce the variance in opinions through successive rounds of questioning. As shown in Table 1, the standard deviations of 25 of the 30 individual sub-questions decreased between Round One and Round Two, but the decrease was slight in most cases.
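
    The convergence check behind Table 1 is easy to reproduce: compare the spread of responses between rounds. The sketch below uses made-up responses for a single hypothetical sub-question.

```python
import numpy as np

# Toy responses for one survey question (e.g. projected fleet fuel economy
# in 2030, mpg); the values are made up for illustration.
round1 = np.array([38, 45, 52, 60, 41, 55, 48, 70, 44, 50])
round2 = np.array([44, 46, 50, 55, 45, 52, 48, 58, 46, 49])

# A Delphi process aims to shrink disagreement between rounds, so compare
# standard deviations (25 of 30 sub-questions shrank in the survey).
for name, r in [("round 1", round1), ("round 2", round2)]:
    print(f"{name}: mean={r.mean():.1f}, sd={r.std(ddof=1):.1f}")
```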

  3. Marijuana Use among Juvenile Arrestees: A Two-Part Growth Model Analysis

    Science.gov (United States)

    Dembo, Richard; Wareham, Jennifer; Greenbaum, Paul E.; Childs, Kristina; Schmeidler, James

    2009-01-01

    This article examines the impact of sociodemographic characteristics and psychosocial factors on the probability and frequency of marijuana use and, for youths initiating use, on their frequency of use over four time points. The sample consists of 278 justice-involved youths completing at least one of three follow-up interviews as part of a…

  5. The Graphic Design for "Kiss Tibet with Camera Lens"

    Institute of Scientific and Technical Information of China (English)

    WuYang

    2004-01-01

    I have lectured on graphic design for two weeks to sophomores studying photography. They are immature in basic skills and in the use of design languages, but their creativity exceeded my expectations. My course consists of two parts. One is to make a realistic graphic design for a book according to the publisher's requirements. The

  6. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of the series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  7. No consistent bimetric gravity?

    CERN Document Server

    Deser, S; Waldron, A

    2013-01-01

    We discuss the prospects for a consistent, nonlinear, partially massless (PM), gauge symmetry of bimetric gravity (BMG). Just as for single metric massive gravity, ultimate consistency of both BMG and the putative PM BMG theory relies crucially on this gauge symmetry. We argue, however, that it does not exist.

  8. High temperature alloys for gas turbines and other applications 1986. Two parts

    Energy Technology Data Exchange (ETDEWEB)

    Betz, W.; Brunetaud, R.; Coutsouradis, D.; Fischmeister, H.; Gibbons, T.B.; Kvernes, I.; Lindblom, Y.; Marriott, J.B.; Meadowcroft, D.B. (eds.)

    1986-01-01

    A total of 105 papers were presented at the conference in 14 sessions with the following headings: developments in materials and processing; environmental effects and protection; mechanical properties for engineering design; solidification and casting; joining and repair; forging and powder metallurgy; materials; ODS materials; corrosion; protective systems; combined stress - environmental effects; creep; fatigue and creep fatigue; lifetime prediction. 4 of the papers have been abstracted separately.

  9. Prizes for consistency

    Energy Technology Data Exchange (ETDEWEB)

    Hiscock, S.

    1986-07-01

    The importance of consistency in coal quality has become of increasing significance recently, with the current trend towards using coal from a range of sources. A significant development has been the swing in responsibilities for coal quality. The increasing demand for consistency in quality has led to a re-examination of where in the trade and transport chain the quality should be assessed and where further upgrading of inspection and preparation facilities are required. Changes are in progress throughout the whole coal transport chain which will improve consistency of delivered coal quality. These include installation of beneficiation plant at coal mines, export terminals, and on the premises of end users. It is suggested that one of the keys to success for the coal industry will be the ability to provide coal of a consistent quality.

  10. Consistent sets contradict

    CERN Document Server

    Kent, A

    1996-01-01

    In the consistent histories formulation of quantum theory, the probabilistic predictions and retrodictions made from observed data depend on the choice of a consistent set. We show that this freedom allows the formalism to retrodict several contradictory propositions which correspond to orthogonal commuting projections and which all have probability one. We also show that the formalism makes contradictory probability one predictions when applied to generalised time-symmetric quantum mechanics.

  11. Network Consistent Data Association.

    Science.gov (United States)

    Chakraborty, Anirban; Das, Abir; Roy-Chowdhury, Amit K

    2016-09-01

    Existing data association techniques mostly focus on matching pairs of data-point sets and then repeating this process along space-time to achieve long term correspondences. However, in many problems such as person re-identification, a set of data-points may be observed at multiple spatio-temporal locations and/or by multiple agents in a network, and simply combining the local pairwise association results between sets of data-points often leads to inconsistencies over the global space-time horizons. In this paper, we propose a novel Network Consistent Data Association (NCDA) framework formulated as an optimization problem that not only maintains consistency in association results across the network, but also improves the pairwise data association accuracies. The proposed NCDA can be solved as a binary integer program leading to a globally optimal solution and is capable of handling the challenging data-association scenario where the number of data-points varies across different sets of instances in the network. We also present an online implementation of the NCDA method that can dynamically associate new observations to already observed data-points in an iterative fashion, while maintaining network consistency. We have tested both the batch and the online NCDA in two application areas, person re-identification and spatio-temporal cell tracking, and observed consistent and highly accurate data association results in all the cases.
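
    The loop-consistency constraint at the heart of NCDA can be illustrated on a toy three-camera network: once the A-to-B and B-to-C assignments are fixed, consistency forces C-to-A to be their inverse, and one searches among the consistent combinations for the highest total similarity. This brute-force sketch illustrates the constraint only; it is not the paper's binary-integer-program solver.

```python
import itertools
import numpy as np

# Toy network-consistent association for three cameras with 3 targets each.
rng = np.random.default_rng(2)
sim_ab = rng.random((3, 3))  # similarity of camera A targets to camera B targets
sim_bc = rng.random((3, 3))
sim_ca = rng.random((3, 3))

def score(perm, sim):
    return sum(sim[i, perm[i]] for i in range(3))

best, best_val = None, -np.inf
for p_ab in itertools.permutations(range(3)):
    for p_bc in itertools.permutations(range(3)):
        # Loop consistency forces C->A to invert the composition A->B->C.
        p_ca = tuple(np.argsort([p_bc[p_ab[i]] for i in range(3)]))
        val = score(p_ab, sim_ab) + score(p_bc, sim_bc) + score(p_ca, sim_ca)
        if val > best_val:
            best, best_val = (p_ab, p_bc, p_ca), val

print(best, round(best_val, 3))
```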

  12. Designing end-user interfaces

    CERN Document Server

    Heaton, N

    1988-01-01

    Designing End-User Interfaces: State of the Art Report focuses on the field of human/computer interaction (HCI), reviewing the design of end-user interfaces. This compilation is divided into two parts. Part I examines specific aspects of the problem in HCI that range from basic definitions of the problem, evaluation of how to look at the problem domain, and fundamental work aimed at introducing human factors into all aspects of the design cycle. Part II consists of six main topics-definition of the problem, psychological and social factors, principles of interface design, computer intelligenc

  13. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  14. A Magnetic Consistency Relation

    CERN Document Server

    Jain, Rajeev Kumar

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the Cosmic Microwave Background anisotropies and Large Scale Structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  15. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  16. Consistent wind Facilitates Vection

    Directory of Open Access Journals (Sweden)

    Masaki Ogawa

    2011-10-01

    Full Text Available We examined whether a consistent haptic cue suggesting forward self-motion facilitated vection. We used a fan with no blades (Dyson AM01) providing a wind of constant strength and direction (wind speed was 6.37 m/s) to the subjects' faces, with the visual stimuli visible through the fan. We used an optic flow of expansion or contraction created by positioning 16,000 dots at random inside a simulated cube (length 20 m), and moving the observer's viewpoint to simulate forward or backward self-motion of 16 m/s. We tested three conditions for fan operation: normal operation, normal operation with the fan reversed (i.e., no wind), and no operation (no wind and no sound). Vection was facilitated by the wind (shorter latency, longer duration and larger magnitude values) with the expansion stimuli. The fan noise did not facilitate vection. The wind neither facilitated nor inhibited vection with the contraction stimuli, perhaps because a headwind is not consistent with backward self-motion. We speculate that consistency between modalities is a key factor in facilitating vection.

  17. Infanticide and moral consistency.

    Science.gov (United States)

    McMahan, Jeff

    2013-05-01

    The aim of this essay is to show that there are no easy options for those who are disturbed by the suggestion that infanticide may on occasion be morally permissible. The belief that infanticide is always wrong is doubtfully compatible with a range of widely shared moral beliefs that underlie various commonly accepted practices. Any set of beliefs about the morality of abortion, infanticide and the killing of animals that is internally consistent and even minimally credible will therefore unavoidably contain some beliefs that are counterintuitive.

  18. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with a large-scale data management system is to ensure consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka dark data). This system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain the internals and give some results.

  19. OCRWM Bulletin: Westinghouse begins designing multi-purpose canister

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This publication consists of two parts: the OCRWM (Office of Civilian Radioactive Waste Management) Bulletin, and Of Mountains & Science, which has articles on the Yucca Mountain project. The OCRWM Bulletin provides information about OCRWM activities; this issue has articles on the multi-purpose canister design and a transportation cask trailer.

  20. When is holography consistent?

    Energy Technology Data Exchange (ETDEWEB)

    McInnes, Brett, E-mail: matmcinn@nus.edu.sg [National University of Singapore (Singapore); Ong, Yen Chin, E-mail: yenchin.ong@nordita.org [Nordita, KTH Royal Institute of Technology and Stockholm University, Roslagstullsbacken 23, SE-106 91 Stockholm (Sweden)

    2015-09-15

    Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognized; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, is satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold and, second, in the presence of angular momentum. Focusing on the application of holography to the quark–gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur. This suggests that the consistency condition is a “law of physics” expressing a particular aspect of holography.

  1. Consistent quantum measurements

    Science.gov (United States)

    Griffiths, Robert B.

    2015-11-01

    In response to recent criticisms by Okon and Sudarsky, various aspects of the consistent histories (CH) resolution of the quantum measurement problem(s) are discussed using a simple Stern-Gerlach device, and compared with the alternative approaches to the measurement problem provided by spontaneous localization (GRW), Bohmian mechanics, many worlds, and standard (textbook) quantum mechanics. Among these CH is unique in solving the second measurement problem: inferring from the measurement outcome a property of the measured system at a time before the measurement took place, as is done routinely by experimental physicists. The main respect in which CH differs from other quantum interpretations is in allowing multiple stochastic descriptions of a given measurement situation, from which one (or more) can be selected on the basis of its utility. This requires abandoning a principle (termed unicity), central to classical physics, that at any instant of time there is only a single correct description of the world.

  2. Application of Fuzzy Consistent Matrix Theory to Optimization of Architectural Design Scheme

    Institute of Scientific and Technical Information of China (English)

    邢彦富; 孔娅; 方兴

    2011-01-01

    The optimization of an architectural design scheme is a complicated decision process. Applying fuzzy consistent matrix theory, a mathematical model for the optimal selection of architectural design schemes is established and a multi-factor comprehensive evaluation of the design schemes is carried out. This overcomes the roughness of the influential factors in architectural scheme selection and the arbitrariness of subjective personal judgment, so that good economic and social benefits are obtained. A case shows that the method is feasible and effective for the optimal selection of architectural design schemes.

  3. When Is Holography Consistent?

    CERN Document Server

    McInnes, Brett

    2015-01-01

    Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognised; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, are satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold, and, second, in the presence of angular momentum. Focusing on the application of holography to the quark-gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur...

  4. The consistency of semiparametric regression model with missing data under fixed design

    Institute of Scientific and Technical Information of China (English)

    裴晓换; 郭鹏江

    2012-01-01

    Aim: To obtain consistency properties for the estimators of β and g(·) in a semiparametric regression model with data missing at random under fixed design. Methods: A lemma, several inequalities and the given conditions are used. Results: The strong consistency of the estimators of β and g(·) is proved. Conclusion: In the semiparametric regression model with missing data, the least squares estimator of β and the nonparametric kernel estimator of g(·) are strongly consistent.

  5. Drug price regulation under consumer moral hazard. Two-part tariffs, uniform price or third-degree price discrimination?

    Science.gov (United States)

    Felder, Stefan

    2004-12-01

    Drug price differences across national markets as they exist in the EU are often justified by the concept of Ramsey prices: with fixed costs for R&D, the optimal mark-ups on marginal costs are inversely related to the price elasticity in the individual markets. This well-known result prevails if consumer moral hazard is taken into account. Contrary to the situation without moral hazard, the uniform price does not necessarily dominate discriminatory pricing in welfare terms. The two-part tariff is a better alternative as it allows governments to address moral hazard. A uniform price combined with lump-payments reflecting differences in the willingness to pay and the moral hazard in member states appears to be an attractive option for a common EU drug market.

  6. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.

  7. Design of consistency testing of ECU diagnostic function based on CAN bus

    Institute of Scientific and Technical Information of China (English)

    程安宇; 赵恬; 申帅

    2013-01-01

    In order to ensure the correctness and stability of the ECU diagnostic function based on the CAN bus, a test method for the diagnostic function is proposed on the basis of an analysis of the CAN bus diagnostic system. According to the CAN bus diagnostic protocol ISO 15765, the test scheme and test cases are designed from the viewpoint of black-box testing. Finally, a test platform is built on the CANoe bus analysis and testing software, and the consistency test of the ECU diagnostic function is completed. The test results verify that this method can effectively test the consistency between the ECU diagnostic function and the ISO 15765 diagnostic protocol, and show that the diagnostic test method is feasible.
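
    In the spirit of the black-box test cases described above, a diagnostic request sent over ISO 15765 should be answered with a positive response whose service identifier equals the request SID plus 0x40 (the UDS convention of ISO 14229). The transport function below is a hypothetical stub standing in for the CANoe-driven test bench.

```python
# Minimal black-box test-case sketch: check that a diagnostic request gets a
# positive response (response SID = request SID + 0x40, per UDS/ISO 14229).
def send_diag_request(payload: bytes) -> bytes:
    # Hypothetical stub for the CAN/ISO-TP transport used by the real bench.
    return bytes([payload[0] + 0x40]) + b"\x01"  # fake compliant ECU echo

def run_case(sid: int, data: bytes = b"") -> bool:
    resp = send_diag_request(bytes([sid]) + data)
    return len(resp) >= 1 and resp[0] == sid + 0x40

# Example: DiagnosticSessionControl (0x10) with sub-function defaultSession.
print("PASS" if run_case(0x10, b"\x01") else "FAIL")
```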

  8. Consistent estimators in random censorship semiparametric models

    Institute of Scientific and Technical Information of China (English)

    王启华

    1996-01-01

    For the fixed design regression model, when the responses Yi are randomly censored on the right, estimators of the unknown parameter and regression function g from the censored observations are defined in two cases, where the censoring distribution is known and unknown, respectively. Moreover, sufficient conditions are established under which these estimators are strongly consistent and pth (p>2) mean consistent.

  9. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    Geest, van der Thea; Loorbach, Nicole

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to agre

  10. A dual-consistency cache coherence protocol

    OpenAIRE

    Ros, Alberto; Jimborean, Alexandra

    2015-01-01

    Weak memory consistency models can maximize system performance by enabling hardware and compiler optimizations, but increase programming complexity since they do not match programmers’ intuition. The design of an efficient system with an intuitive memory model is an open challenge. This paper proposes SPEL, a dual-consistency cache coherence protocol which simultaneously guarantees the strongest memory consistency model provided by the hardware and yields improvements in both performance and ...

  11. Reverse Revenue Sharing Contract versus Two-Part Tariff Contract under a Closed-Loop Supply Chain System

    Directory of Open Access Journals (Sweden)

    Zunya Shi

    2016-01-01

    Full Text Available The importance of remanufacturing has been recognized in research and practice. The integrated system, combining the forward and reverse activities of supply chains, is called a closed-loop supply chain (CLSC) system. Through coordination in the CLSC system, players obtain economic improvement. This paper studies the different coordination performance of two types of contracts, the two-part tariff contract (TTC) and the reverse revenue sharing contract (RRSC), in a closed-loop system. Through mathematical analysis based on Stackelberg game theory, we find that it is easy for the manufacturer to improve profits and the retailer's collection effort by adjusting the ratio of the transfer collection price through the RRSC, and we also give the function to calculate the best ratio of the transfer collection price, which may be a valuable reference for decision makers in practice. Besides, our results also suggest that although the profits of the coordinated CLSC system are always higher than in the uncoordinated scenario, the RRSC is more favorable to the manufacturer than to the retailer, as results show that the manufacturer will capture more of the system profits through the RRSC. Therefore, the RRSC is more attractive to manufacturers for closing the supply chain for economic reasons.
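
    The economic logic behind the two-part tariff benchmark can be sketched with a textbook one-manufacturer, one-retailer channel under linear demand: a wholesale-only contract suffers double marginalization, while setting the wholesale price at marginal cost and transferring a fixed fee replicates the integrated profit. All parameters are illustrative, and the sketch omits the paper's reverse (collection) channel.

```python
# Stylized single-period channel with linear demand q = a - b*p and unit cost
# c, illustrating why a two-part tariff (wholesale price w = c plus a fixed
# fee F) coordinates the chain. Parameters are illustrative only.
a, b, c = 100.0, 1.0, 20.0

def retailer_price(w):
    # Retailer's best response: maximize (p - w)(a - b*p) over p.
    return (a / b + w) / 2

# Integrated (coordinated) channel profit.
p_int = retailer_price(c)
profit_int = (p_int - c) * (a - b * p_int)

# Wholesale-only contract with double marginalization: the manufacturer
# first sets w > c, then the retailer marks up again.
w_star = (a / b + c) / 2
p_dm = retailer_price(w_star)
profit_dm = (w_star - c) * (a - b * p_dm) + (p_dm - w_star) * (a - b * p_dm)

print(f"coordinated: {profit_int:.0f}, wholesale-only: {profit_dm:.0f}")
# Under the two-part tariff, w = c restores profit_int; the fixed fee F
# merely divides it between manufacturer and retailer.
```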

  12. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  13. Delirium, sedation and analgesia in the intensive care unit: a multinational, two-part survey among intensivists.

    Directory of Open Access Journals (Sweden)

    Alawi Luetz

    Full Text Available Analgesia, sedation and delirium management are important parts of intensive care treatment as they are relevant for patients' clinical and functional long-term outcome. Previous surveys showed that despite this fact implementation rates are still low. The primary aim of this prospective, observational multicenter study was to investigate the implementation rate of delirium monitoring among intensivists. Secondly, current practice concerning analgesia and sedation monitoring as well as treatment strategies for patients with delirium were assessed. In addition, this study compares perceived and actual practice regarding delirium, sedation and analgesia management. Data were obtained with a two-part, anonymous survey, containing general data from intensive care units in a first part and data referring to individual patients in a second part. Questionnaires from 101 hospitals (part 1) and 868 patients (part 2) were included in the data analysis. Fifty-six percent of the intensive care units reported monitoring for delirium in clinical routine. Forty-four percent reported the use of a validated delirium score. In this respect, the survey suggests an increasing use of delirium assessment tools compared to previous surveys. Nevertheless, part two of the survey revealed that in actual practice 73% of included patients were not monitored with a validated score. Furthermore, we observed a trend towards moderate or deep sedation, which contradicts guideline recommendations. Every fifth patient was suffering from pain. The implementation rate of adequate pain-assessment tools for mechanically ventilated and sedated patients was low (30%). In conclusion, further efforts are necessary to implement guideline recommendations into clinical practice. The study was registered (ClinicalTrials.gov identifier: NCT01278524) and approved by the ethical committee.

  14. Consistency of trace norm minimization

    CERN Document Server

    Bach, Francis

    2007-01-01

    Regularization by the sum of singular values, also referred to as the trace norm, is a popular technique for estimating low rank rectangular matrices. In this paper, we extend some of the consistency results of the Lasso to provide necessary and sufficient conditions for rank consistency of trace norm minimization with the square loss. We also provide an adaptive version that is rank consistent even when the necessary condition for the non adaptive version is not fulfilled.
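
    The basic computational kernel of trace norm regularization is soft-thresholding of singular values (the proximal operator of the trace norm), sketched below on a noisy low-rank matrix. The threshold value and the data are illustrative.

```python
import numpy as np

def trace_norm_prox(X: np.ndarray, lam: float) -> np.ndarray:
    """Proximal operator of lam * ||X||_*: soft-threshold the singular values.

    This is the basic building block of trace-norm-regularized estimators
    such as those whose rank consistency the paper analyzes.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)
    return (U * s_shrunk) @ Vt

rng = np.random.default_rng(1)
low_rank = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 15))
noisy = low_rank + 0.1 * rng.normal(size=(20, 15))

est = trace_norm_prox(noisy, lam=1.0)
print(np.linalg.matrix_rank(est))  # shrinkage typically recovers a low rank
```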

  15. High SNR Consistent Compressive Sensing

    OpenAIRE

    Kallummil, Sreejith; Kalyani, Sheetal

    2017-01-01

    High signal to noise ratio (SNR) consistency of model selection criteria in linear regression models has attracted a lot of attention recently. However, most of the existing literature on high SNR consistency deals with model order selection. Further, the limited literature available on the high SNR consistency of subset selection procedures (SSPs) is applicable to linear regression with full rank measurement matrices only. Hence, the performance of SSPs used in underdetermined linear models ...

  16. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult....... This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its...... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  17. Predictive tools for designing new insulins and treatment regimens

    DEFF Research Database (Denmark)

    Klim, Søren

    The thesis deals with the development of "Predictive tools for designing new insulins and treatment regimens" and consists of two parts: a model based approach for bridging properties of new insulin analogues from glucose clamp experiments to meal tolerance tests (MTT), and a second part that des...... on ordinary differential equations. The absence of such a program motivated the development of a new tool with PK/PD features, SDEs and mixed effects. Part II presents a software package which was developed in order to be able to handle SDEs with mixed effects. The package was implemented in R which allowed

  18. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving for physics concepts through consistent argumentation can improve students' thinking skills, and this is important in science. The study aims to assess the consistency of students' argumentation about fluids. The population of this study consists of college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results of the study gave an average argumentation consistency for correct consistency, wrong consistency, and inconsistency of 4.85%, 29.93%, and 65.23%, respectively. The data from the study point to a lack of understanding of the fluid material, which, under full consistency of argumentation, would ideally support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies so as to obtain a positive change in the consistency of argumentation.

  19. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  20. Consistency of Random Survival Forests.

    Science.gov (United States)

    Ishwaran, Hemant; Kogalur, Udaya B

    2010-07-01

    We prove uniform consistency of Random Survival Forests (RSF), a newly introduced forest ensemble learner for analysis of right-censored survival data. Consistency is proven under general splitting rules, bootstrapping, and random selection of variables-that is, under true implementation of the methodology. Under this setting we show that the forest ensemble survival function converges uniformly to the true population survival function. To prove this result we make one key assumption regarding the feature space: we assume that all variables are factors. Doing so ensures that the feature space has finite cardinality and enables us to exploit counting process theory and the uniform consistency of the Kaplan-Meier survival function.
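
    The population survival function that the forest ensemble is shown to converge to is classically estimated by the Kaplan-Meier product-limit formula; a bare-bones version is sketched below (ties handled sequentially, toy data).

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve; events: 1 = death, 0 = right-censored.

    RSF consistency is stated as uniform convergence of the forest ensemble
    survival function to the true one; the KM estimator is the classical
    nonparametric baseline it generalizes.
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    n = len(times)
    surv, s = {}, 1.0
    for i, (t, d) in enumerate(zip(times, events)):
        at_risk = n - i
        if d == 1:                     # a death reduces the survival estimate
            s *= 1.0 - 1.0 / at_risk
        surv[t] = s                    # censoring only shrinks the risk set
    return surv

print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```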

  1. Process Fairness and Dynamic Consistency

    NARCIS (Netherlands)

    S.T. Trautmann (Stefan); P.P. Wakker (Peter)

    2010-01-01

    textabstractAbstract: When process fairness deviates from outcome fairness, dynamic inconsistencies can arise as in nonexpected utility. Resolute choice (Machina) can restore dynamic consistency under nonexpected utility without using Strotz's precommitment. It can similarly justify dynamically

  2. Gravitation, Causality, and Quantum Consistency

    CERN Document Server

    Hertzberg, Mark P

    2016-01-01

    We examine the role of consistency with causality and quantum mechanics in determining the properties of gravitation. We begin by constructing two different classes of interacting theories of massless spin 2 particles -- gravitons. One involves coupling the graviton with the lowest number of derivatives to matter, the other involves coupling the graviton with higher derivatives to matter, making use of the linearized Riemann tensor. The first class requires an infinite tower of terms for consistency, which is known to lead uniquely to general relativity. The second class only requires a finite number of terms for consistency, which appears as a new class of theories of massless spin 2. We recap the causal consistency of general relativity and show how this fails in the second class for the special case of coupling to photons, exploiting related calculations in the literature. In an upcoming publication [1] this result is generalized to a much broader set of theories. Then, as a causal modification of general ...

  3. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati

  4. Poly (ε-caprolactone nanofibrous ring surrounding a polyvinyl alcohol hydrogel for the development of a biocompatible two-part artificial cornea

    Directory of Open Access Journals (Sweden)

    Bakhshandeh H

    2011-07-01

    % when measured in the 400–800 nm wavelength range. The plasma-treated PCL nanofibrous scaffold promoted limbal stem cell adhesion and proliferation within 10 days. These results confirmed that the polymeric artificial cornea showed suitable physical properties and good biocompatibility and epithelialization ability. Keywords: two-part artificial cornea, nanofibers, electrospun, poly (ε-caprolactone), polyvinyl alcohol hydrogel, limbal stem cells

  5. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
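
    Stochastic discounting with deflators, one of the topics listed, reduces computationally to averaging deflated cash flows under the real-world measure: the value of a claim X is E[φX] for a deflator φ. The one-period lognormal deflator and the payoff below are illustrative assumptions, not an example from the book.

```python
import numpy as np

# Monte Carlo valuation with a one-period deflator phi: value = E[phi * X]
# under the real-world measure. Numbers are illustrative.
rng = np.random.default_rng(3)
n = 100_000
z = rng.normal(size=n)

r, lam = 0.02, 0.3                                   # short rate, price of risk
phi = np.exp(-r - lam * z - 0.5 * lam**2)            # E[phi] = exp(-r)
payoff = np.maximum(100 * np.exp(0.2 * z) - 100, 0)  # call-like random claim

value = np.mean(phi * payoff)
print(round(value, 2))
```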

  6. Consistent Histories in Quantum Cosmology

    CERN Document Server

    Craig, David A; 10.1007/s10701-010-9422-6

    2010-01-01

    We illustrate the crucial role played by decoherence (consistency of quantum histories) in extracting consistent quantum probabilities for alternative histories in quantum cosmology. Specifically, within a Wheeler-DeWitt quantization of a flat Friedmann-Robertson-Walker cosmological model sourced with a free massless scalar field, we calculate the probability that the univese is singular in the sense that it assumes zero volume. Classical solutions of this model are a disjoint set of expanding and contracting singular branches. A naive assessment of the behavior of quantum states which are superpositions of expanding and contracting universes may suggest that a "quantum bounce" is possible i.e. that the wave function of the universe may remain peaked on a non-singular classical solution throughout its history. However, a more careful consistent histories analysis shows that for arbitrary states in the physical Hilbert space the probability of this Wheeler-DeWitt quantum universe encountering the big bang/crun...

  7. Entropy-based consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław Jerzy

    2016-09-01

    A description of software architecture is a plan of the IT system construction, therefore any architecture gaps affect the overall success of an entire project. The definitions mostly describe software architecture as a set of views which are mutually unrelated, hence potentially inconsistent. Software architecture completeness is also often described in an ambiguous way. As a result most methods of IT systems building comprise many gaps and ambiguities, thus presenting obstacles for software building automation. In this article the consistency and completeness of software architecture are mathematically defined based on calculation of entropy of the architecture description. Following this approach, in this paper we also propose our method of automatic verification of consistency and completeness of the software architecture development method presented in our previous article as Consistent Model Driven Architecture (CMDA). The proposed FBS (Functionality-Behaviour-Structure) entropy-based metric applied in our CMDA approach enables IT architects to decide whether the modelling process is complete and consistent. With this metric, software architects could assess the readiness of undergoing modelling work for the start of IT system building. It even allows them to assess objectively whether the designed software architecture of the IT system could be implemented at all. The overall benefit of such an approach is that it facilitates the preparation of complete and consistent software architecture more effectively as well as it enables assessing and monitoring of the ongoing modelling development status. We demonstrate this with a few industry examples of IT system designs.
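
    As a toy version of an entropy-based completeness signal of this kind, one can measure the Shannon entropy of the distribution of model-element kinds across the Functionality-Behaviour-Structure aspects. The element labels and the "balanced coverage" reading below are assumptions for illustration, not the paper's exact metric.

```python
import math
from collections import Counter

def description_entropy(elements):
    """Shannon entropy (bits) of the distribution of model element kinds."""
    counts = Counter(elements)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

draft = ["function"] * 9 + ["behaviour"]             # structure views missing
mature = ["function"] * 5 + ["behaviour"] * 5 + ["structure"] * 5

print(description_entropy(draft), description_entropy(mature))
# Higher entropy here signals more evenly covered F/B/S aspects.
```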

  8. The Importance of being consistent

    CERN Document Server

    Wasserman, Adam; Jiang, Kaili; Kim, Min-Cheol; Sim, Eunji; Burke, Kieron

    2016-01-01

    We review the role of self-consistency in density functional theory. We apply a recent analysis to both Kohn-Sham and orbital-free DFT, as well as to Partition-DFT, which generalizes all aspects of standard DFT. In each case, the analysis distinguishes between errors in approximate functionals versus errors in the self-consistent density. This yields insights into the origins of many errors in DFT calculations, especially those often attributed to self-interaction or delocalization error. In many classes of problems, errors can be substantially reduced by using `better' densities. We review the history of these approaches, many of their applications, and give simple pedagogical examples.
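
    Self-consistency in this sense is a fixed-point requirement: the density fed into the equations must equal the density they return. The toy loop below iterates x = f(x) with linear mixing; f is an arbitrary contraction standing in for a real Kohn-Sham update, not an actual DFT calculation.

```python
import numpy as np

# Minimal caricature of a self-consistent field loop: the "density" x must
# reproduce itself through the map f derived from the equations.
def f(x):
    return np.cos(x)  # stand-in for "solve the equations at density x"

x = 1.0                                # initial guess
for it in range(100):
    x_new = 0.5 * x + 0.5 * f(x)       # simple linear mixing for stability
    if abs(x_new - x) < 1e-10:         # self-consistency reached
        break
    x = x_new

print(it, x)  # fixed point of x = cos(x), roughly 0.739
```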

  9. Use of Two-Part Regression Calibration Model to Correct for Measurement Error in Episodically Consumed Foods in a Single-Replicate Study Design: EPIC Case Study

    NARCIS (Netherlands)

    Agogo, G.O.; Voet, van der H.; Veer, van 't P.; Ferrari, P.; Leenders, M.; Muller, D.C.; Sánchez-Cantalejo, E.; Bamia, C.; Braaten, T.; Knüppel, S.; Johansson, I.; Eeuwijk, van F.A.; Boshuizen, H.C.

    2014-01-01

    In epidemiologic studies, measurement error in dietary variables often attenuates association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference m

  10. Use of Two-Part Regression Calibration Model to Correct for Measurement Error in Episodically Consumed Foods in a Single-Replicate Study Design : EPIC Case Study

    NARCIS (Netherlands)

    Agogo, George O; der Voet, Hilko van; Veer, Pieter Van't; Ferrari, Pietro; Leenders, Max; Muller, David C; Sánchez-Cantalejo, Emilio; Bamia, Christina; Braaten, Tonje; Knüppel, Sven; Johansson, Ingegerd; van Eeuwijk, Fred A; Boshuizen, Hendriek

    2014-01-01

    In epidemiologic studies, measurement error in dietary variables often attenuates association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference m

  11. Consistent supersymmetric decoupling in cosmology

    NARCIS (Netherlands)

    Sousa Sánchez, Kepa

    2012-01-01

    The present work discusses several problems related to the stability of ground states with broken supersymmetry in supergravity, and to the existence and stability of cosmic strings in various supersymmetric models. In particular we study the necessary conditions to truncate consistently a sector o

  12. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    Inconsistency in firewall/VPN (Virtual Private Network) rules imposes a huge maintenance cost. With the growth of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and the rule tables of stand-alone devices and networks will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation, based on set theory, is proposed, and a rule-validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
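
    As a hypothetical illustration of the kind of inconsistency such set-based validation can catch (not the paper's formalization), the sketch below flags shadowed rules, i.e. rules that can never fire because an earlier rule matches a superset of their packets with a different action:

        from dataclasses import dataclass

        @dataclass
        class Rule:
            src: range    # source address range (addresses simplified to integers)
            dst: range    # destination address range
            action: str   # 'accept' or 'deny'

        def covers(outer, inner):
            return outer.start <= inner.start and inner.stop <= outer.stop

        def shadowed(rules):
            """Yield (i, j) where rule j can never fire: an earlier rule i
            matches a superset of j's packets but takes a different action."""
            for j, rj in enumerate(rules):
                for i, ri in enumerate(rules[:j]):
                    if covers(ri.src, rj.src) and covers(ri.dst, rj.dst) \
                            and ri.action != rj.action:
                        yield i, j
                        break

        rules = [Rule(range(0, 100), range(0, 100), 'deny'),
                 Rule(range(10, 20), range(30, 40), 'accept')]  # dead rule
        print(list(shadowed(rules)))  # [(0, 1)]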

  13. Self-consistent triaxial models

    CERN Document Server

    Sanders, Jason L

    2015-01-01

    We present self-consistent triaxial stellar systems that have analytic distribution functions (DFs) expressed in terms of the actions. These provide triaxial density profiles with cores or cusps at the centre. They are the first self-consistent triaxial models with analytic DFs suitable for modelling giant ellipticals and dark haloes. Specifically, we study triaxial models that reproduce the Hernquist profile from Williams & Evans (2015), as well as flattened isochrones of the form proposed by Binney (2014). We explore the kinematics and orbital structure of these models in some detail. The models typically become more radially anisotropic on moving outwards, and have velocity ellipsoids aligned in Cartesian coordinates in the centre and aligned in spherical polar coordinates in the outer parts. In projection, the ellipticity of the isophotes and the position angle of the major axis of our models generally change with radius. So, a natural application is to elliptical galaxies that exhibit isophote twisting....

  14. On Modal Refinement and Consistency

    DEFF Research Database (Denmark)

    Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej

    2007-01-01

    Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard...... notions of implementation, is shown to be computationally hard (co-NP hard). Second, we consider four forms of consistency (existence of implementations) for modal specifications. We characterize each operationally, giving algorithms for deciding, and for synthesizing implementations, together...

  15. Tri-Sasakian consistent reduction

    CERN Document Server

    Cassani, Davide

    2011-01-01

    We establish a universal consistent Kaluza-Klein truncation of M-theory based on seven-dimensional tri-Sasakian structure. The four-dimensional truncated theory is an N=4 gauged supergravity with three vector multiplets and a non-abelian gauge group, containing the compact factor SO(3). Consistency follows from the fact that our truncation takes exactly the same form as a left-invariant reduction on a specific coset manifold, and we show that the same holds for the various universal consistent truncations recently put forward in the literature. We describe how the global symmetry group SL(2,R) x SO(6,3) is embedded in the symmetry group E7(7) of maximally supersymmetric reductions, and make the connection with the approach of Exceptional Generalized Geometry. Vacuum AdS4 solutions spontaneously break the amount of supersymmetry from N=4 to N=3,1 or 0, and the spectrum contains massive modes. We find a subtruncation to minimal N=3 gauged supergravity as well as an N=1 subtruncation to the SO(3)-invariant secto...

  16. CRDTs: Consistency without concurrency control

    CERN Document Server

    Letia, Mihai; Shapiro, Marc

    2009-01-01

    A CRDT is a data type whose operations commute when they are concurrent. Replicas of a CRDT eventually converge without any complex concurrency control. As an existence proof, we exhibit a non-trivial CRDT: a shared edit buffer called Treedoc. We outline the design, implementation and performance of Treedoc. We discuss how the CRDT concept can be generalised, and its limitations.
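
    Treedoc itself is non-trivial, but the commutativity idea behind CRDTs is visible in the smallest classic example, a grow-only counter; the sketch below is a generic illustration, not code from the paper:

        class GCounter:
            """Grow-only counter CRDT: one slot per replica, merge = pointwise max."""

            def __init__(self, replica_id, n_replicas):
                self.id = replica_id
                self.slots = [0] * n_replicas

            def increment(self):
                self.slots[self.id] += 1

            def value(self):
                return sum(self.slots)

            def merge(self, other):
                self.slots = [max(a, b) for a, b in zip(self.slots, other.slots)]

        a, b = GCounter(0, 2), GCounter(1, 2)
        a.increment(); a.increment(); b.increment()  # concurrent updates
        a.merge(b); b.merge(a)                       # anti-entropy, in any order
        assert a.value() == b.value() == 3           # replicas have converged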

  17. On the consistency of MPS

    CERN Document Server

    Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L

    2013-01-01

    The consistency of the Moving Particle Semi-implicit (MPS) method in reproducing the gradient, divergence and Laplacian differential operators is discussed in the present paper. Its relation to the Smoothed Particle Hydrodynamics (SPH) method is rigorously established. The application of the MPS method to solve the Navier-Stokes equations using a fractional step approach is treated, unveiling inconsistency problems when solving the Poisson equation for the pressure. A new corrected MPS method incorporating boundary terms is proposed. Applications to one-dimensional boundary value Dirichlet and mixed Neumann-Dirichlet problems and to two-dimensional free-surface flows are presented.
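
    For reference, a sketch of the standard, uncorrected MPS Laplacian model whose consistency is at issue, in one dimension with the conventional w(r) = r_e/r - 1 kernel (kernel and support radius are textbook assumptions, not taken from the paper); it reproduces the Laplacian of a quadratic exactly in the interior but degrades near the boundary, which is the kind of inconsistency the proposed boundary terms address:

        import numpy as np

        def w(r, re):
            """Standard MPS kernel: w(r) = re/r - 1 for 0 < r < re, else 0."""
            return np.where((r > 0) & (r < re), re / np.maximum(r, 1e-12) - 1.0, 0.0)

        def mps_laplacian(x, phi, re, d=1):
            """Uncorrected MPS Laplacian model (here with per-particle n0, lambda)."""
            r = np.abs(x[:, None] - x[None, :])   # pairwise distances, 1-D
            W = w(r, re)
            n0 = W.sum(axis=1)                    # particle number density
            lam = (W * r**2).sum(axis=1) / n0     # lambda coefficient
            return 2 * d / (lam * n0) * (W * (phi[None, :] - phi[:, None])).sum(axis=1)

        x = np.linspace(0.0, 1.0, 51)
        phi = x**2                                # exact Laplacian is 2
        L = mps_laplacian(x, phi, re=3.1 * (x[1] - x[0]))
        print(L[20:25])   # ~2 in the interior
        print(L[-2:])     # degraded near the boundary without boundary corrections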

  18. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    with a 5-point Likert scale and a corresponding scoring system. Process consistency is measured by using a first-person drawing tool with the respondent in the centre. Respondents sketch the sequence of steps and people they contact when configuring a product. The methodology is tested in one company...... for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation...

  19. DSP-Based Embedded Data Acquisition System for a FOG Inertial Measurement Unit

    Institute of Scientific and Technical Information of China (English)

    樊勇华; 查峰; 李京书

    2012-01-01

    To acquire the outputs of fibre optic gyros (FOGs) and accelerometers together with their temperature information, a DSP-based data acquisition system is designed for the FOG inertial measurement unit (IMU) according to the characteristics of its output signals, implementing A/D conversion, acquisition and storage of the gyro, accelerometer and temperature data. For the pulse-form signals of the gyros and accelerometers, a pulse-counting circuit based on a CPLD is designed, which samples the data over fixed time intervals. Because the temperature outputs of the accelerometers are analog signals, they are converted to digital form with a 24-bit AD7738 converter and acquired by a TMS320F28335 DSP. Finally, the communication circuits between the DSP, the CPLD and the host computer, together with the DSP interactive program, are designed; by programming the DSP's internal memory, the IMU signals are acquired, tested and sent to the host computer for storage and analysis.

  20. Design

    DEFF Research Database (Denmark)

    Volf, Mette

    This publication is unique in its demystification and operationalization of the complex and elusive nature of the design process. The publication portrays the designer’s daily work and the creative process, which the designer is a part of. Apart from displaying the designer’s work methods...... and design parameters, the publication shows examples from renowned Danish design firms. Through these examples the reader gets an insight into the designer’s reality....

  1. Probability-consistent spectrum and code spectrum

    Institute of Scientific and Technical Information of China (English)

    沈建文; 石树中

    2004-01-01

    In the seismic safety evaluation (SSE) of key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by the Code for Seismic Design of Buildings (GB50011-2001); sometimes there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for key projects would be lower than that for general industrial and civil buildings. In this paper, the relation between the PCS and the CDS is discussed using an idealized simple potential seismic source. The results show that in most areas influenced mainly by potential sources of epicentral and regional earthquakes, the PCS is generally lower than the CDS at long periods. We point out that the long-period response spectra of the code should be studied further and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it is prudent to be cautious in using the long-period response spectra given by SSE for key projects when they are lower than the CDS.

  2. The Consistent Vehicle Routing Problem

    Energy Technology Data Exchange (ETDEWEB)

    Groer, Christopher S [ORNL; Golden, Bruce [University of Maryland; Edward, Wasil [American University

    2009-01-01

    In the small package shipping industry (as in other industries), companies try to differentiate themselves by providing high levels of customer service. This can be accomplished in several ways, including online tracking of packages, ensuring on-time delivery, and offering residential pickups. Some companies want their drivers to develop relationships with customers on a route and have the same drivers visit the same customers at roughly the same time on each day that the customers need service. These service requirements, together with traditional constraints on vehicle capacity and route length, define a variant of the classical capacitated vehicle routing problem, which we call the consistent VRP (ConVRP). In this paper, we formulate the problem as a mixed-integer program and develop an algorithm to solve the ConVRP that is based on the record-to-record travel algorithm. We compare the performance of our algorithm to the optimal mixed-integer program solutions for a set of small problems and then apply our algorithm to five simulated data sets with 1,000 customers and a real-world data set with more than 3,700 customers. We provide a technique for generating ConVRP benchmark problems from vehicle routing problem instances given in the literature and provide our solutions to these instances. The solutions produced by our algorithm on all problems do a very good job of meeting customer service objectives with routes that have a low total travel time.
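
    The record-to-record travel acceptance rule at the core of the algorithm is simple to state in isolation; the sketch below applies it to plain 2-opt moves on a made-up tour, omitting the ConVRP-specific consistency constraints and multi-day structure:

        import random

        def tour_cost(tour, dist):
            return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

        def record_to_record(tour, dist, deviation=0.05, iters=2000, seed=1):
            """Record-to-record travel: accept a 2-opt move iff the new cost stays
            below RECORD * (1 + deviation), where RECORD is the best cost so far."""
            rng = random.Random(seed)
            best = list(tour)
            record = tour_cost(best, dist)
            cur = list(best)
            for _ in range(iters):
                i, j = sorted(rng.sample(range(len(cur)), 2))
                cand = cur[:i] + cur[i:j + 1][::-1] + cur[j + 1:]  # 2-opt reversal
                cost = tour_cost(cand, dist)
                if cost < record * (1 + deviation):   # the RRT acceptance rule
                    cur = cand
                    if cost < record:
                        record, best = cost, list(cur)
            return best, record

        pts = [(0, 0), (0, 3), (4, 3), (4, 0), (2, 1)]
        dist = [[((ax - bx)**2 + (ay - by)**2) ** 0.5 for bx, by in pts] for ax, ay in pts]
        print(record_to_record(list(range(len(pts))), dist))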

  3. Consistency and stability of recombinant fermentations.

    Science.gov (United States)

    Wiebe, M E; Builder, S E

    1994-01-01

    Production of proteins of consistent quality in heterologous, genetically-engineered expression systems is dependent upon identifying the manufacturing process parameters which have an impact on product structure, function, or purity, validating acceptable ranges for these variables, and performing the manufacturing process as specified. One of the factors which may affect product consistency is genetic instability of the primary product sequence, as well as instability of genes which code for proteins responsible for post-translational modification of the product. Approaches have been developed for mammalian expression systems to assure that product quality is not changing through mechanisms of genetic instability. Sensitive protein analytical methods, particularly peptide mapping, are used to evaluate product structure directly, and are more sensitive in detecting genetic instability than is direct genetic analysis by nucleotide sequencing of the recombinant gene or mRNA. These methods are being employed to demonstrate that the manufacturing process consistently yields a product of defined structure from cells cultured through the range of cell ages used in the manufacturing process and well beyond the maximum cell age defined for the process. The combination of well-designed validation studies which demonstrate consistent product quality as a function of cell age, and rigorous quality control of every product lot by sensitive protein analytical methods, provides the necessary assurance that product structure is not being altered through mechanisms of mutation and selection.

  4. Consistent implementation of decisions in the brain.

    Directory of Open Access Journals (Sweden)

    James A R Marshall

    Full Text Available Despite the complexity and variability of decision processes, motor responses are generally stereotypical and independent of decision difficulty. How is this consistency achieved? Through an engineering analogy we consider how and why a system should be designed to realise not only flexible decision-making, but also consistent decision implementation. We specifically consider neurobiologically-plausible accumulator models of decision-making, in which decisions are made when a decision threshold is reached. To trade-off between the speed and accuracy of the decision in these models, one can either adjust the thresholds themselves or, equivalently, fix the thresholds and adjust baseline activation. Here we review how this equivalence can be implemented in such models. We then argue that manipulating baseline activation is preferable as it realises consistent decision implementation by ensuring consistency of motor inputs, summarise empirical evidence in support of this hypothesis, and suggest that it could be a general principle of decision making and implementation. Our goal is therefore to review how neurobiologically-plausible models of decision-making can manipulate speed-accuracy trade-offs using different mechanisms, to consider which of these mechanisms has more desirable decision-implementation properties, and then review the relevant neuroscientific data on which mechanism brains actually use.
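
    A minimal single-accumulator simulation (a generic illustration, not the authors' model) makes the equivalence concrete: with the same noise, lowering the threshold by some amount or raising the baseline by that amount yields the same decision time, but only the baseline manipulation keeps the activation reached at decision time, and hence the motor input, fixed:

        import random

        def accumulate(drift, threshold, baseline=0.0, dt=0.01, noise=1.0, seed=0):
            """Noisy evidence accumulation from `baseline` up to `threshold`.
            Returns (decision time, activation level at the decision)."""
            rng = random.Random(seed)
            x, t = baseline, 0.0
            while x < threshold:
                x += drift * dt + rng.gauss(0.0, noise * dt ** 0.5)
                t += dt
            return t, x

        # Speeding a decision by 0.2: either lower the threshold...
        t1, x1 = accumulate(drift=1.0, threshold=0.8)
        # ...or raise the baseline, keeping the threshold (motor input) fixed.
        t2, x2 = accumulate(drift=1.0, threshold=1.0, baseline=0.2)
        print(t1, x1)   # same decision time as below, but activation ends near 0.8
        print(t2, x2)   # identical time; activation at decision stays near 1.0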

  5. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large-scale, public monitoring dataset collected over an eight-year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight-year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
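
    A schematic rendering of the thresholding step on synthetic counts (the study used trawl survey data and a spatial frequency distribution method; the numbers and cutoff below are purely illustrative):

        import numpy as np

        def shannon(counts):
            """Shannon's diversity index H' = -sum(p ln p) over species counts."""
            p = counts / counts.sum()
            p = p[p > 0]
            return float(-(p * np.log(p)).sum())

        rng = np.random.default_rng(0)
        counts = rng.poisson(3, size=(8, 20, 15))   # (year, grid cell, species)
        H = np.array([[shannon(counts[y, c]) for c in range(20)] for y in range(8)])

        threshold = H.mean() + H.std()              # one illustrative cutoff
        hotspot = H > threshold                     # per-year hotspot designation
        consistency = hotspot.mean(axis=0)          # fraction of years a cell is hot
        print((consistency > 0.5).sum(), "cells are hotspots in >50% of years")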

  6. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures...... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for the Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging the experimental databank of lipid systems data in order to improve...

  7. Consistency and variability in functional localisers

    Science.gov (United States)

    Duncan, Keith J.; Pattamadilok, Chotiga; Knierim, Iris; Devlin, Joseph T.

    2009-01-01

    A critical assumption underlying the use of functional localiser scans is that the voxels identified as the functional region-of-interest (fROI) are essentially the same as those activated by the main experimental manipulation. Intra-subject variability in the location of the fROI violates this assumption, reducing the sensitivity of the analysis and biasing the results. Here we investigated consistency and variability in fROIs in a set of 45 volunteers. They performed two functional localiser scans to identify word- and object-sensitive regions of ventral and lateral occipito-temporal cortex, respectively. In the main analyses, fROIs were defined as the category-selective voxels in each region and consistency was measured as the spatial overlap between scans. Consistency was greatest when minimally selective thresholds were used to define “active” voxels (p < 0.05 uncorrected), revealing that approximately 65% of the voxels were commonly activated by both scans. In contrast, highly selective thresholds (p < 10⁻⁴ to 10⁻⁶) yielded the lowest consistency values with less than 25% overlap of the voxels active in both scans. In other words, intra-subject variability was surprisingly high, with between one third and three quarters of the voxels in a given fROI not corresponding to those activated in the main task. This level of variability stands in striking contrast to the consistency seen in retinotopically-defined areas and has important implications for designing robust but efficient functional localiser scans. PMID:19289173

  8. Design

    DEFF Research Database (Denmark)

    Jensen, Ole B.; Pettiway, Keon

    2017-01-01

    by designers, planners, etc. (staging from above) and mobile subjects (staging from below). A research agenda for studying situated practices of mobility and mobilities design is outlined in three directions: foci of studies, methods and approaches, and epistemologies and frames of thinking. Jensen begins...... with a brief description of how movement is studied within social sciences after the “mobilities turn” versus the idea of physical movement in transport geography and engineering. He then explains how “mobilities design” was derived from connections between traffic and architecture. Jensen concludes...... In this chapter, Ole B. Jensen takes a situational approach to mobilities to examine how ordinary life activities are structured by technology and design. Using “staging mobilities” as a theoretical approach, Jensen considers mobilities as overlapping actions, interactions and decisions...

  9. Design

    DEFF Research Database (Denmark)

    Volf, Mette

    Design - proces & metode iBog®  er enestående i sit fokus på afmystificering og operationalisering af designprocessens flygtige og komplekse karakter. Udgivelsen går bag om designerens daglige arbejde og giver et indblik i den kreative skabelsesproces, som designeren er en del af. Udover et bredt...... indblik i designerens arbejdsmetoder og designparametre giver Design - proces & metode en række eksempler fra anerkendte designvirksomheder, der gør det muligt at komme helt tæt på designerens virkelighed....

  10. Design

    Science.gov (United States)

    Buchanan, Richard; Cross, Nigel; Durling, David; Nelson, Harold; Owen, Charles; Valtonen, Anna; Boling, Elizabeth; Gibbons, Andrew; Visscher-Voerman, Irene

    2013-01-01

    Scholars representing the field of design were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Richard Buchanan, Nigel Cross, David Durling, Harold Nelson, Charles Owen, and Anna Valtonen. Scholars…

  12. Overview to the two-part series: Measurement equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS® short forms

    Directory of Open Access Journals (Sweden)

    Bryce B. Reeve

    2016-03-01

    Full Text Available Measurement equivalence across differing socio-demographic groups is essential for valid assessment. This is one of two issues of Psychological Test and Assessment Modeling that contain articles describing methods and substantive findings related to establishing measurement equivalence in self-reported health, mental health and social functioning measures. The articles in this two-part series describe analyses of items assessing eight domains: fatigue, depression, anxiety, sleep, pain, physical function, cognitive concerns and social function. Additionally, two overview articles describe the methods and sample characteristics of the data set used in these analyses. An additional article describes the important topic of assessing the magnitude and impact of differential item functioning. These articles provide the first strong evidence supporting the measurement equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) short form measures in ethnically, socio-demographically diverse groups, and are a beginning step in meeting the international call for further study of their performance in such groups.

  13. Price Discrimination Methods Based on a Two-Part Tariff

    Institute of Scientific and Technical Information of China (English)

    彭云飞

    2012-01-01

    Firms that price their products with a two-part tariff widely adopt second-degree and third-degree price discrimination, and mixed price discrimination also exists. The research shows that, under a revenue-maximization objective, implementing these price discrimination methods requires the demand elasticities of the fixed fee and the usage price, the number of users, the revenue per unit time, and the user and tariff intervals each to satisfy different conditions. These conclusions provide a theoretical basis and a decision-making reference for firms implementing two-part price discrimination.
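
    As a textbook point of reference rather than the paper's model, with identical consumers and linear demand the profit-maximizing two-part tariff prices usage at marginal cost and sets the fixed fee equal to the resulting consumer surplus:

        def optimal_two_part_tariff(a, b, c):
            """Textbook benchmark with identical consumers and demand q = a - b*p:
            usage price = marginal cost, fixed fee = consumer surplus at that price."""
            p = c                       # marginal-cost pricing on the usage part
            q = a - b * p               # quantity each consumer buys
            F = 0.5 * q * (a / b - p)   # triangle of consumer surplus, charged as fee
            return p, F

        p, F = optimal_two_part_tariff(a=100.0, b=2.0, c=10.0)
        print(p, F)   # p = 10.0, F = 0.5 * 80 * 40 = 1600.0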

  14. Volume Haptics with Topology-Consistent Isosurfaces.

    Science.gov (United States)

    Corenthy, Loc; Otaduy, Miguel A; Pastor, Luis; Garcia, Marcos

    2015-01-01

    Haptic interfaces offer an intuitive way to interact with and manipulate 3D datasets, and may simplify the interpretation of visual information. This work proposes an algorithm to provide haptic feedback directly from volumetric datasets, as an aid to regular visualization. The haptic rendering algorithm lets users perceive isosurfaces in volumetric datasets, and it relies on several design features that ensure a robust and efficient rendering. A marching tetrahedra approach enables the dynamic extraction of a piecewise linear continuous isosurface. Robustness is achieved using a continuous collision detection step coupled with state-of-the-art proxy-based rendering methods over the extracted isosurface. The introduced marching tetrahedra approach guarantees that the extracted isosurface will match the topology of an equivalent isosurface computed using trilinear interpolation. The proposed haptic rendering algorithm improves the consistency between haptic and visual cues by computing a second proxy on the isosurface displayed on screen. Our experiments demonstrate the improvements on the isosurface extraction stage as well as the robustness and the efficiency of the complete algorithm.

  15. Institutional transfer from the European design practices to Ukraine and Moldova: the case of hospital design.

    Science.gov (United States)

    Plugaru, Rodica

    2010-01-01

    This article explores the development of post-Soviet hospital design through the analysis of recent modernisations in Moldova and Ukraine. It consists of two parts: first, an introduction to the definition of hospital design and its main characteristics during the Soviet period; second, a presentation of two hospital modernisations in Ukraine and Moldova. In a comparative perspective, the paper presents the actors involved, the difficulties of modernising hospitals under the inherited rules, and the solutions advanced to implement change. This introduction to hospital design in Moldova and Ukraine allows an in-depth study of the involvement of international actors in the post-communist transformations.

  16. Consistent estimation of Gibbs energy using component contributions.

    Science.gov (United States)

    Noor, Elad; Haraldsdóttir, Hulda S; Milo, Ron; Fleming, Ronan M T

    2013-01-01

    Standard Gibbs energies of reactions are increasingly being used in metabolic modeling for applying thermodynamic constraints on reaction rates, metabolite concentrations and kinetic parameters. The increasing scope and diversity of metabolic models has led scientists to look for genome-scale solutions that can estimate the standard Gibbs energy of all the reactions in metabolism. Group contribution methods greatly increase coverage, albeit at the price of decreased precision. We present here a way to combine the estimations of group contribution with the more accurate reactant contributions by decomposing each reaction into two parts and applying one of the methods on each of them. This method gives priority to the reactant contributions over group contributions while guaranteeing that all estimations will be consistent, i.e. will not violate the first law of thermodynamics. We show that there is a significant increase in the accuracy of our estimations compared to standard group contribution. Specifically, our cross-validation results show an 80% reduction in the median absolute residual for reactions that can be derived by reactant contributions only. We provide the full framework and source code for deriving estimates of standard reaction Gibbs energy, as well as confidence intervals, and believe this will facilitate the wide use of thermodynamic data for a better understanding of metabolism.
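
    A toy numerical sketch of the decomposition idea (matrices are hypothetical; the published framework works with full stoichiometric and group-incidence matrices): project the query reaction onto the span of reactions covered by the more accurate reactant contributions, and hand only the residual to a group-contribution estimator:

        import numpy as np

        def component_estimate(x, S_rc, dG_rc, gc_estimate):
            """Estimate a reaction Gibbs energy by splitting the reaction x into a
            part inside span(S_rc), covered by reactant contributions, and an
            orthogonal residual handed to a group-contribution estimator."""
            coef, *_ = np.linalg.lstsq(S_rc, x, rcond=None)
            x_rc = S_rc @ coef          # projection onto the RC-covered reactions
            x_gc = x - x_rc             # residual left for group contribution
            return coef @ dG_rc + gc_estimate(x_gc)

        S_rc = np.array([[1.0, 0.0], [-1.0, 1.0], [0.0, -1.0]])  # known reactions
        dG_rc = np.array([-5.0, 3.0])                            # their RC estimates
        x = np.array([1.0, -1.0, 0.0])                           # query reaction
        print(component_estimate(x, S_rc, dG_rc, lambda r: 10.0 * np.abs(r).sum()))
        # ~ -5.0: x lies in the RC span, so the GC term contributes ~nothing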

  17. Consistent estimation of Gibbs energy using component contributions.

    Directory of Open Access Journals (Sweden)

    Elad Noor

    Full Text Available Standard Gibbs energies of reactions are increasingly being used in metabolic modeling for applying thermodynamic constraints on reaction rates, metabolite concentrations and kinetic parameters. The increasing scope and diversity of metabolic models has led scientists to look for genome-scale solutions that can estimate the standard Gibbs energy of all the reactions in metabolism. Group contribution methods greatly increase coverage, albeit at the price of decreased precision. We present here a way to combine the estimations of group contribution with the more accurate reactant contributions by decomposing each reaction into two parts and applying one of the methods on each of them. This method gives priority to the reactant contributions over group contributions while guaranteeing that all estimations will be consistent, i.e. will not violate the first law of thermodynamics. We show that there is a significant increase in the accuracy of our estimations compared to standard group contribution. Specifically, our cross-validation results show an 80% reduction in the median absolute residual for reactions that can be derived by reactant contributions only. We provide the full framework and source code for deriving estimates of standard reaction Gibbs energy, as well as confidence intervals, and believe this will facilitate the wide use of thermodynamic data for a better understanding of metabolism.

  18. Balance fatigue design of cast steel nodes in tubular steel structures.

    Science.gov (United States)

    Wang, Libin; Jin, Hui; Dong, Haiwei; Li, Jing

    2013-01-01

    Cast steel nodes are becoming increasingly popular in steel structure joints owing to their advanced mechanical performance and flexible forms. This kind of joint markedly improves the structural fatigue resistance and is expected to be widely used in structures subject to fatigue loading. A cast steel node joint consists of two parts: the casting itself and the welds between the node and the steel member. The fatigue resistances of these two parts are very different; the experimental results showed very clearly that the fatigue behavior was governed by the welds in all tested configurations. This paper focuses on the balance fatigue design of these two parts of a cast steel node joint using fracture mechanics and FEM. Defects in the castings are conservatively modeled as cracks. The final crack size is taken as the minimum of 90% of the wall thickness and the value deduced from the fracture toughness. The allowable initial crack size can then be obtained by integrating the Paris equation with the crack propagation life set equal to the weld fatigue life; in this way, the two parts of a cast steel node joint attain a balanced fatigue life.
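
    A sketch of the balance calculation under assumed material constants (all parameter values below are hypothetical): integrate Paris' law to get the crack propagation life, and bisect for the largest initial crack size whose life still equals the weld fatigue life:

        import math

        def propagation_life(a_i, a_f, C, m, Y, dsigma, steps=5000):
            """Cycles to grow a crack from a_i to a_f under Paris' law
            da/dN = C * dK**m, with dK = Y * dsigma * sqrt(pi * a)."""
            da = (a_f - a_i) / steps
            N, a = 0.0, a_i
            for _ in range(steps):
                dK = Y * dsigma * math.sqrt(math.pi * (a + 0.5 * da))  # midpoint rule
                N += da / (C * dK ** m)
                a += da
            return N

        def allowable_initial_crack(N_weld, a_f, C, m, Y, dsigma):
            """Largest a_i whose propagation life equals the weld fatigue life
            N_weld -- the balance condition -- found by bisection."""
            lo, hi = 1e-6, 0.99 * a_f
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if propagation_life(mid, a_f, C, m, Y, dsigma) > N_weld:
                    lo = mid   # crack outlives the weld: a larger defect is allowable
                else:
                    hi = mid
            return lo

        # Hypothetical steel parameters (units: m, MPa, cycles).
        print(allowable_initial_crack(N_weld=2e6, a_f=0.02,
                                      C=3e-12, m=3.0, Y=1.0, dsigma=80.0))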

  19. Epilogue to the two-part series: Measurement equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS® short forms

    Directory of Open Access Journals (Sweden)

    Jeanne A. Teresi

    2016-06-01

    Full Text Available The articles in this two-part series of Psychological Test and Assessment Modeling describe the psychometric performance and measurement equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) short form measures in ethnically, socio-demographically diverse groups of cancer patients. Measures in eight health-related quality of life domains were evaluated: fatigue, depression, anxiety, cognition, pain, sleep, and physical and social function. State-of-the-art latent variable methods, most based on item response theory and described in two methods overview articles in this series, were used to examine differential item functioning (DIF). Findings were generally supportive of the performance of the PROMIS measures. Although the use of powerful methods and large samples resulted in the identification of many items with DIF, practically none were identified with high magnitude. The aggregate-level impact of DIF was small, and minimal individual impact was detected. Some methodological challenges were encountered involving positively and negatively worded items, but most were resolved through modest item removal. Sensitivity analyses showed minimal impact of model assumption violation on the results presented. A cautionary note is the observance of a few instances of individual-level impact of DIF in the analyses of depression, anxiety, and pain, and one instance of aggregate-level impact just below threshold in the analyses of physical function. Although this sample of over 5,000 individuals was ethnically diverse, a limitation was the inability to examine language groups other than Spanish and English and specific ethnic subgroups within the Hispanic, Asian/Pacific Islander, and Black subsamples. Extensive qualitative and quantitative analyses were performed in the development of PROMIS item banks. These sets of analyses, performed by several teams of psychometricians, statisticians, and qualitative experts, were the

  20. Carbohydrate intake, obesity, metabolic syndrome and cancer risk? A two-part systematic review and meta-analysis protocol to estimate attributability.

    Science.gov (United States)

    Sartorius, B; Sartorius, K; Aldous, C; Madiba, T E; Stefan, C; Noakes, T

    2016-01-04

    required. The final results of this two part systematic review (plus multiplicative calculations) will be published in a relevant international peer-reviewed journal. PROSPERO CRD42015023257. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  1. MDCC: Multi-Data Center Consistency

    CERN Document Server

    Kraska, Tim; Franklin, Michael J; Madden, Samuel

    2012-01-01

    Replicating data across multiple data centers not only allows moving the data closer to the user and, thus, reduces latency for applications, but also increases the availability in the event of a data center failure. Therefore, it is not surprising that companies like Google, Yahoo, and Netflix already replicate user data across geographically different regions. However, replication across data centers is expensive. Inter-data center network delays are in the hundreds of milliseconds and vary significantly. Synchronous wide-area replication is therefore considered to be unfeasible with strong consistency and current solutions either settle for asynchronous replication which implies the risk of losing data in the event of failures, restrict consistency to small partitions, or give up consistency entirely. With MDCC (Multi-Data Center Consistency), we describe the first optimistic commit protocol, which does not require a master or partitioning and is strongly consistent at a cost similar to eventually consiste...

  2. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique for improving the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating several equations simultaneously with respect to the same number of variables.
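
    In its scalar form, hull consistency amounts to solving each equation for one variable over intervals and intersecting with the current domain; a minimal sketch for the single constraint x * y = c over positive intervals:

        def hull_consistency_mul(x, y, c, sweeps=10):
            """Scalar hull consistency for x * y = c over positive interval
            domains: solve the equation for each variable in turn and
            intersect with its current domain."""
            (xl, xu), (yl, yu) = x, y
            for _ in range(sweeps):
                xl, xu = max(xl, c / yu), min(xu, c / yl)   # from x = c / y
                yl, yu = max(yl, c / xu), min(yu, c / xl)   # from y = c / x
                if xl > xu or yl > yu:
                    return None          # domains became empty: inconsistent
            return (xl, xu), (yl, yu)

        print(hull_consistency_mul((1.0, 4.0), (1.0, 10.0), 6.0))
        # ((1.0, 4.0), (1.5, 6.0)): y is narrowed, x already hull-consistent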

  3. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  4. Consistent truncations with massive modes and holography

    CERN Document Server

    Cassani, Davide; Faedo, Anton F

    2011-01-01

    We review the basic features of some recently found consistent Kaluza-Klein truncations including massive modes. We emphasize the general ideas underlying the reduction procedure, then we focus on type IIB supergravity on 5-dimensional manifolds admitting a Sasaki-Einstein structure, which leads to half-maximal gauged supergravity in five dimensions. Finally, we comment on the holographic picture of consistency.

  5. CONSISTENT AGGREGATION IN FOOD DEMAND SYSTEMS

    OpenAIRE

    Levedahl, J. William; Reed, Albert J.; Clark, J. Stephen

    2002-01-01

    Two aggregation schemes for food demand systems are tested for consistency with the Generalized Composite Commodity Theorem (GCCT). One scheme is based on the standard CES classification of food expenditures. The second scheme is based on the Food Guide Pyramid. Evidence is found that both schemes are consistent with the GCCT.

  6. A Framework of Memory Consistency Models

    Institute of Scientific and Technical Information of China (English)

    胡伟武; 施巍松; 等

    1998-01-01

    Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models which describes memory consistency at the behaviour level. Based on the understanding that the behaviour of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a given consistency model is also defined. The synchronization order, together with the program order, determines the behaviour of an execution. This paper also presents criteria for correct programs and correct implementations of consistency models. Regarding an implementation of a consistency model as a set of memory event ordering constraints, this paper provides a method for proving the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.
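
    As a toy rendering of this behaviour-level view (an illustration, not the paper's framework), sequential consistency can be characterized by the outcomes reachable through interleavings that respect program order; the classic store-buffering litmus test has an outcome no such interleaving produces:

        from itertools import permutations

        def sc_outcomes(threads):
            """All sequentially consistent outcomes: interleavings of the threads'
            operations that respect each thread's program order, executed against
            a single shared memory."""
            ops = [(t, i) for t, th in enumerate(threads) for i in range(len(th))]
            results = set()
            for order in permutations(ops):
                if any(order.index((t, i)) > order.index((t, i + 1))
                       for t, th in enumerate(threads) for i in range(len(th) - 1)):
                    continue                       # violates program order
                mem, regs = {'x': 0, 'y': 0}, {}
                for t, i in order:
                    kind, var, reg = threads[t][i]
                    if kind == 'st':
                        mem[var] = 1
                    else:
                        regs[reg] = mem[var]
                results.add((regs['r1'], regs['r2']))
            return results

        # Store-buffering litmus test: each thread writes one flag, reads the other.
        t0 = [('st', 'x', None), ('ld', 'y', 'r1')]
        t1 = [('st', 'y', None), ('ld', 'x', 'r2')]
        print(sc_outcomes([t0, t1]))   # (0, 0) is absent under sequential consistency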

  7. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under arb...

  8. Putting Consistent Theories Together in Institutions

    Institute of Scientific and Technical Information of China (English)

    应明生

    1995-01-01

    The problem of putting consistent theories together in institutions is discussed. A general necessary condition for the consistency of the resulting theory is derived, and some sufficient conditions are given for diagrams of theories whose shapes are tree bundles or directed graphs. Moreover, some transformations from complicated cases to simple ones are established.

  9. Dispersion sensitivity analysis & consistency improvement of APFSDS

    Directory of Open Access Journals (Sweden)

    Sangeeta Sharma Panda

    2017-08-01

    In-bore balloting motion simulation shows that a reduction in residual spin of about 5% results in a drastic 56% reduction in first maximum yaw; a correlation between first maximum yaw and residual spin is observed. The results of the data analysis are used in design modifications to the existing ammunition. A number of designs are evaluated numerically before five are frozen for further study. These designs are critically assessed in terms of their comparative performance during the in-bore travel and external ballistics phases. The results are validated by free-flight trials of the finalised design.

  10. Design and analysis tool validation

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.

    1981-07-01

    The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

  11. Consistent measurements comparing the drift features of noble gas mixtures

    CERN Document Server

    Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y

    1999-01-01

    We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.

  12. On the Initial State and Consistency Relations

    CERN Document Server

    Berezhiani, Lasha

    2014-01-01

    We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.

  13. On the initial state and consistency relations

    Energy Technology Data Exchange (ETDEWEB)

    Berezhiani, Lasha; Khoury, Justin, E-mail: lashaber@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States)

    2014-09-01

    We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.

  14. Self-Consistent Asset Pricing Models

    CERN Document Server

    Malevergne, Y

    2006-01-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alpha's and beta's of the factor model are unobservable. Self-consistency leads to renormalized beta's with zero effective alpha's, which are observable with standard OLS regressions. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value $\alpha_i$ at the origin between an asset $i$'s return and the proxy's return. Self-consistency also introduces "orthogonality" and "normality" conditions linking the beta's, alpha's (as well as the residuals) and the weights of the proxy por...

  15. Quasiparticle self-consistent GW theory.

    Science.gov (United States)

    van Schilfgaarde, M; Kotani, Takao; Faleev, S

    2006-06-09

    In past decades the scientific community has been looking for a reliable first-principles method to predict the electronic structure of solids with high accuracy. Here we present an approach which we call the quasiparticle self-consistent approximation. It is based on a kind of self-consistent perturbation theory, where the self-consistency is constructed to minimize the perturbation. We apply it to selections from different classes of materials, including alkali metals, semiconductors, wide band gap insulators, transition metals, transition metal oxides, magnetic insulators, and rare earth compounds. Apart from some mild exceptions, the properties are very well described, particularly in weakly correlated cases. Self-consistency dramatically improves agreement with experiment, and is sometimes essential. Discrepancies with experiment are systematic, and can be explained in terms of approximations made.

  16. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable...... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how...... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision....

  17. Consistency Relations for Large Field Inflation

    CERN Document Server

    Chiba, Takeshi

    2014-01-01

    Consistency relations for chaotic inflation with a monomial potential, natural inflation, and hilltop inflation are given which involve the scalar spectral index $n_s$, the tensor-to-scalar ratio $r$ and the running of the spectral index $\alpha$. The measurement of $\alpha$ with $O(10^{-3})$ precision and the improvement in the measurement of $n_s$ could discriminate the monomial model from natural/hilltop inflation models. A consistency region for general large field models is also presented.

  18. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    Science.gov (United States)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  19. Research on Closed-Loop Supply Chain Coordination with a Two-Part Tariff Contract under Disruptions

    Institute of Scientific and Technical Information of China (English)

    牟宗玉; 曹德弼; 刘晓冰; 李新然

    2013-01-01

    This paper studies the coordination of a closed-loop supply chain (CLSC) consisting of one manufacturer and one retailer who collects the used products. We first design a model in which the CLSC is coordinated by a two-part tariff contract in a static environment. Based on this model, we derive the optimal profits via the Karush-Kuhn-Tucker conditions and obtain the optimal response strategy of the centralized CLSC under demand disruptions caused by an emergent event. The conclusion shows that a two-part tariff contract signed beforehand can no longer coordinate the decentralized CLSC once a disruption occurs. We therefore give two coordination strategies: in the first, the wholesale price and the transfer price of used products in the original contract are adjusted, with the manufacturer bearing the deviation costs; in the second, the wholesale price and transfer price of the original contract are kept unchanged, with the retailer bearing the deviation costs. In both strategies the manufacturer and the retailer bargain with each other to determine the fixed fee, and coordination of the CLSC under disruptions is thereby achieved. In addition, we prove the feasibility of both strategies.

  20. Consistency and Derangements in Brane Tilings

    CERN Document Server

    Hanany, Amihay; Ramgoolam, Sanjaye; Seong, Rak-Kyeong

    2015-01-01

    Brane tilings describe Lagrangians (vector multiplets, chiral multiplets, and the superpotential) of four-dimensional $\mathcal{N}=1$ supersymmetric gauge theories. These theories, written in terms of a bipartite graph on a torus, correspond to worldvolume theories on $N$ D$3$-branes probing a toric Calabi-Yau threefold singularity. A pair of permutations compactly encapsulates the data necessary to specify a brane tiling. We show that geometric consistency for brane tilings, which ensures that the corresponding quantum field theories are well behaved, imposes constraints on the pair of permutations, restricting certain products constructed from the pair to have no one-cycles. Permutations without one-cycles are known as derangements. We illustrate this formulation of consistency with known brane tilings. Counting formulas for consistent brane tilings with an arbitrary number of chiral bifundamental fields are written down in terms of delta functions over symmetric groups.
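
    The combinatorial condition itself is elementary to check; the sketch below (illustrative, unrelated to the paper's counting formulas) tests permutation products for one-cycles, i.e. fixed points:

        from itertools import permutations

        def is_derangement(p):
            """A permutation (tuple, i -> p[i]) has no one-cycles iff no fixed points."""
            return all(pi != i for i, pi in enumerate(p))

        def compose(p, q):
            """(p o q)[i] = p[q[i]]."""
            return tuple(p[q[i]] for i in range(len(q)))

        # Count pairs (sigma, tau) on 4 elements whose product is a derangement:
        # for each of the 24 choices of sigma, the product ranges over all
        # permutations as tau varies, so the count is 24 times the number of
        # derangements of 4 elements (9).
        n = 4
        count = sum(is_derangement(compose(s, t))
                    for s in permutations(range(n)) for t in permutations(range(n)))
        print(count)   # 216 = 24 * 9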

  1. Quantifying the consistency of scientific databases

    CERN Document Server

    Šubelj, Lovro; Boshkoska, Biljana Mileva; Kastrin, Andrej; Levnajić, Zoran

    2015-01-01

    Science is a social process with far-reaching impact on our modern society. In recent years, for the first time, we have become able to study science itself scientifically. This is enabled by the massive amounts of data on scientific publications that are increasingly becoming available. The data are contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in the mutual consistency of different databases, which we interpret as recipes for future bibliometric studies.

  2. Self-consistent Green's function approaches

    CERN Document Server

    Barbieri, Carlo

    2016-01-01

    We present the fundamental techniques and working equations of many-body Green's function theory for calculating ground state properties and the spectral strength. Green's function methods closely relate to other polynomial-scaling approaches discussed in chapters 8 and 10. However, here we aim directly at a global view of the many-fermion structure. We derive the working equations for calculating many-body propagators, using both the Algebraic Diagrammatic Construction technique and the self-consistent formalism at finite temperature. Their implementation is discussed, as well as the inclusion of three-nucleon interactions. The self-consistency feature is essential to guarantee thermodynamic consistency. The pairing and neutron matter models introduced in previous chapters are solved and compared with the other methods in this book.

  3. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite networks have provided an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms focus only on unidirectional mass diffusion, from the objects a user has already collected to those which should be recommended, resulting in a biased causal similarity estimation and mediocre performance. In this letter, we argue that in many cases a user's interests are stable, and thus the bidirectional mass diffusion abilities, whether originating from the objects already collected or from those to be recommended, should be consistently powerful, showing unbiased consistency. We further propose a consistency-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming state-of-the-art recommendation algorithms on disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
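
    For contrast, a sketch of the classic unidirectional mass-diffusion (ProbS) scoring that the letter improves upon (a generic textbook version; the small matrix and masking convention are assumptions):

        import numpy as np

        def mass_diffusion(A, user):
            """Unidirectional mass diffusion (ProbS) on a users-x-items adjacency
            matrix A: unit resource spreads items -> users -> items."""
            k_item = np.maximum(A.sum(axis=0), 1)   # item degrees
            k_user = np.maximum(A.sum(axis=1), 1)   # user degrees
            f = A[user].astype(float)               # resource on collected items
            on_users = A @ (f / k_item)             # items share mass with users
            scores = A.T @ (on_users / k_user)      # and pass it back to items
            scores[A[user] > 0] = -np.inf           # mask already-collected items
            return scores

        A = np.array([[1, 1, 0, 0],                 # 3 users x 4 items
                      [1, 0, 1, 0],
                      [0, 1, 1, 1]])
        print(mass_diffusion(A, user=0))            # ranks item 2 over item 3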

  4. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

    Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility, established by different authors at different points in time on the basis of some well-known consistency principles, cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility, and finally aims to suggest a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of the consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.

  5. Consistent matter couplings for Plebanski gravity

    CERN Document Server

    Tennie, Felix

    2010-01-01

    We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein--Cartan gravity. As a byproduct we also show the consistency of some previous suggestions for matter actions.

  6. Consistent matter couplings for Plebanski gravity

    Science.gov (United States)

    Tennie, Felix; Wohlfarth, Mattias N. R.

    2010-11-01

    We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein-Cartan gravity. As a by-product we also show the consistency of some previous suggestions for matter actions.

  7. Design of a rotary reactor for chemical-looping combustion. Part 1: Fundamentals and design methodology

    KAUST Repository

    Zhao, Zhenlong

    2014-04-01

    Chemical-looping combustion (CLC) is a novel and promising option for several applications including carbon capture (CC), fuel reforming, H2 generation, etc. Previous studies demonstrated the feasibility of performing CLC in a novel rotary design with micro-channel structures. In the reactor, a solid wheel rotates between the fuel and air streams at the reactor inlet, and depleted air and product streams at exit. The rotary wheel consists of a large number of micro-channels with oxygen carriers (OC) coated on the inner surface of the channel walls. In the CC application, the OC oxidizes the fuel while the channel is in the fuel zone to generate undiluted CO2, and is regenerated while the channel is in the air zone. In this two-part series, the effect of the reactor design parameters is evaluated and its performance with different OCs is compared. In Part 1, the design objectives and criteria are specified and the key parameters controlling the reactor performance are identified. The fundamental effects of the OC characteristics, the design parameters, and the operating conditions are studied. The design procedures are presented on the basis of the relative importance of each parameter, enabling a systematic methodology of selecting the design parameters and the operating conditions with different OCs. Part 2 presents the application of the methodology to the designs with the three commonly used OCs, i.e., nickel, copper, and iron, and compares the simulated performances of the designs. © 2013 Elsevier Ltd. All rights reserved.

  8. Optimizing the value of the posterior condylar offset, proximal tibial resection and slope in order to achieve the right balance of the posterior cruciate ligament - clinical application of the molding function of the two parts of the PCL

    Science.gov (United States)

    Bogojevski, Ljubomir; Doksevska, Milena Bogojevska

    2017-01-01

    Introduction: Achieving the right balance of the posterior cruciate ligament using the skeletal method is very difficult, almost impossible (Mahoney). Our hypothesis for the right balance of the PCL using the skeletal method is based on several defined facts: the PCL is a union of two anatomically independent but functionally synergic parts, the posteromedial and the anterolateral part; the length of the posteromedial part of the PCL is determined by the medial compartment to which it belongs, and is shortest in varus and longest in valgus deformation; the length of the centrally placed anterolateral part of the PCL is unchangeable (ca. 38 mm) in every knee and is independent of the anatomical appearance (deformation); the cylindrical shape of the distal posterior part of the femur (Ficat) depends on the molding function of the PCL (Kapandji) and results from the proportion of the two parts of the PCL: a shorter posteromedial part with less bone stock on the medial and more on the lateral condyle (varus knee), and vice versa, a longer posteromedial part with more bone stock on the medial condyle and less on the lateral (valgus knee). Accordingly, neutral bone stock is achieved by equalizing the lengths of the two parts (common radius of the cylinder) of the PCL, which is the basis for the interligamentary balance of the posterior cruciate ligament. Methods: The basic characteristics of the interligamentary balance of the PCL that we started in 2008 are the following: 1. The posterior condylar offset is equal to the equalized length of the two parts of the PCL. 2. The value of the proximal tibial resection decreases from 10 in varus to 4-6 in valgus. 3. The femoral valgus cut goes from 6 in excessive varus deformity to 4 in valgus. Results: The clinical evaluation of the cases, divided into the groups excessive varus, mean varus, and valgus type 1, 2 (Krakow), showed the right distribution across the groups of the postoperative ROM and intraoperative

  9. Developing consistent time series landsat data products

    Science.gov (United States)

    The Landsat satellite series has provided a continuous earth observation data record since the early 1970s. There are increasing demands for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20

  10. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available from a lexicon containing variants. In this paper we address both these issues by creating ‘pseudo-phonemes’ associated with sets of ‘generation restriction rules’ to model those pronunciations that are consistently realised as two or more

  11. On Consistency Maintenance In Service Discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan

    2005-01-01

    Communication and node failures degrade the ability of a service discovery protocol to ensure users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency. We

  12. On Consistency Maintenance In Service Discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan

    Communication and node failures degrade the ability of a service discovery protocol to ensure users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency. We

  13. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, P.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  14. Computing Rooted and Unrooted Maximum Consistent Supertrees

    CERN Document Server

    van Iersel, Leo

    2009-01-01

    A chief problem in phylogenetics and database theory is the computation of a maximum consistent tree from a set of rooted or unrooted trees. Standard inputs are triplets, rooted binary trees on three leaves, or quartets, unrooted binary trees on four leaves. We give exact algorithms constructing rooted and unrooted maximum consistent supertrees in time O(2^n n^5 m^2 log(m)) for a set of m triplets (quartets), each one distinctly leaf-labeled by some subset of n labels. The algorithms extend to weighted triplets (quartets). We further present fast exact algorithms for constructing rooted and unrooted maximum consistent trees in polynomial space. Finally, for a set T of m rooted or unrooted trees with maximum degree D, distinctly leaf-labeled by some subset of a set L of n labels, we compute, in O(2^{mD} n^m m^5 n^6 log(m)) time, a tree distinctly leaf-labeled by a maximum-size subset X of L such that all trees in T, when restricted to X, are consistent with it.
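
    As a small companion to the supertree problem, the following Python sketch checks whether a single rooted triplet ab|c is displayed by a given tree, which is the building block over which consistency is defined; the tree encoding and labels are illustrative assumptions.

    ```python
    def ancestors(parent, v):
        """Path from leaf v up to the root; the tree is a child -> parent dict."""
        path = [v]
        while v in parent:
            v = parent[v]
            path.append(v)
        return path

    def lca(parent, a, b):
        anc_b = set(ancestors(parent, b))
        return next(x for x in ancestors(parent, a) if x in anc_b)

    def depth(parent, v):
        return len(ancestors(parent, v)) - 1

    def displays_triplet(parent, a, b, c):
        """True iff the rooted triplet ab|c is displayed by the tree, i.e.
        lca(a, b) lies strictly below lca(a, c)."""
        return depth(parent, lca(parent, a, b)) > depth(parent, lca(parent, a, c))

    # Toy tree ((a,b),c) with root r and internal node x:
    parent = {"a": "x", "b": "x", "x": "r", "c": "r"}
    print(displays_triplet(parent, "a", "b", "c"))  # True
    print(displays_triplet(parent, "a", "c", "b"))  # False
    ```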

  15. Addendum to "On the consistency of MPS"

    CERN Document Server

    Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L

    2013-01-01

    The analogies between the Moving Particle Semi-implicit method (MPS) and Incompressible Smoothed Particle Hydrodynamics method (ISPH) are established in this note, as an extension of the MPS consistency analysis conducted in "Souto-Iglesias et al., Computer Physics Communications, 184(3), 2013."

  16. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of αs1-casein was proportional to the rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated with the breakdown of

  17. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  18. A self-consistent Maltsev pulse model

    Science.gov (United States)

    Buneman, O.

    1985-04-01

    A self-consistent model for an electron pulse propagating through a plasma is presented. In this model, the charge imbalance between plasma ions, plasma electrons and pulse electrons creates the travelling potential well in which the pulse electrons are trapped.

  19. Properties and Update Semantics of Consistent Views

    Science.gov (United States)

    1985-09-01

    Properties and Update Semantics of Consistent Views. G. Gottlob, Institute for Applied Mathematics, C.N.R., Genova, Italy. Cites: Gottlob G., Paolini P., Zicari R., "Proving Properties of Programs on Database Views", Dipartimento di Elettronica, Politecnico di Milano (in

  20. Consistency Analysis of Network Traffic Repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for var

  1. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Pras, A.

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for vario

  2. Autonomous Navigation with Constrained Consistency for C-Ranger

    Directory of Open Access Journals (Sweden)

    Shujing Zhang

    2014-06-01

    Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their synthetic ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered to be a pivotal requirement for AUVs to have truly autonomous navigation. However, the consistency problem of the SLAM system has been greatly ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which is developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.

  3. Autonomous Navigation with Constrained Consistency for C-Ranger

    Directory of Open Access Journals (Sweden)

    Shujing Zhang

    2014-06-01

    Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their synthetic ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered to be a pivotal requirement for AUVs to have truly autonomous navigation. However, the consistency problem of the SLAM system has been greatly ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which is developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
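
    The algorithmic core of the LC-EKF idea, as described in the abstract, is that Jacobians are evaluated at linearization points frozen for a local time period rather than at the latest estimates. A minimal Python sketch of one such measurement update follows; the measurement model and all numbers are hypothetical.

    ```python
    import numpy as np

    def ekf_update(x, P, z, h, H_jac, R, x_lin):
        """EKF measurement update with the Jacobian evaluated at a fixed
        linearization point x_lin (held constant over a local period),
        rather than at the latest estimate x as in a standard EKF."""
        H = H_jac(x_lin)                     # key difference from a standard EKF
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x_new = x + K @ (z - h(x))
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # Hypothetical 1D example: state = [robot position, landmark position],
    # measurement = range between them.
    h = lambda x: np.array([x[1] - x[0]])
    H_jac = lambda x_lin: np.array([[-1.0, 1.0]])   # frozen over the local period
    x, P = np.array([0.0, 5.0]), np.eye(2)
    z, R = np.array([4.8]), np.array([[0.1]])
    x, P = ekf_update(x, P, z, h, H_jac, R, x_lin=x.copy())
    print(x)
    ```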

  4. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  5. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
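
    For illustration, a regression of this kind can be reproduced in miniature with ordinary least squares on synthetic data; the variable names and coefficient signs below merely mimic the reported findings and are not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 212                                   # sample size matching the study
    effort = rng.normal(600, 120, n)          # total minutes online
    consistency = rng.normal(50, 15, n)       # time variation (lower = steadier)
    gpa = rng.uniform(2.0, 4.0, n)            # proxy for motivation
    marginal = rng.normal(10, 5, n)           # post-test minus pre-test
    grade = (40 - 0.2 * consistency + 10 * gpa + 0.5 * marginal
             + rng.normal(0, 5, n))           # synthetic outcome

    X = np.column_stack([np.ones(n), effort, consistency, gpa, marginal])
    beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
    print(dict(zip(["const", "effort", "consistency", "gpa", "marginal"],
                   beta.round(3))))
    ```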

  6. Consistence beats causality in recommender systems

    CERN Document Server

    Zhu, Xuzhen; Hu, Zheng; Zhang, Ping; Zhou, Tao

    2015-01-01

    The explosive growth of information challenges people's capability of finding items fitting their own interests. Recommender systems provide an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. Recommendation algorithms usually embody the causality from what has been collected to what should be recommended. In this article, we argue that in many cases a user's interests are stable, and thus the previous and future preferences are highly consistent. The temporal order of collections then does not necessarily imply a causality relationship. We further propose a consistence-based algorithm that outperforms the state-of-the-art recommendation algorithms on disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.

  7. A supersymmetric consistent truncation for conifold solutions

    CERN Document Server

    Cassani, Davide

    2010-01-01

    We establish a supersymmetric consistent truncation of type IIB supergravity on the T^{1,1} coset space, based on extending the Papadopoulos-Tseytlin ansatz to the full set of SU(2)xSU(2) invariant Kaluza-Klein modes. The five-dimensional model is a gauged N=4 supergravity with three vector multiplets, which incorporates various conifold solutions and is suitable for the study of their dynamics. By analysing the scalar potential we find a family of new non-supersymmetric AdS_5 extrema interpolating between a solution obtained long ago by Romans and a solution employing an Einstein metric on T^{1,1} different from the standard one. Finally, we discuss some simple consistent subtruncations preserving N=2 supersymmetry. One of them is compatible with the inclusion of smeared D7-branes.

  8. Temporally consistent segmentation of point clouds

    Science.gov (United States)

    Owens, Jason L.; Osteen, Philip R.; Daniilidis, Kostas

    2014-06-01

    We consider the problem of generating temporally consistent point cloud segmentations from streaming RGB-D data, where every incoming frame extends existing labels to new points or contributes new labels while maintaining the labels for pre-existing segments. Our approach generates an over-segmentation based on voxel cloud connectivity, where a modified k-means algorithm selects supervoxel seeds and associates similar neighboring voxels to form segments. Given the data stream from a potentially mobile sensor, we solve for the camera transformation between consecutive frames using a joint optimization over point correspondences and image appearance. The aligned point cloud may then be integrated into a consistent model coordinate frame. Previously labeled points are used to mask incoming points from the new frame, while new and previous boundary points extend the existing segmentation. We evaluate the algorithm on newly-generated RGB-D datasets.

  9. Foundations of consistent couple stress theory

    CERN Document Server

    Hadjesfandiari, Ali R

    2015-01-01

    In this paper, we examine the recently developed skew-symmetric couple stress theory and demonstrate its inner consistency, natural simplicity and fundamental connection to classical mechanics. This hopefully will help the scientific community to overcome any ambiguity and skepticism about this theory, especially the validity of the skew-symmetric character of the couple-stress tensor. We demonstrate that in a consistent continuum mechanics, the response of infinitesimal elements of matter at each point decomposes naturally into a rigid body portion, plus the relative translation and rotation of these elements at adjacent points of the continuum. This relative translation and rotation captures the deformation in terms of stretches and curvatures, respectively. As a result, the continuous displacement field and its corresponding rotation field are the primary variables, which remarkably is in complete alignment with rigid body mechanics, thus providing a unifying basis. For further clarification, we also exami...

  10. Consistent Linearized Gravity in Brane Backgrounds

    CERN Document Server

    Aref'eva, I Ya; Mück, W; Viswanathan, K S; Volovich, I V

    2000-01-01

    A globally consistent treatment of linearized gravity in the Randall-Sundrum background with matter on the brane is formulated. Using a novel gauge, in which the transverse components of the metric are non-vanishing, the brane is kept straight. We analyze the gauge symmetries and identify the physical degrees of freedom of gravity. Our results underline the necessity for non-gravitational confinement of matter to the brane.

  11. Self-consistent model of fermions

    CERN Document Server

    Yershov, V N

    2002-01-01

    We discuss a composite model of fermions based on three-flavoured preons. We show that the opposite character of the Coulomb and strong interactions between these preons leads to the formation of complex structures reproducing three generations of quarks and leptons with all their quantum numbers and masses. The model is self-consistent (it doesn't use input parameters). Nevertheless, the masses of the generated structures match the experimental values.

  12. Consistent formulation of the spacelike axial gauge

    Energy Technology Data Exchange (ETDEWEB)

    Burnel, A.; Van der Rest-Jaspers, M.

    1983-12-15

    The usual formulation of the spacelike axial gauge is afflicted with the difficulty that the metric is indefinite while no ghost is involved. We solve this difficulty by introducing a ghost whose elimination is such that the metric becomes positive for physical states. The technique consists in the replacement of the gauge condition $n\cdot A = 0$ by the weaker one $\partial_0 (n\cdot A) \approx 0$.

  13. Security Policy: Consistency, Adjustments and Restraining Factors

    Institute of Scientific and Technical Information of China (English)

    Yang; Jiemian

    2004-01-01

    In the 2004 U.S. presidential election, despite deeply divided domestic opinions and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. It is obvious, based on the author's analysis, that the security agenda, such as counter-terrorism and the Iraqi issue, contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will basically be consistent.

  14. Self-consistent structure of metallic hydrogen

    Science.gov (United States)

    Straus, D. M.; Ashcroft, N. W.

    1977-01-01

    A calculation is presented of the total energy of metallic hydrogen for a family of face-centered tetragonal lattices carried out within the self-consistent phonon approximation. The energy of proton motion is large and proper inclusion of proton dynamics alters the structural dependence of the total energy, causing isotropic lattices to become favored. For the dynamic lattice the structural dependence of terms of third and higher order in the electron-proton interaction is greatly reduced from static lattice equivalents.

  15. Radiometric consistency assessment of hyperspectral infrared sounders

    OpenAIRE

    Wang, L.; Y. Han; Jin, X.; Y. Chen; D. A. Tremblay

    2015-01-01

    The radiometric and spectral consistency among the Atmospheric Infrared Sounder (AIRS), the Infrared Atmospheric Sounding Interferometer (IASI), and the Cross-track Infrared Sounder (CrIS) is fundamental for the creation of long-term infrared (IR) hyperspectral radiance benchmark datasets for both inter-calibration and climate-related studies. In this study, the CrIS radiance measurements on Suomi National Polar-orbiting Partnership (SNPP) satellite are directly com...

  16. The internal consistency of perfect competition

    OpenAIRE

    Jakob Kapeller; Stephan Pühringer

    2010-01-01

    This article surveys some arguments brought forward in defense of the theory of perfect competition. While some critics propose that the theory of perfect competition, and thus also the theory of the firm, are logically flawed, (mainstream) economists defend their most popular textbook model by a series of apparently different arguments. Here it is examined whether these arguments are comparable, consistent and convincing from the point of view of philosophy of science.

  17. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  18. Dynamic consistency for Stochastic Optimal Control problems

    CERN Document Server

    Carpentier, Pierre; Cohen, Guy; De Lara, Michel; Girardeau, Pierre

    2010-01-01

    For a sequence of dynamic optimization problems, we aim at discussing a notion of consistency over time. This notion can be informally introduced as follows. At the very first time step $t_0$, the decision maker formulates an optimization problem that yields optimal decision rules for all the forthcoming time steps $t_0, t_1, ..., T$; at the next time step $t_1$, he is able to formulate a new optimization problem starting at time $t_1$ that yields a new sequence of optimal decision rules. This process can be continued until the final time $T$ is reached. A family of optimization problems formulated in this way is said to be time consistent if the optimal strategies obtained when solving the original problem remain optimal for all subsequent problems. The notion of time consistency, well-known in the field of Economics, has been recently introduced in the context of risk measures, notably by Artzner et al. (2007), and studied in the Stochastic Programming framework by Shapiro (2009) and for Markov Decision Processes...
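
    A worked statement of the underlying idea, under the usual stochastic dynamic programming assumptions rather than the paper's exact formalism: time consistency holds when the family of problems admits a Bellman recursion, so that tails of optimal strategies remain optimal for the subproblems started along the optimal path.

    ```latex
    % Bellman recursion (sketch): stage cost L_t, dynamics f_t, noise \xi_t,
    % terminal cost K. If every problem in the family solves this recursion,
    % the tail of an optimal strategy stays optimal at each later time step.
    V_T(x) = K(x), \qquad
    V_t(x) = \min_{u \in U_t} \; \mathbb{E}\!\left[ L_t(x, u, \xi_t)
             + V_{t+1}\!\big( f_t(x, u, \xi_t) \big) \right],
    \quad t = T-1, \dots, t_0 .
    ```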

  19. CMB lens sample covariance and consistency relations

    Science.gov (United States)

    Motloch, Pavel; Hu, Wayne; Benoit-Lévy, Aurélien

    2017-02-01

    Gravitational lensing information from the two and higher point statistics of the cosmic microwave background (CMB) temperature and polarization fields are intrinsically correlated because they are lensed by the same realization of structure between last scattering and observation. Using an analytic model for lens sample covariance, we show that there is one mode, separately measurable in the lensed CMB power spectra and lensing reconstruction, that carries most of this correlation. Once these measurements become lens sample variance dominated, this mode should provide a useful consistency check between the observables that is largely free of sampling and cosmological parameter errors. Violations of consistency could indicate systematic errors in the data and lens reconstruction or new physics at last scattering, any of which could bias cosmological inferences and delensing for gravitational waves. A second mode provides a weaker consistency check for a spatially flat universe. Our analysis isolates the additional information supplied by lensing in a model-independent manner but is also useful for understanding and forecasting CMB cosmological parameter errors in the extended Λ cold dark matter parameter space of dark energy, curvature, and massive neutrinos. We introduce and test a simple but accurate forecasting technique for this purpose that neither double counts lensing information nor neglects lensing in the observables.

  20. Beam dynamics design of the main accelerating section with KONUS in the CSR-LINAC

    CERN Document Server

    Xiao-Hu, Zhang; Jia-Wen, Xia; Xue-Jun, Yin; Heng, Du

    2013-01-01

    The CSR-LINAC injector has been proposed at the Heavy Ion Research Facility in Lanzhou (HIRFL). The linac mainly consists of two parts, the RFQ and the IH-DTL. The KONUS (Kombinierte Null Grad Struktur) concept has been introduced into the DTL section. In this paper, the re-matching of the main accelerating section is completed in the 3.7 MeV/u scheme and the new beam dynamics design up to 7 MeV/u is also shown. Through the beam re-matching, the relative emittance growth is greatly suppressed along the linac.

  1. Consistency Relations for the Conformal Mechanism

    CERN Document Server

    Creminelli, Paolo; Khoury, Justin; Simonović, Marko

    2012-01-01

    We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB mu-distortion.

  2. Consistency relations for the conformal mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Creminelli, Paolo [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, 34151, Trieste (Italy); Joyce, Austin; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Simonović, Marko, E-mail: creminel@ictp.it, E-mail: joyceau@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu, E-mail: marko.simonovic@sissa.it [SISSA, via Bonomea 265, 34136, Trieste (Italy)

    2013-04-01

    We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB μ-distortion.

  3. Improving analytical tomographic reconstructions through consistency conditions

    CERN Document Server

    Arcadu, Filippo; Stampanoni, Marco; Marone, Federica

    2016-01-01

    This work introduces and characterizes a fast parameterless filter based on the Helgason-Ludwig consistency conditions, used to improve the accuracy of analytical reconstructions of tomographic undersampled datasets. The filter, acting in the Radon domain, extrapolates intermediate projections between those existing. The resulting sinogram, doubled in views, is then reconstructed by a standard analytical method. Experiments with simulated data prove that the peak-signal-to-noise ratio of the results computed by filtered backprojection is improved up to 5-6 dB, if the filter is used prior to reconstruction.
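
    For reference, a common form of the Helgason-Ludwig consistency conditions for a 2D parallel-beam sinogram is sketched below; the filter in the abstract exploits conditions of this type, though its exact construction is not reproduced here.

    ```latex
    % Helgason–Ludwig consistency (sketch): the n-th moment of the sinogram
    % p_\theta(s) must be a trigonometric polynomial in \theta of degree at
    % most n, containing only frequencies of the same parity as n.
    m_n(\theta) = \int_{-\infty}^{\infty} p_\theta(s)\, s^n \, \mathrm{d}s
                = \sum_{\substack{|k| \le n \\ k \equiv n \ (\mathrm{mod}\ 2)}}
                  a_{n,k}\, e^{i k \theta}.
    ```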

  4. Consistency of non-minimal renormalisation schemes

    CERN Document Server

    Jack, I

    2016-01-01

    Non-minimal renormalisation schemes such as the momentum subtraction scheme (MOM) have frequently been used for physical computations. The consistency of such a scheme relies on the existence of a coupling redefinition linking it to MSbar. We discuss the implementation of this procedure in detail for a general theory and show how to construct the relevant redefinition up to three-loop order, for the case of a general theory of fermions and scalars in four dimensions and a general scalar theory in six dimensions.

  5. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  6. Consistent Predictions of Future Forest Mortality

    Science.gov (United States)

    McDowell, N. G.

    2014-12-01

    We examined empirical and model-based estimates of current and future forest mortality of conifers in the northern hemisphere. Consistent water potential thresholds were found that resulted in mortality of our case study species, pinon pine and one-seed juniper. Extending these results with IPCC climate scenarios suggests that most existing trees in this region (SW USA) will be dead by 2050. Further, independent estimates of future mortality for the entire coniferous biome suggest widespread mortality by 2100. The validity, assumptions, and implications of these results are discussed.

  7. Surface consistent finite frequency phase corrections

    Science.gov (United States)

    Kimman, W. P.

    2016-07-01

    Static time-delay corrections are frequency independent and ignore velocity variations away from the assumed vertical ray path through the subsurface. There is therefore a clear potential for improvement if the finite frequency nature of wave propagation can be properly accounted for. Such a method is presented here based on the Born approximation, the assumption of surface consistency and the misfit of instantaneous phase. The concept of instantaneous phase lends itself very well to sweep-like signals, hence these are the focus of this study. Analytical sensitivity kernels are derived that accurately predict frequency-dependent phase shifts due to P-wave anomalies in the near surface. They are quick to compute and robust near the source and receivers. An additional correction is presented that re-introduces the nonlinear relation between model perturbation and phase delay, which becomes relevant for stronger velocity anomalies. The phase shift as function of frequency is a slowly varying signal, its computation therefore does not require fine sampling even for broad-band sweeps. The kernels reveal interesting features of the sensitivity of seismic arrivals to the near surface: small anomalies can have a relatively large impact resulting from the medium field term that is dominant near the source and receivers. Furthermore, even simple velocity anomalies can produce a distinct frequency-dependent phase behaviour. Unlike statics, the predicted phase corrections are smooth in space. Verification with spectral element simulations shows an excellent match for the predicted phase shifts over the entire seismic frequency band. Applying the phase shift to the reference sweep corrects for wavelet distortion, making the technique akin to surface consistent deconvolution, even though no division in the spectral domain is involved. As long as multiple scattering is mild, surface consistent finite frequency phase corrections outperform traditional statics for moderately large
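
    A minimal Python sketch of the instantaneous-phase ingredient, using the analytic signal from a Hilbert transform on a synthetic chirp; the Born-approximation sensitivity kernels themselves are beyond this illustration, and all signal parameters are invented.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 500.0                                   # sampling rate, Hz
    t = np.arange(0, 4.0, 1 / fs)
    sweep = np.sin(2 * np.pi * (5 * t + 2 * t**2))              # 5 -> 21 Hz chirp
    delayed = np.sin(2 * np.pi * (5 * (t - 0.02) + 2 * (t - 0.02) ** 2))

    # Instantaneous phase via the analytic signal; the phase misfit between an
    # observed trace and the reference sweep is a slowly varying function.
    phase_ref = np.unwrap(np.angle(hilbert(sweep)))
    phase_obs = np.unwrap(np.angle(hilbert(delayed)))
    dphi = phase_obs - phase_ref
    print(float(dphi[500:1500].mean()))          # average mid-sweep phase shift
    ```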

  8. Are there consistent models giving observable NSI ?

    CERN Document Server

    Martinez, Enrique Fernandez

    2013-01-01

    While the existing direct bounds on neutrino NSI are rather weak, of order $10^{-1}$ for propagation and $10^{-2}$ for production and detection, the close connection between these interactions and new NSI affecting the better-constrained charged lepton sector through gauge invariance makes these bounds hard to saturate in realistic models. Indeed, Standard Model extensions leading to neutrino NSI typically imply constraints at the $10^{-3}$ level. The question of whether there are consistent models leading to observable neutrino NSI naturally arises and was discussed in a dedicated session at NUFACT 11. Here we summarize that discussion.

  9. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault...

  10. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    The physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to the physical consistency as a characteristic to be evaluated to fulfil regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed in sludges, so the development of analytical procedures to define the boundary between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity ones. (author)

  11. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Full Text Available Abstract. Background: Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results: Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions: The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
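
    As a toy illustration of the lysine-to-arginine signal, directed substitutions can be counted from paired alignments; the two sequences below are invented, not data from the study.

    ```python
    from collections import Counter

    # Count directed substitutions between aligned mesophile/thermophile
    # protein sequences (gap characters '-' are skipped).
    meso  = "MKKLVEAK-QD"
    therm = "MRRLVEAKGQD"

    subs = Counter((a, b) for a, b in zip(meso, therm)
                   if a != b and '-' not in (a, b))
    print(subs[("K", "R")], subs)   # K -> R occurs twice in this toy pair
    ```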

  12. Viewpoint Consistency: An Eye Movement Study

    Directory of Open Access Journals (Sweden)

    Filipe Cristino

    2012-05-01

    Full Text Available Eye movements have been widely studied, using images and videos in laboratories or portable eye trackers in the real world. Although a good understanding of the saccadic system and extensive models of gaze have been developed over the years, only a few studies have focused on the consistency of eye movements across viewpoints. We have developed a new technique to compute and map the depth of collected eye movements on stimuli rendered from 3D mesh objects using a traditional corneal reflection eye tracker (SR Eyelink 1000). Having eye movements mapped into 3D space (and not on an image space) allowed us to compare fixations across viewpoints. Fixation sequences (scanpaths) were also studied across viewpoints using the ScanMatch method (Cristino et al 2010, Behavioural and Research Methods 42, 692–700), extended to work with 3D eye movements. In a set of experiments where participants were asked to perform a recognition task on either a set of objects or faces, we recorded their gaze while they performed the task. Participants either viewed the stimuli in 2D or using anaglyph glasses. The stimuli were shown from different viewpoints during the learning and testing phases. A high degree of gaze consistency was found across the different viewpoints, particularly between learning and testing phases. Scanpaths were also similar across viewpoints, suggesting not only that the gazed spatial locations are alike, but also their temporal order.

  13. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the cooperation. It is due to the lack of this kind of guarantee that cooperative schemes fail to last to the end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering the up-to-date state-of-the-art analyses of this important topic. It sets out to provide the theory, solution tec...

  14. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may largely be due to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who think independently and rationally, are two necessary factors for a short evacuation time.
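
    A minimal Python sketch of the snowdrift-game ingredient on a periodic lattice of cooperators and defectors; the payoff parameters and the coupling to evacuation dynamics are illustrative assumptions, not the paper's actual rules.

    ```python
    import numpy as np

    # Snowdrift payoffs (C = 1, D = 0) with benefit b and cost c.
    b, c = 1.0, 0.6
    payoff = {(1, 1): b - c / 2, (1, 0): b - c, (0, 1): b, (0, 0): 0.0}

    rng = np.random.default_rng(1)
    S = rng.integers(0, 2, size=(20, 20))      # random initial strategies

    def site_payoff(S, i, j):
        """Accumulated payoff of site (i, j) against its four periodic neighbors."""
        n, m = S.shape
        s = int(S[i, j])
        nbrs = (S[(i - 1) % n, j], S[(i + 1) % n, j],
                S[i, (j - 1) % m], S[i, (j + 1) % m])
        return sum(payoff[(s, int(t))] for t in nbrs)

    # Cooperator fraction and mean payoff before any imitation/evacuation step:
    P = [site_payoff(S, i, j) for i in range(20) for j in range(20)]
    print(S.mean(), np.mean(P))
    ```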

  15. Consistency of warm k-inflation

    CERN Document Server

    Peng, Zhi-Peng; Zhang, Xiao-Min; Zhu, Jian-Yang

    2016-01-01

    We extend the k-inflation, which is a type of kinetically driven inflationary model under the standard inflationary scenario, to a possible warm inflationary scenario. The dynamical equations of this warm k-inflation model are obtained. We rewrite the slow-roll parameters, which differ from those of the usual potential-driven inflationary models, and perform a linear stability analysis to give the proper slow-roll conditions in warm k-inflation. Two cases, a power-law kinetic function and an exponential kinetic function, are studied, when the dissipative coefficient $\Gamma=\Gamma_0$ and $\Gamma=\Gamma(\phi)$, respectively. A proper number of e-folds is obtained in both concrete cases of warm k-inflation. We find that a constant dissipative coefficient ($\Gamma=\Gamma_0$) is not a workable choice for these two cases, while the two cases with $\Gamma=\Gamma(\phi)$ yield self-consistent warm inflationary models.

  16. Compact difference approximation with consistent boundary condition

    Institute of Scientific and Technical Information of China (English)

    FU Dexun; MA Yanwen; LI Xinliang; LIU Mingyu

    2003-01-01

    For simulating multi-scale complex flow fields, all the physical quantities of interest must be simulated well. Given limited computer resources, it is preferable to use high-order accurate difference schemes. Because of their high accuracy and small stencils of grid points, computational fluid dynamics (CFD) researchers have recently paid more attention to compact schemes. For simulating complex flow fields, the treatment of boundary conditions at the far-field boundary points and near the far-field boundary points is very important. Based on the authors' experience and published results, some aspects of boundary condition treatment for the far-field boundary are presented, with emphasis on the treatment of boundary conditions for upwind compact schemes. The consistent treatment of boundary conditions at the near-boundary points is also discussed. Some numerical examples are given at the end of the paper; the results computed with the presented method are satisfactory.

  17. Reliability and Consistency of Surface Contamination Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Rouppert, F.; Rivoallan, A.; Largeron, C.

    2002-02-26

    Surface contamination evaluation is a tough problem since it is difficult to isolate the radiation emitted by the surface, especially in a highly irradiating atmosphere. In that case the only possibility is to evaluate smearable (removable) contamination, since ex-situ counting is possible. Unfortunately, according to our experience at CEA, these values are not consistent and thus not relevant. In this study, we show, using in-situ Fourier Transform Infra Red spectrometry on contaminated metal samples, that fixed contamination seems to be chemisorbed and removable contamination seems to be physisorbed. The distribution between fixed and removable contamination appears to be variable. Chemical equilibria and reversible ion exchange mechanisms are involved and are closely linked to environmental conditions such as humidity and temperature. Measurements of smearable contamination only give an indication of the state of these equilibria between fixed and removable contamination at the time, and in the environmental conditions, the measurements were made.

  18. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    López, Oliver

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  19. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  20. Trisomy 21 consistently activates the interferon response.

    Science.gov (United States)

    Sullivan, Kelly D; Lewis, Hannah C; Hill, Amanda A; Pandey, Ahwan; Jackson, Leisa P; Cabral, Joseph M; Smith, Keith P; Liggett, L Alexander; Gomez, Eliana B; Galbraith, Matthew D; DeGregori, James; Espinosa, Joaquín M

    2016-07-29

    Although it is clear that trisomy 21 causes Down syndrome, the molecular events acting downstream of the trisomy remain ill defined. Using complementary genomics analyses, we identified the interferon pathway as the major signaling cascade consistently activated by trisomy 21 in human cells. Transcriptome analysis revealed that trisomy 21 activates the interferon transcriptional response in fibroblast and lymphoblastoid cell lines, as well as circulating monocytes and T cells. Trisomy 21 cells show increased induction of interferon-stimulated genes and decreased expression of ribosomal proteins and translation factors. An shRNA screen determined that the interferon-activated kinases JAK1 and TYK2 suppress proliferation of trisomy 21 fibroblasts, and this defect is rescued by pharmacological JAK inhibition. Therefore, we propose that interferon activation, likely via increased gene dosage of the four interferon receptors encoded on chromosome 21, contributes to many of the clinical impacts of trisomy 21, and that interferon antagonists could have therapeutic benefits.

  1. On the consistent use of Constructed Observables

    CERN Document Server

    Trott, Michael

    2015-01-01

    We define "constructed observables" as relating experimental measurements to terms in a Lagrangian while simultaneously making assumptions about possible deviations from the Standard Model (SM), in other Lagrangian terms. Ensuring that the SM effective field theory (EFT) is constrained correctly when using constructed observables requires that their defining conditions are imposed on the EFT in a manner that is consistent with the equations of motion. Failing to do so can result in a "functionally redundant" operator basis and the wrong expectation as to how experimental quantities are related in the EFT. We illustrate the issues involved considering the $\\rm S$ parameter and the off shell triple gauge coupling (TGC) verticies. We show that the relationships between $h \\rightarrow V \\bar{f} \\, f$ decay and the off shell TGC verticies are subject to these subtleties, and how the connections between these observables vanish in the limit of strong bounds due to LEP. The challenge of using constructed observables...

  2. Consistently weighted measures for complex network topologies

    CERN Document Server

    Heitzig, Jobst; Zou, Yong; Marwan, Norbert; Kurths, Jürgen

    2011-01-01

    When network and graph theory are used in the study of complex systems, a typically finite set of nodes of the network under consideration is frequently either explicitly or implicitly considered representative of a much larger finite or infinite set of objects of interest. The selection procedure, e.g., formation of a subset or some kind of discretization or aggregation, typically results in individual nodes of the studied network representing quite differently sized parts of the domain of interest. This heterogeneity may induce substantial bias and artifacts in derived network statistics. To avoid this bias, we propose an axiomatic scheme based on the idea of "node splitting invariance" to derive consistently weighted variants of various commonly used statistical network measures. The practical relevance and applicability of our approach is demonstrated for a number of example networks from different fields of research, and is shown to be of fundamental importance in particular in the study of climate n...
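
    A minimal Python sketch of the node-splitting-invariance idea for one measure: weighting each node by the size of the domain it represents, and including the node itself, keeps the weighted degree unchanged when a node is split into two half-weight interconnected copies. The network and weights below are hypothetical.

    ```python
    import numpy as np

    # Adjacency of a toy 3-node network and hypothetical node weights giving
    # the size of the domain each node represents.
    A = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 0, 0]], dtype=float)
    w = np.array([2.0, 1.0, 1.0])

    # Node-splitting-invariant degree: k*_v = w_v + sum of neighbours' weights.
    # Splitting node 0 into two interconnected copies of weight 1.0 each
    # leaves every k* unchanged, unlike the plain degree.
    k_nsi = (A + np.eye(3)) @ w
    print(k_nsi)   # [4. 3. 3.]
    ```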

  3. Consistent 4-form fluxes for maximal supergravity

    CERN Document Server

    Godazgar, Hadi; Krueger, Olaf; Nicolai, Hermann

    2015-01-01

    We derive new ansaetze for the 4-form field strength of D=11 supergravity corresponding to uplifts of four-dimensional maximal gauged supergravity. In particular, the ansaetze directly yield the components of the 4-form field strength in terms of the scalars and vectors of the four-dimensional maximal gauged supergravity---in this way they provide an explicit uplift of all four-dimensional consistent truncations of D=11 supergravity. The new ansaetze provide a substantially simpler method for uplifting d=4 flows compared to the previously available method using the 3-form and 6-form potential ansaetze. The ansatz for the Freund-Rubin term allows us to conjecture a `master formula' for the latter in terms of the scalar potential of d=4 gauged supergravity and its first derivative. We also resolve a long-standing puzzle concerning the antisymmetry of the flux obtained from uplift ansaetze.

  4. Quantum cosmological consistency condition for inflation

    Energy Technology Data Exchange (ETDEWEB)

    Calcagni, Gianluca [Instituto de Estructura de la Materia, CSIC, calle Serrano 121, 28006 Madrid (Spain); Kiefer, Claus [Institut für Theoretische Physik, Universität zu Köln, Zülpicher Strasse 77, 50937 Köln (Germany); Steinwachs, Christian F., E-mail: calcagni@iem.cfmac.csic.es, E-mail: kiefer@thp.uni-koeln.de, E-mail: christian.steinwachs@physik.uni-freiburg.de [Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, Hermann-Herder-Str. 3, 79104 Freiburg (Germany)

    2014-10-01

    We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.

  5. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non-normative and constitutive approach to internal branding by proposing an enablement-oriented communication approach. The conceptual background presents a holistic model of the inside-out process of brand building. This model adopts a theoretical approach to internal branding as a non-normative practice that facilitates constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the non-normative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers...

  6. Quantum cosmological consistency condition for inflation

    CERN Document Server

    Calcagni, Gianluca; Steinwachs, Christian F

    2014-01-01

    We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.

  7. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
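
    The simplest example of such a thermodynamic constraint is the Wegscheider cycle condition: around any closed cycle of reversible reactions, the product of forward rate constants must equal the product of reverse rate constants. The sketch below is a minimal feasibility check under that assumption, with made-up rate constants; it is not the TCMC software referenced above.

        import math

        # Sketch: Wegscheider condition for a closed cycle A <-> B <-> C <-> A.
        kf = [2.0, 5.0, 0.1]  # hypothetical forward rate constants
        kr = [1.0, 2.5, 0.2]  # hypothetical reverse rate constants

        def wegscheider_log_ratio(kf, kr):
            # Thermodynamic consistency requires prod(kf) == prod(kr) around
            # the cycle, i.e. a log-ratio of zero.
            return math.log(math.prod(kf) / math.prod(kr))

        r = wegscheider_log_ratio(kf, kr)
        print("log cycle ratio:", r, "->", "feasible" if abs(r) < 1e-9 else "infeasible")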

  8. Design of Nut Picked Platform in Hill Country

    Directory of Open Access Journals (Sweden)

    Jian Zhou

    2014-10-01

    The aim of this work is to design an agricultural production facility. The hydraulic lifting platform consists of high-quality steel, hydraulic pumps, fluid cylinders, tubing, rail wheels, and other components. The maximum load of the hydraulic lifting table is 500 kg. The lifting table is composed of two parts: the mechanism body and the hydraulic system. Lifting and control are performed by the hydraulic system and hydraulic cylinders, and all components of the hydraulic system are driven by the hydraulic pump. The hydraulic pump and hydraulic cylinder are mainly standard components. The maximum force on the fluid cylinder is analyzed through the supporting hinge frame for raising goods, the torque of the lifting platform is calculated, and the hydraulic system is studied in detail. Comparison and checking against the relevant data show that the design satisfies the demands of fruit picking.
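
    The force analysis mentioned reduces to a simple sizing check. The sketch below is illustrative only: the bore diameter and safety factor are assumed values, and only the 500 kg rated load comes from the record.

        import math

        # Sketch: rough hydraulic cylinder sizing for the 500 kg lifting table.
        m = 500.0            # rated load, kg (from the record)
        g = 9.81             # gravitational acceleration, m/s^2
        d = 0.05             # assumed cylinder bore diameter, m
        safety_factor = 2.0  # assumed design margin

        force = m * g * safety_factor  # required lifting force, N
        area = math.pi * d**2 / 4.0    # piston area, m^2
        pressure = force / area        # required working pressure, Pa

        print(f"force = {force:.0f} N, pressure = {pressure / 1e6:.2f} MPa")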

  9. Design of Takagi-Sugeno fuzzy model based nonlinear sliding mode controller

    Institute of Scientific and Technical Information of China (English)

    Xu Yong; Chen Zengqiang; Yuan Zhuzhi

    2005-01-01

    A design method is presented for a Takagi-Sugeno (T-S) fuzzy model based nonlinear sliding mode controller. First, the closed-loop fuzzy system is divided into a set of dominant local linear systems according to operating sub-regions. In each sub-region the fuzzy system consists of a nominal linear system and a group of interacting systems. A controller composed of two parts is then designed: one part controls the nominal system, and the other controls the interacting systems using sliding mode theory. The proposed controller can improve the robustness and guarantee the tracking performance of the fuzzy system. Stability is guaranteed without finding a common positive definite matrix.

  10. Consistent lattice Boltzmann equations for phase transitions.

    Science.gov (United States)

    Siebert, D N; Philippi, P C; Mattila, K K

    2014-11-01

    Unlike conventional computational fluid dynamics methods, the lattice Boltzmann method (LBM) describes the dynamic behavior of fluids in a mesoscopic scale based on discrete forms of kinetic equations. In this scale, complex macroscopic phenomena like the formation and collapse of interfaces can be naturally described as related to source terms incorporated into the kinetic equations. In this context, a novel athermal lattice Boltzmann scheme for the simulation of phase transition is proposed. The continuous kinetic model obtained from the Liouville equation using the mean-field interaction force approach is shown to be consistent with diffuse interface model using the Helmholtz free energy. Density profiles, interface thickness, and surface tension are analytically derived for a plane liquid-vapor interface. A discrete form of the kinetic equation is then obtained by applying the quadrature method based on prescribed abscissas together with a third-order scheme for the discretization of the streaming or advection term in the Boltzmann equation. Spatial derivatives in the source terms are approximated with high-order schemes. The numerical validation of the method is performed by measuring the speed of sound as well as by retrieving the coexistence curve and the interface density profiles. The appearance of spurious currents near the interface is investigated. The simulations are performed with the equations of state of Van der Waals, Redlich-Kwong, Redlich-Kwong-Soave, Peng-Robinson, and Carnahan-Starling.
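
    Of the equations of state listed, Carnahan-Starling is representative of how an EOS enters such a scheme. The sketch below simply evaluates that EOS; the parameter values are assumed lattice-unit choices for illustration and are unrelated to the authors' code.

        # Sketch: Carnahan-Starling equation of state, one of the EOS named above.
        # a (attraction), b (co-volume) and R are assumed lattice-unit values.
        def pressure_cs(rho, T, a=1.0, b=4.0, R=1.0):
            eta = b * rho / 4.0  # packing fraction
            repulsive = rho * R * T * (1 + eta + eta**2 - eta**3) / (1 - eta)**3
            return repulsive - a * rho**2  # mean-field attractive term

        for rho in (0.05, 0.1, 0.2):
            print(rho, pressure_cs(rho, T=0.05))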

  11. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Computer services are normally assumed to work well all the time. This usually holds for crucial services like electronic banking, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that help predict the consistency of their behavior and the quality of harvesting, which is harder because of transient conditions, the number of services, and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many others occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others fail always, or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.

  12. Consistent quadrupole-octupole collective model

    Science.gov (United States)

    Dobrowolski, A.; Mazurek, K.; Góźdź, A.

    2016-11-01

    Within this work we present a consistent approach to quadrupole-octupole collective vibrations coupled with rotational motion. A realistic collective Hamiltonian with a variable mass-parameter tensor and a potential obtained through the macroscopic-microscopic Strutinsky-like method with the particle-number-projected BCS (Bardeen-Cooper-Schrieffer) approach, in the full vibrational and rotational nine-dimensional collective space, is diagonalized in the basis of projected harmonic oscillator eigensolutions. This orthogonal basis of zero-, one-, two-, and three-phonon oscillator-like functions in the vibrational part, coupled with the corresponding Wigner function, is in addition symmetrized with respect to the so-called symmetrization group appropriate to the collective space of the model. In the present model it is the D4 group acting in the body-fixed frame. This symmetrization procedure is applied in order to ensure the uniqueness of the Hamiltonian eigensolutions with respect to the laboratory coordinate system. The symmetrization is obtained using the projection onto the irreducible representation technique. The model generates the quadrupole ground-state spectrum as well as the lowest negative-parity spectrum in the 156Gd nucleus. The interband and intraband B(E1) and B(E2) reduced transition probabilities are also calculated within those bands and compared with recent experimental results for this nucleus. Such a collective approach is helpful in searching for fingerprints of possible high-rank symmetries (e.g., octahedral and tetrahedral) in nuclear collective bands.

  13. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  14. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Interferometer (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5% in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  15. Retrocausation, Consistency, and the Bilking Paradox

    Science.gov (United States)

    Dobyns, York H.

    2011-11-01

    Retrocausation seems to admit of time paradoxes in which events prevent themselves from occurring and thereby create a physical instance of the liar's paradox, an event which occurs iff it does not occur. The specific version in which a retrocausal event is used to trigger an intervention which prevents its own future cause is called the bilking paradox (the event is bilked of its cause). The analysis of Echeverria, Klinkhammer, and Thorne (EKT) suggests time paradoxes cannot arise even in the presence of retrocausation. Any self-contradictory event sequence will be replaced in reality by a closely related but noncontradictory sequence. The EKT analysis implies that attempts to create bilking must instead produce logically consistent sequences wherein the bilked event arises from alternative causes. Bilking a retrocausal information channel of limited reliability usually results only in failures of signaling. An exception applies when the bilking is conducted in response only to some of the signal values that can be carried on the channel. Theoretical analysis based on EKT predicts that, since some of the channel outcomes are not bilked, the channel is capable of transmitting data with its normal reliability, and the paradox-avoidance effects will instead suppress the outcomes that would lead to forbidden (bilked) transmissions. A recent parapsychological experiment by Bem displays a retrocausal information channel of sufficient reliability to test this theoretical model of physical reality's response to retrocausal effects. A modified version with partial bilking would provide a direct test of the generality of the EKT mechanism.

  16. Ciliate communities consistently associated with coral diseases

    Science.gov (United States)

    Sweet, M. J.; Séré, M. G.

    2016-07-01

    Incidences of coral disease are increasing. Most studies which focus on diseases in these organisms routinely assess variations in bacterial associates. However, other microorganism groups such as viruses, fungi and protozoa are only recently starting to receive attention. This study aimed at assessing the diversity of ciliates associated with coral diseases over a wide geographical range. Here we show that a wide variety of ciliates are associated with all nine coral diseases assessed. Many of these ciliates, such as Trochilia petrani and Glauconema trihymene, feed on the bacteria which are likely colonizing the bare skeleton exposed by the advancing disease lesion or the necrotic tissue itself. Others, such as Pseudokeronopsis and Licnophora macfarlandi, are common predators of other protozoans and will be attracted by the increase in other ciliate species at the lesion interface. However, a few ciliate species (namely Varistrombidium kielum, Philaster lucinda, Philaster guamense, a Euplotes sp., a Trachelotractus sp. and a Condylostoma sp.) appear to harbor symbiotic algae, potentially from the corals themselves, a result which may indicate that they play some role in the disease pathology at the very least. Although from this study alone we are not able to discern what roles any of these ciliates play in disease causation, the consistent presence of such communities at disease lesion interfaces warrants further investigation.

  17. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed, however the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipe work connecting the major items of equipment for the new hire, the engineering student and the veteran...

  18. Improving electrofishing catch consistency by standardizing power

    Science.gov (United States)

    Burkhardt, Randy W.; Gutreuter, Steve

    1995-01-01

    The electrical output of electrofishing equipment is commonly standardized by using either constant voltage or constant amperage. However, simplified circuit and wave theories of electricity suggest that standardization of the power (wattage) available for transfer from water to fish may be critical for effective standardization of electrofishing. Electrofishing with standardized power ensures that constant power is transferable to fish regardless of water conditions. The in situ performance of standardized power output is poorly known. We used data collected by the interagency Long Term Resource Monitoring Program (LTRMP) in the upper Mississippi River system to assess the effectiveness of standardizing power output. The data consisted of 278 electrofishing collections, comprising 9,282 fishes in eight species groups, obtained during 1990 from main channel border, backwater, and tailwater aquatic areas in four reaches of the upper Mississippi River and one reach of the Illinois River. Variation in power output explained an average of 14.9% of catch variance for night electrofishing and 12.1% for day electrofishing. Three patterns in catch per unit effort were observed for different species: increasing catch with increasing power, decreasing catch with increasing power, and no power-related pattern. Therefore, in addition to reducing catch variation, controlling power output may provide some capability to select particular species. The LTRMP adopted standardized power output beginning in 1991; standardized power output is adjusted for variation in water conductivity and water temperature by reference to a simple chart. Our data suggest that by standardizing electrofishing power output, the LTRMP has eliminated substantial amounts of catch variation at virtually no additional cost.
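
    The principle of holding power rather than voltage or amperage constant is easy to express. The sketch below is schematic, not the LTRMP adjustment chart: the inverse relation between effective load resistance and water conductivity, and the constant k, are modelling assumptions made for illustration.

        import math

        # Sketch: voltage needed to deliver constant electrofishing power as
        # ambient water conductivity varies. Assumes R ~ k / sigma.
        def required_voltage(target_power_w, sigma_us_cm, k=2.5e4):
            resistance = k / sigma_us_cm  # assumed load resistance, ohms
            return math.sqrt(target_power_w * resistance)  # from P = V^2 / R

        for sigma in (100.0, 400.0, 1600.0):
            print(f"{sigma:6.0f} uS/cm -> {required_voltage(3000.0, sigma):6.1f} V")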

  19. On Consistency of Operational Transformation Approach

    Directory of Open Access Journals (Sweden)

    Aurel Randolph

    2013-02-01

    The Operational Transformation (OT) approach, used in many collaborative editors, allows a group of users to concurrently update replicas of a shared object and exchange their updates in any order. The basic idea of this approach is to transform any received update operation before its execution on a replica of the object. This transformation aims to ensure the convergence of the different replicas of the object, even though the operations are executed in different orders. However, designing transformation functions that achieve convergence is a critical and challenging issue. Indeed, the transformation functions proposed in the literature have all been revealed incorrect. In this paper, we investigate the existence of transformation functions for a shared string altered by insert and delete operations. From the theoretical point of view, two properties, named TP1 and TP2, are necessary and sufficient to ensure convergence. Using a controller synthesis technique, we show that there are some transformation functions which satisfy only TP1 for the basic signatures of insert and delete operations. As a matter of fact, it is impossible to meet both properties TP1 and TP2 with these simple signatures.
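
    The flavour of such transformation functions is easiest to see for two concurrent inserts. The sketch below is a generic textbook-style insert-insert transform with a site-id tie-break, shown only to illustrate what TP1 asserts; it is not the controller-synthesis construction of the paper.

        # Sketch: transform insert op1 against a concurrent insert op2.
        # An operation is (position, text, site_id); site_id breaks ties.
        def transform_ins_ins(op1, op2):
            p1, t1, s1 = op1
            p2, t2, s2 = op2
            if p1 < p2 or (p1 == p2 and s1 < s2):
                return (p1, t1, s1)        # op1 unaffected by op2
            return (p1 + len(t2), t1, s1)  # shift past op2's inserted text

        def apply(doc, op):
            p, t, _ = op
            return doc[:p] + t + doc[p:]

        doc = "abc"
        o1, o2 = (1, "X", 1), (1, "Y", 2)
        # TP1: both execution orders must converge to the same string.
        left = apply(apply(doc, o1), transform_ins_ins(o2, o1))
        right = apply(apply(doc, o2), transform_ins_ins(o1, o2))
        assert left == right == "aXYbc"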

  20. Wetting of polymer liquids: Monte Carlo simulations and self-consistent field calculations

    CERN Document Server

    Müller, M

    2003-01-01

    Using Monte Carlo simulations and self-consistent field (SCF) theory we study the surface and interface properties of a coarse grained off-lattice model. In the simulations we employ the grand canonical ensemble together with a reweighting scheme in order to measure surface and interface free energies and discuss various methods for accurately locating the wetting transition. In the SCF theory, we use a partial enumeration scheme to incorporate single-chain properties on all length scales and use a weighted density functional for the excess free energy. The results of various forms of the density functional are compared quantitatively to the simulation results. For the theory to be accurate, it is important to decompose the free energy functional into a repulsive and an attractive part, with different approximations for the two parts. Measuring the effective interface potential for our coarse grained model we explore routes for controlling the equilibrium wetting properties. (i) Coating of the substrate by an...

  1. Design and construction of a VHGT-attached WDM-type triplex transceiver module using polymer PLC hybrid integration technology

    Science.gov (United States)

    Jerábek, Vitezslav; Hüttel, Ivan; Prajzler, Václav; Busek, K.; Seliger, P.

    2008-11-01

    We report on the design and construction of a bidirectional transceiver (TRx) module for the subscriber part of a passive optical network (PON) with fiber-to-the-home (FTTH) topology. The TRx module is based on epoxy novolak resin polymer planar lightwave circuit (PLC) hybrid integration technology with a volume holographic grating triplex (VHGT) filter, surface-illuminated photodetectors, and a spot-size-converted Fabry-Pérot laser diode in an SMD package. The hybrid PLC is composed of two parts: a polymer optical waveguide including the VHGT filter section, and an optoelectronic microwave section. Both parts are placed on a composite substrate.

  2. Structural Consistency: Enabling XML Keyword Search to Eliminate Spurious Results Consistently

    CERN Document Server

    Lee, Ki-Hoon; Han, Wook-Shin; Kim, Min-Soo

    2009-01-01

    XML keyword search is a user-friendly way to query XML data using only keywords. In XML keyword search, to achieve high precision without sacrificing recall, it is important to remove spurious results not intended by the user. Efforts to eliminate spurious results have enjoyed some success by using the concepts of LCA or its variants, SLCA and MLCA. However, existing methods still could find many spurious results. The fundamental cause for the occurrence of spurious results is that the existing methods try to eliminate spurious results locally without global examination of all the query results and, accordingly, some spurious results are not consistently eliminated. In this paper, we propose a novel keyword search method that removes spurious results consistently by exploiting the new concept of structural consistency.
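
    The LCA notion that these methods build on is compact to state in code. The sketch below computes the lowest common ancestor of two keyword-match nodes over parent pointers on a toy XML tree; it is a generic illustration, not the structural-consistency algorithm proposed in the paper.

        import xml.etree.ElementTree as ET

        # Sketch: lowest common ancestor (LCA) of two keyword matches in XML.
        doc = ET.fromstring(
            "<bib><book><title>XML</title><author>Kim</author></book>"
            "<book><title>DB</title><author>Lee</author></book></bib>")

        # Map every element to its parent (the root has no entry).
        parent = {child: p for p in doc.iter() for child in p}

        def ancestors(node):
            chain = [node]
            while node in parent:
                node = parent[node]
                chain.append(node)
            return chain

        def lca(a, b):
            seen = {id(n) for n in ancestors(a)}
            return next(n for n in ancestors(b) if id(n) in seen)

        title, author = doc[0][0], doc[0][1]  # both matches in the first book
        print(lca(title, author).tag)         # 'book', not the root 'bib'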

  3. Context-specific metabolic networks are consistent with experiments.

    Directory of Open Access Journals (Sweden)

    Scott A Becker

    2008-05-01

    Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data is available.
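
    The inconsistency score can be sketched compactly. In the toy snippet below, reactions whose expression falls below a threshold are penalized in proportion to the flux they carry; all names and numbers are invented, and the real GIMME algorithm solves an optimization over fluxes rather than scoring a fixed flux vector.

        # Sketch: GIMME-style inconsistency score for a fixed flux vector.
        expression = {"R1": 8.0, "R2": 2.0, "R3": 0.5}  # hypothetical data
        flux = {"R1": 10.0, "R2": 0.0, "R3": 4.0}       # hypothetical fluxes
        threshold = 5.0

        def inconsistency(expression, flux, threshold):
            # Below-threshold reactions are penalized by shortfall * |flux|.
            return sum(max(0.0, threshold - expression[r]) * abs(v)
                       for r, v in flux.items())

        print(inconsistency(expression, flux, threshold))  # 0 + 0 + 4.5*4 = 18.0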

  4. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  5. Methodological issues in examining measurement equivalence in patient reported outcomes measures: Methods overview to the two-part series, "Measurement equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) short forms"

    Directory of Open Access Journals (Sweden)

    Jeanne A. Teresi

    2016-03-01

    The purpose of this article is to introduce the methods used and challenges confronted by the authors of this two-part series of articles describing the results of analyses of measurement equivalence of the short form scales from the Patient Reported Outcomes Measurement Information System® (PROMIS®). Qualitative and quantitative approaches used to examine differential item functioning (DIF) are reviewed briefly. Qualitative methods focused on the generation of DIF hypotheses. The basic quantitative approaches used all rely on a latent variable model, and examine parameters either derived directly from item response theory (IRT) or from structural equation models (SEM). A key methods focus of these articles is to describe state-of-the-art approaches to the examination of measurement equivalence in eight domains: physical health, pain, fatigue, sleep, depression, anxiety, cognition, and social function. These articles represent the first time that DIF has been examined systematically in the PROMIS short form measures, particularly among ethnically diverse groups. This is also the first set of analyses to examine the performance of PROMIS short forms in patients with cancer. Latent variable model state-of-the-art methods for examining measurement equivalence are introduced briefly in this paper to orient readers to the approaches adopted in this set of papers. Several methodological challenges underlying DIF-free anchor item selection and model assumption violations are presented as a backdrop for the articles in this two-part series on measurement equivalence of PROMIS measures.

  6. DOE handbook: Design considerations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-04-01

    The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline.

  7. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation and, in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora.

  8. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation.

    Science.gov (United States)

    Lindell, Annukka K

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation and, in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora.
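
    The consistency coefficient reported above (α = 0.72) is, presumably, Cronbach's alpha computed over each participant's set of coded selfies. The sketch below applies the standard formula to made-up pose codings; the data and coding scheme are assumptions for illustration only.

        # Sketch: Cronbach's alpha over repeated pose codings per person
        # (-1 = left cheek, 0 = midline, 1 = right cheek; values invented).
        def cronbach_alpha(rows):
            k = len(rows[0])  # number of coded selfies per person
            def var(xs):
                m = sum(xs) / len(xs)
                return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
            item_vars = [var([r[i] for r in rows]) for i in range(k)]
            total_var = var([sum(r) for r in rows])
            return k / (k - 1) * (1 - sum(item_vars) / total_var)

        data = [[-1, -1, -1, 0, -1],
                [1, 1, 0, 1, 1],
                [-1, 0, -1, -1, -1]]
        print(round(cronbach_alpha(data), 2))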

  9. Damage-consistent hazard assessment - the revival of intensities

    Science.gov (United States)

    Klügel, Jens-Uwe

    2016-04-01

    The paper discusses in detail the possible use of intensities in seismic hazard assessment as well as the conversion of intensities into any required set of engineering parameters. This opens the space for a damage-consistent design of structures and, therefore, for a truly performance-based approach to earthquake engineering design.

  10. Preliminary design of a coffee harvester

    Directory of Open Access Journals (Sweden)

    Raphael Magalhães Gomes Moreira

    2016-10-01

    The design of an agricultural machine is a highly complex process due to interactions between the operator, machine, and environment. Mountain coffee plantations constitute an economic sector that requires huge investments in the development of agricultural machinery to improve the harvesting and post-harvesting processes and to overcome the scarcity of labor in the fields. The aim of this study was to develop a preliminary design for a virtual prototype of a coffee fruit harvester. A project methodology was applied and adapted for the development of the following steps: project planning, informational design, conceptual design, and preliminary design. The construction of a morphological matrix made it possible to obtain a list of different mechanisms with specific functions. Combinations of these mechanisms resulted in variants, which were weighted to attribute scores for each selected criterion. From each designated proposal, two variants with the best scores were selected, which permitted the preparation of the preliminary design of both variants. The archetype was divided into two parts, namely the hydraulically articulated arms and the harvesting system, which consisted of the vibration mechanism and the detachment mechanism. The proposed innovation involves the use of parallel rods fixed to a plane, rectangular metal sheet. In this step, dimensions including a maximum length of 4.7 m, a minimum length of 3.3 m, and a total height of 2.15 m were identified based on the functioning of the harvester in relation to the coupling point of the tractor.
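
    The variant-selection step described here is a weighted-scoring exercise. The sketch below ranks candidate variants by a weighted sum of criterion scores; the criteria, weights, and scores are invented for illustration and are not the study's actual matrix.

        # Sketch: weighted decision matrix for ranking design variants.
        weights = {"cost": 0.4, "reliability": 0.35, "ease_of_use": 0.25}
        variants = {
            "V1": {"cost": 4, "reliability": 3, "ease_of_use": 5},
            "V2": {"cost": 2, "reliability": 5, "ease_of_use": 4},
            "V3": {"cost": 5, "reliability": 2, "ease_of_use": 3},
        }

        def weighted_score(scores):
            return sum(weights[c] * s for c, s in scores.items())

        for v in sorted(variants, key=lambda v: weighted_score(variants[v]),
                        reverse=True):
            print(v, round(weighted_score(variants[v]), 2))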

  11. Sustaining biological welfare for our future through consistent science.

    Science.gov (United States)

    Shimomura, Yoshihiro; Katsuura, Tetsuo

    2013-01-15

    Physiological anthropology presently covers a very broad range of human knowledge and engineering technologies. This study reviews scientific inconsistencies within a variety of areas: sitting posture; negative air ions; oxygen inhalation; alpha brain waves induced by music and ultrasound; 1/f fluctuations; the evaluation of feelings using surface electroencephalography; Kansei; universal design; and anti-stress issues. We found that the inconsistencies within these areas indicate the importance of integrative thinking and the need to maintain the perspective on the biological benefit to humanity. Analytical science divides human physiological functions into discrete details, although individuals comprise a unified collection of whole-body functions. Such disparate considerations contribute to the misunderstanding of physiological functions and the misevaluation of positive and negative values for humankind. Research related to human health will, in future, depend on the concept of maintaining physiological functions based on consistent science and on sustaining human health to maintain biological welfare in future generations.

  12. Sustaining biological welfare for our future through consistent science

    Directory of Open Access Journals (Sweden)

    Shimomura Yoshihiro

    2013-01-01

    Physiological anthropology presently covers a very broad range of human knowledge and engineering technologies. This study reviews scientific inconsistencies within a variety of areas: sitting posture; negative air ions; oxygen inhalation; alpha brain waves induced by music and ultrasound; 1/f fluctuations; the evaluation of feelings using surface electroencephalography; Kansei; universal design; and anti-stress issues. We found that the inconsistencies within these areas indicate the importance of integrative thinking and the need to maintain the perspective on the biological benefit to humanity. Analytical science divides human physiological functions into discrete details, although individuals comprise a unified collection of whole-body functions. Such disparate considerations contribute to the misunderstanding of physiological functions and the misevaluation of positive and negative values for humankind. Research related to human health will, in future, depend on the concept of maintaining physiological functions based on consistent science and on sustaining human health to maintain biological welfare in future generations.

  13. Consistent Prediction of Properties of Systems with Lipids

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Equilibria between vapour, liquid and/or solid phases, pure component properties and also the mixture-phase properties are necessary for synthesis, design and analysis of different unit operations found in the production of edible oils, fats and biodiesel. A systematic numerical analysis is employed to determine the needs of phase equilibria and related properties in processes such as Deodorization, Dry Fractionation, Solvent Extraction and Biodiesel Production. Another important use for the data and analysis is in property model development for correct and consistent property prediction. Lipids are found in almost all mixtures involving edible oils, fats and biodiesel. They are also being extracted for use in the pharma industry. A database for pure components (lipids) present in these processes, and for mixture properties, has been developed and made available for different applications...

  14. Self-Consistent Study of Conjugated Aromatic Molecular Transistors

    Science.gov (United States)

    Wang, Jing; Liang, Yun-Ye; Chen, Hao; Wang, Peng; Note, R.; Mizuseki, H.; Kawazoe, Y.

    2010-06-01

    We study the current through conjugated aromatic molecular transistors modulated by a transverse field. The self-consistent calculation is realized with density functional theory through the standard quantum chemistry software Gaussian03 and the non-equilibrium Green's function formalism. The calculated I - V curves controlled by the transverse field present the characteristics of different organic molecular transistors, whose transverse field effect is improved by substitutions of nitrogen atoms or fluorine atoms. On the other hand, asymmetry of the molecular configuration with respect to the axis connecting the two sulfur atoms favors transverse field modulation. Suitably designed conjugated aromatic molecular transistors possess different I - V characteristics, some of which are similar to those of metal-oxide-semiconductor field-effect transistors (MOSFET). Some of the calculated molecular devices may work as elements in graphene electronics. Our results present the richness and flexibility of molecular transistors, which describe the colorful prospect of next generation devices.

  15. UFMG Sydenham's chorea rating scale (USCRS): reliability and consistency.

    Science.gov (United States)

    Teixeira, Antônio Lúcio; Maia, Débora P; Cardoso, Francisco

    2005-05-01

    Despite the renewed interest in Sydenham's chorea (SC) in recent years, there were no valid and reliable scales to rate the several signs and symptoms of patients with SC and related disorders. The Universidade Federal de Minas Gerais (UFMG) Sydenham's Chorea Rating Scale (USCRS) was designed to provide a detailed quantitative description of the performance of activities of daily living, behavioral abnormalities, and motor function of subjects with SC. The scale comprises 27 items and each one is scored from 0 (no symptom or sign) to 4 (severe disability or finding). Data from 84 subjects, aged 4.9 to 33.6 years, support the interrater reliability and internal consistency of the scale. The USCRS is a promising instrument for rating the clinical features of SC as well as their functional impact in children and adults.

  16. Planck 2013 results. XXXI. Consistency of the Planck data

    CERN Document Server

    Ade, P A R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A.J; Barreiro, R.B; Battaner, E; Benabed, K; Benoit-Levy, A; Bernard, J.P; Bersanelli, M; Bielewicz, P; Bond, J.R; Borrill, J; Bouchet, F.R; Burigana, C; Cardoso, J.F; Catalano, A; Challinor, A; Chamballu, A; Chiang, H.C; Christensen, P.R; Clements, D.L; Colombi, S; Colombo, L.P.L; Couchot, F; Coulais, A; Crill, B.P; Curto, A; Cuttaia, F; Danese, L; Davies, R.D; Davis, R.J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Desert, F.X; Dickinson, C; Diego, J.M; Dole, H; Donzelli, S; Dore, O; Douspis, M; Dupac, X; Ensslin, T.A; Eriksen, H.K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Gonzalez-Nuevo, J; Gorski, K.M.; Gratton, S.; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F.K; Hanson, D; Harrison, D; Henrot-Versille, S; Herranz, D; Hildebrandt, S.R; Hivon, E; Hobson, M; Holmes, W.A.; Hornstrup, A; Hovest, W.; Huffenberger, K.M; Jaffe, T.R; Jaffe, A.H; Jones, W.C; Keihanen, E; Keskitalo, R; Knoche, J; Kunz, M; Kurki-Suonio, H; Lagache, G; Lahteenmaki, A; Lamarre, J.M; Lasenby, A; Lawrence, C.R; Leonardi, R; Leon-Tavares, J; Lesgourgues, J; Liguori, M; Lilje, P.B; Linden-Vornle, M; Lopez-Caniego, M; Lubin, P.M; Macias-Perez, J.F; Maino, D; Mandolesi, N; Maris, M; Martin, P.G; Martinez-Gonzalez, E; Masi, S; Matarrese, S; Mazzotta, P; Meinhold, P.R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschenes, M.A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Norgaard-Nielsen, H.U; Noviello, F; Novikov, D; Novikov, I; Oxborrow, C.A; Pagano, L; Pajot, F; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Pearson, T.J; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Pratt, G.W; Prunet, S; Puget, J.L; Rachen, J.P; Reinecke, M; Remazeilles, M; Renault, C; Ricciardi, S.; Ristorcelli, I; Rocha, G.; Roudier, G; Rubino-Martin, J.A; Rusholme, B; Sandri, M; Scott, D; Stolyarov, V; Sudiwala, R; Sutton, D; Suur-Uski, A.S; Sygnet, J.F; Tauber, J.A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L.A; Wandelt, B.D; Wehus, I K; White, S D M; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse Galactic emission and for strong unresolved sources. Difference maps covering angular scales from 8°...

  17. Consistency of Rhythmic Vitality between Cantonese Opera and Gardens: the Design Proposal of the Art Museum of Cantonese Opera (粤韵园音，气韵相合——粤剧艺术博物馆创作理念)

    Institute of Scientific and Technical Information of China (English)

    郭谦; 李晓雪

    2015-01-01

    The Art Museum of Cantonese Opera takes the important concept of Chinese traditional art, "rhythmic vitality" (QiYun), as its core design pursuit. The design proposal draws on the essence of the art of Cantonese opera and of the Lingnan garden, pursuing consistency of spiritual meaning and architectural expression between the two and translating the building volume into garden space. Its gardening, architecture, hill-making and water design all take "rhythmic vitality" as a guideline, achieving a contemporary dialogue across time and space between the two major cultural symbols of Lingnan.

  18. Online Tonsillectomy Resources: Are Parents Getting Consistent and Readable Recommendations?

    Science.gov (United States)

    Wozney, Lori; Chorney, Jill; Huguet, Anna; Song, Jin Soo; Boss, Emily F; Hong, Paul

    2017-05-01

    Objective Parents frequently refer to information on the Internet to confirm or broaden their understanding of surgical procedures and to research postoperative care practices. Our study evaluated the readability, comprehensiveness, and consistency of online recommendations directed at parents of children undergoing tonsillectomy. Study Design A cross-sectional study design was employed. Setting Thirty English-language Internet websites. Subjects and Methods Three validated measures of readability were applied, and content analysis was employed to evaluate the comprehensiveness of information in domains of perioperative education. Frequency effect sizes and percentile ranks were calculated to measure the dispersion of recommendations across sites. Results The mean readability level of all sites was above a grade 10 level, with fewer than half of the sites (n = 14, 47%) scoring at or below the eighth-grade level. Provided information was often incomplete, with a noted lack of psychosocial support and skills-training recommendations. Content analysis showed 67 unique recommendations spanning the full perioperative period. Most recommendations had low consensus, being reported in 5 or fewer sites (small frequency effect sizes), and sites could do more to make information easier to read.

  19. Does Maximizing Information at the Cut Score Always Maximize Classification Accuracy and Consistency?

    Science.gov (United States)

    Wyse, Adam E.; Babcock, Ben

    2016-01-01

    A common suggestion made in the psychometric literature for fixed-length classification tests is that one should design tests so that they have maximum information at the cut score. Designing tests in this way is believed to maximize the classification accuracy and consistency of the assessment. This article uses simulated examples to illustrate…
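
    For context, "information at the cut score" refers to Fisher information under an item response theory model. The sketch below evaluates test information at a cut score using standard two-parameter logistic (2PL) formulas; the item parameters are made up, and the snippet is independent of the article's simulations.

        import math

        # Sketch: test information at a cut score under a 2PL IRT model.
        # Items are (a, b) = (discrimination, difficulty); values invented.
        items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.3), (1.0, 0.6)]

        def p_correct(theta, a, b):
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))

        def test_information(theta):
            # 2PL item information is a^2 * P * (1 - P); sum over items.
            return sum(a * a * p_correct(theta, a, b) * (1 - p_correct(theta, a, b))
                       for a, b in items)

        cut_score = 0.2
        print(round(test_information(cut_score), 3))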

  20. Intelligent Stability Design of Large Underground Hydraulic Caverns: Chinese Method and Practice

    Directory of Open Access Journals (Sweden)

    Xiating Feng

    2011-10-01

    The global energy shortage has revived interest in hydroelectric power, but extreme geological conditions always pose challenges to the construction of hydroelectric power stations with large underground caverns. To address the problem of safe design of large underground caverns, a Chinese-style intelligent stability design method, representing recent developments in Chinese techniques for the construction of underground hydropower systems, is presented. The basic aim of this method is to help designers improve the stability and design efficiency of large underground hydropower cavern groups. Its flowchart consists of two parts: initial design, with an ordinal structure, and dynamic design, with a closed-loop structure. In each part of the flowchart, analysis techniques, analysis content and design parameters for cavern stability are defined. The method thus provides designers with a bridge from the basic information of a given project to reasonable design parameters for managing the stability of hydraulic cavern groups. Application to two large underground caverns shows that it is a scientific and economical method for safely constructing underground hydraulic caverns.

  1. The design of mobile robot control system for the aged and the disabled

    Science.gov (United States)

    Qiang, Wang; Lei, Shi; Xiang, Gao; Jin, Zhang

    2017-01-01

    This paper presents the design of a mobile robot control system for the aged and the disabled, which consists of two main parts: a human-computer interaction module and a drive control module. Data between the two parts are transferred via a universal asynchronous receiver/transmitter (UART). In the former part, the speed and direction commands for the mobile robot are obtained from a Hall-effect joystick. In the latter part, an electronic differential algorithm is developed to implement the robot's motion by driving the two wheel motors. In order to improve ride comfort when the speed or direction changes, the least squares algorithm is used to optimize the speed characteristic curves of the two motors. Experimental results have verified the effectiveness of the designed system.
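
    An electronic differential of the kind described reduces to differential-drive kinematics. The sketch below turns a commanded speed and yaw rate into left and right wheel speeds; the track width and the joystick-to-command mapping are assumptions, not the authors' algorithm.

        # Sketch: electronic differential for a two-wheel-drive mobile robot.
        TRACK_WIDTH = 0.55  # assumed distance between drive wheels, m

        def wheel_speeds(v, omega, track=TRACK_WIDTH):
            # v: forward speed (m/s); omega: yaw rate (rad/s, positive = left).
            v_left = v - omega * track / 2.0
            v_right = v + omega * track / 2.0
            return v_left, v_right

        print(wheel_speeds(0.8, 0.0))   # straight: both wheels at 0.8 m/s
        print(wheel_speeds(0.8, -0.5))  # right turn: left wheel runs faster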

  2. Longitudinal tDCS: Consistency across Working Memory Training Studies

    Directory of Open Access Journals (Sweden)

    Marian E. Berryhill

    2017-04-01

    There is great interest in enhancing and maintaining cognitive function. In recent years, advances in noninvasive brain stimulation devices, such as transcranial direct current stimulation (tDCS), have targeted working memory in particular. Despite controversy surrounding outcomes of single-session studies, a growing field of working memory training studies incorporates multiple sessions of tDCS. It is useful to take stock of these findings because of the diversity of paradigms employed and outcomes observed across research groups. This will be important in assessing cognitive training programs paired with stimulation techniques and identifying the more useful and less effective approaches. Here, we treat the tDCS + working memory training field as a case example, but also survey training benefits from other neuromodulatory techniques (e.g., tRNS, tACS). There are challenges associated with the broad parameter space, including individual differences, stimulation intensity, duration, montage, session number, session spacing, training task selection, timing of follow-up testing, and near and far transfer tasks. In summary, although the field of assisted cognitive training is young, some design choices are more favorable than others. By way of heuristic, the current evidence supports including more training/tDCS sessions (5+), applying anodal tDCS targeting prefrontal regions, and including follow-up testing on trained and transfer tasks after a period of no contact. What remains unclear, but important for future translational value, is continuing work to pinpoint optimal values for the tDCS parameters on a per cognitive task basis. Importantly, the emerging literature shows notable consistency in the application of tDCS for WM across various participant populations compared to single-session experimental designs.

  3. An Extended Model Driven Framework for End-to-End Consistent Model Transformation

    Directory of Open Access Journals (Sweden)

    Mr. G. Ramesh

    2016-08-01

    Model Driven Development (MDD) enables quick transformation from models to corresponding systems. The forward engineering features of modelling tools can help in generating source code from models. To build a robust system it is important to check consistency within the design models, and between the design model and the transformed implementation. Our framework, named Extensible Real Time Software Design Inconsistency Checker (XRTSDIC) and proposed in our previous papers, supports consistency checking in design models. This paper focuses on automatic model transformation. An algorithm and transformation rules for model transformation from UML class diagrams to ERDs and SQL are proposed. Model transformation brings many advantages, such as reduced development cost, improved quality, enhanced productivity, and increased customer satisfaction. The proposed framework has been enhanced to ensure that transformed implementations conform to their model counterparts, besides checking end-to-end consistency.
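
    A transformation rule of the kind proposed can be illustrated directly. The sketch below maps a UML-style class description to a SQL CREATE TABLE statement; the input format and type map are invented for illustration and are not the XRTSDIC rule set.

        # Sketch: toy model-to-text transformation from a UML-ish class to SQL.
        TYPE_MAP = {"String": "VARCHAR(255)", "Integer": "INT", "Date": "DATE"}

        def class_to_sql(name, attributes, key):
            cols = [f"  {attr} {TYPE_MAP[t]}" for attr, t in attributes]
            cols.append(f"  PRIMARY KEY ({key})")
            return f"CREATE TABLE {name} (\n" + ",\n".join(cols) + "\n);"

        print(class_to_sql(
            "Customer",
            [("customer_id", "Integer"), ("name", "String"), ("since", "Date")],
            key="customer_id"))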

  4. The need for consistent criteria for identifying malnutrition.

    Science.gov (United States)

    Hoffer, L John

    2009-01-01

    The lack of consistent criteria for diagnosing malnutrition and protein-energy malnutrition (PEM) creates problems in educating medical students and physicians, setting the parameters for observational and controlled clinical trials, and formulating clinical guidelines. There is no validated formal definition of malnutrition (or PEM), and the tools that have been developed to screen for it, or diagnose it, vary in their agreement. I make the following suggestions. First, avoid unqualified use of the term 'malnutrition', as it is ambiguous. Second, carefully distinguish between screening and diagnosis, which have different aims and implications. Third, consider the notion that in medicine the diagnosis of PEM is reached by 'narrative-interpretive' reasoning, which regards the disease as a pathophysiological entity in a specific clinical context. I recommend that the concept of PEM as a disease (not a score) be imbedded in teaching and the practice of medicine, and in the design of clinical trials and the setting of guidelines. Fourth, disagreements in screening-derived risk scores and uncertainty in diagnosis are difficult to avoid, but only in the grey zone. It would be prudent, at least until the greater medical world considers the nutritional paradigm plausible enough to invest in it, to enroll only patients who have unambiguously diagnosed PEM in prospective trials with hard clinical endpoints. Copyright (c) 2009 S. Karger AG, Basel.

  5. Improving risk assessment by defining consistent and reliable system scenarios

    Directory of Open Access Journals (Sweden)

    B. Mazzorana

    2009-02-01

    Full Text Available During the entire procedure of risk assessment for hydrologic hazards, the selection of consistent and reliable scenarios, constructed in a strictly systematic way, is fundamental for the quality and reproducibility of the results. However, subjective assumptions on relevant impact variables, such as sediment transport intensity on the system loading side and weak point response mechanisms, repeatedly cause biases in the results, and consequently affect transparency and required quality standards. Furthermore, the system response of mitigation measures to extreme event loadings represents another key variable in hazard assessment, as does integral risk management including intervention planning. Formative Scenario Analysis, as a supplement to conventional risk assessment methods, is a technique for constructing well-defined sets of assumptions to gain insight into a specific case and the potential system behaviour. By means of two case studies, carried out (1) to analyse sediment transport dynamics in a torrent section equipped with control measures, and (2) to identify hazards induced by woody debris transport at hydraulic weak points, the applicability of the Formative Scenario Analysis technique is presented. It is argued that during scenario planning in general, and with respect to integral risk management in particular, Formative Scenario Analysis allows for the development of reliable and reproducible scenarios in order to design more specifically an application framework for the sustainable assessment of natural hazard impacts. The overall aim is to optimise the hazard mapping and zoning procedure by methodologically integrating quantitative and qualitative knowledge.
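
    A common way to operationalize Formative Scenario Analysis is to rate the pairwise consistency of impact-variable levels and retain the level combinations with the highest overall consistency. The Python sketch below illustrates that selection step only; the variables, levels and ratings are invented placeholders, not data from the two case studies.

        from itertools import product

        # Hypothetical impact variables and levels (placeholders).
        levels = {"sediment_load": ["low", "high"],
                  "woody_debris": ["absent", "present"],
                  "weak_point_response": ["stable", "failure"]}

        # Expert pairwise consistency ratings, 1 (inconsistent) to 5 (fully
        # consistent); unrated pairs default to a neutral 3.
        ratings = {(("sediment_load", "high"), ("weak_point_response", "failure")): 5,
                   (("sediment_load", "low"), ("weak_point_response", "failure")): 1,
                   (("woody_debris", "present"), ("weak_point_response", "failure")): 4}

        def pair_rating(a, b):
            return ratings.get((a, b), ratings.get((b, a), 3))

        def scenario_consistency(scenario):
            """Additive consistency score over all pairs of variable levels."""
            items = list(scenario.items())
            return sum(pair_rating(items[i], items[j])
                       for i in range(len(items)) for j in range(i + 1, len(items)))

        scenarios = [dict(zip(levels, combo)) for combo in product(*levels.values())]
        for s in sorted(scenarios, key=scenario_consistency, reverse=True)[:3]:
            print(scenario_consistency(s), s)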

  6. Design of Annular Linear Induction Pump for High Temperature Liquid Lead Transportation

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Jae Sik; Kim, Hee Reyoung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2014-05-15

    The EM (electromagnetic) pump is divided into two parts: the primary, consisting of the electromagnetic core and exciting coils, and the secondary, the liquid lead flow. The main geometrical variables of the pump include core length, inner diameter and flow gap, while the electromagnetic ones cover pole pitch, turns of coil, number of pole pairs, input current and input frequency. The characteristics of the design variables are analyzed by the electrical equivalent circuit method, taking into account hydraulic head loss in the narrow annular channel of the ALIP. A design program, written in MATLAB, was developed to derive the pump design variables from the input requirements of flow rate, developed pressure and operating temperature. The analysis of the ALIP design for high-temperature liquid lead transportation was carried out to produce this MATLAB-based design program. With the program, the geometrical relationships between components need not be worked out by hand during detailed design, because the code calculates them automatically, and the outputs of a candidate pump design can be predicted easily before manufacturing. Running the code also allows the change in outputs caused by changing pump factors to be observed and analyzed, which will be helpful for research on optimizing pump performance.
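
    The equivalent-circuit equations themselves are not given in the abstract, but the overall shape of such a design sweep can be sketched as follows: compute the synchronous velocity of the travelling field from pole pitch and supply frequency, derive the slip at the duty-point fluid velocity, and evaluate a pressure model for candidate geometries. The pressure expression below is a hypothetical placeholder for the paper's equivalent-circuit calculation, and all numbers are invented.

        def synchronous_velocity(pole_pitch_m, freq_hz):
            # Travelling-field velocity of a linear induction machine: v_s = 2*tau*f.
            return 2.0 * pole_pitch_m * freq_hz

        def developed_pressure(v_fluid, pole_pitch_m, freq_hz, k_pump):
            """Placeholder model: pressure rises with slip; k_pump lumps the
            electromagnetic terms the equivalent-circuit analysis would supply."""
            v_s = synchronous_velocity(pole_pitch_m, freq_hz)
            slip = (v_s - v_fluid) / v_s
            return k_pump * slip * v_s**2  # hypothetical scaling, not the paper's formula

        # Sweep candidate pole pitches for an invented duty point.
        v_fluid, target_dp, k_pump = 1.0, 2.0e5, 4.0e4   # m/s, Pa, arbitrary constant
        for tau in (0.05, 0.10, 0.15, 0.20):             # pole pitch in metres
            dp = developed_pressure(v_fluid, tau, 60.0, k_pump)
            print(f"tau={tau:.2f} m  dp={dp:.3e} Pa  {'meets' if dp >= target_dp else 'misses'} target")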

  7. Characterization of consistent triggers of migraine with aura

    DEFF Research Database (Denmark)

    Hauge, Anne Werner; Kirchmann, Malene; Olesen, Jes

    2011-01-01

    The aim of the present study was to characterize perceived consistent triggers of migraine with aura (MA).

  9. The utility of theory of planned behavior in predicting consistent ...

    African Journals Online (AJOL)

    admin

    outcomes of the behavior and the evaluations of these outcomes (behavioral beliefs) ... belief towards consistent condom use and motivation for compliance with .... consistency of the items used before constructing a scale. Results. All of the ...

  10. Generalized contexts and consistent histories in quantum mechanics

    Science.gov (United States)

    Losada, Marcelo; Laura, Roberto

    2014-05-01

    We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times.

  11. Study on consistent query answering in inconsistent databases

    Institute of Scientific and Technical Information of China (English)

    XIE Dong; YANG Luming

    2007-01-01

    Consistent query answering is an approach to retrieving consistent answers over databases that might be inconsistent with respect to some given integrity constraints. The approach is based on the concept of a repair. This paper surveys several recent lines of research on obtaining consistent information from inconsistent databases, such as the underlying semantic model, a number of approaches to computing consistent query answers, and the computational complexity of this problem. Furthermore, the work outlines potential research directions in this area.
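
    The repair-based semantics is easy to illustrate: a repair is a maximal subset of the database satisfying the constraints, and a consistent answer is one returned on every repair. A toy Python sketch for a single key constraint (relation and queries invented for illustration):

        from itertools import product

        # Relation emp(name, dept) with key {name}; two conflicting tuples for "bob".
        emp = [("alice", "hr"), ("bob", "it"), ("bob", "sales")]

        def repairs(rel):
            """All repairs w.r.t. the key on the first attribute: pick exactly one
            tuple per key value (each repair is a maximal consistent subset)."""
            groups = {}
            for t in rel:
                groups.setdefault(t[0], []).append(t)
            return [set(choice) for choice in product(*groups.values())]

        def consistent_answers(query, rel):
            """Answers returned by the query in every repair."""
            results = [query(r) for r in repairs(rel)]
            return set.intersection(*results)

        q1 = lambda r: {name for name, dept in r}           # names of all employees
        print(consistent_answers(q1, emp))                  # {'alice', 'bob'}
        q2 = lambda r: {name for name, dept in r if dept == "it"}
        print(consistent_answers(q2, emp))                  # set(): 'bob' is not certain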

  12. ECO DESIGN IN DESIGN PROCESS

    Directory of Open Access Journals (Sweden)

    PRALEA Jeni

    2014-05-01

    Full Text Available Eco-design is a new domain, required by new trends and existing concerns worldwide, generated by the necessity of adopting new design principles. These principles require the designer to provide a friendly relationship between the created concept, the environment and consumption. This "friendly" relationship should hold both at present and in the future, generating new opportunities for the product, its components, or the materials from which it was made. Awareness by the designer of the importance of this new trend permits the establishment of concepts whose objective is protecting present values and securing the legacy of future generations. Eco-design, through its principles, is involved in the design process from the early stage of product design. The priority objective of designers will be to reduce negative effects on the environment through the entire life cycle of the product and after it is taken out of use. The main aspects of eco-design to consider are extending product exploitation, making better use of materials, and reducing the emission of waste. The design process in the "eco" domain must start by selecting the function of the concept, the materials and the technological processes, determining the macro- and micro-geometric shape of the product through an analysis that involves optimizing and streamlining the product. This paper presents the design process of a cross-sports footwear concept, built on the basis of the principles of eco-design.

  13. CONSISTENCY OF LS ESTIMATOR IN SIMPLE LINEAR EV REGRESSION MODELS

    Institute of Scientific and Technical Information of China (English)

    Liu Jixue; Chen Xiru

    2005-01-01

    Consistency of the LS estimate in the simple linear EV model is studied. It is shown that under some common assumptions on the model, weak and strong consistency of the estimate are equivalent, but this is not the case for quadratic-mean consistency.

  14. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This problem turns out to be NP-complete.

  15. Non-Numeric Intrajudge Consistency Feedback in an Angoff Procedure

    Science.gov (United States)

    Harrison, George M.

    2015-01-01

    The credibility of standard-setting cut scores depends in part on two sources of consistency evidence: intrajudge and interjudge consistency. Although intrajudge consistency feedback has often been provided to Angoff judges in practice, more evidence is needed to determine whether it achieves its intended effect. In this randomized experiment with…

  16. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-11-01

    Full Text Available Abstract. Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results: We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Conclusions: Our approach provides an attractive statistical methodology for thermodynamically consistent estimation of the rate constants of biochemical reaction systems.
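
    One elementary example of the thermodynamic constraints such a prior must respect is the Wegscheider cyclic condition: around any closed reaction cycle, the product of forward rate constants must equal the product of reverse rate constants. The check below is a minimal sketch of that single condition, with invented rate constants; it is not the paper's Bayesian machinery.

        def wegscheider_consistent(cycle, tol=1e-9):
            """cycle: list of (k_forward, k_reverse) pairs for reactions traversed
            around a closed loop. Detailed balance requires prod(kf) == prod(kr)."""
            prod_f = prod_r = 1.0
            for kf, kr in cycle:
                prod_f *= kf
                prod_r *= kr
            return abs(prod_f / prod_r - 1.0) < tol

        # Hypothetical 3-reaction cycle A<->B<->C<->A (rate constants invented).
        print(wegscheider_consistent([(2.0, 1.0), (3.0, 4.0), (1.0, 1.5)]))  # True: 6 == 6
        print(wegscheider_consistent([(2.0, 1.0), (3.0, 4.0), (1.0, 1.0)]))  # False: 6 != 4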

  17. On consistency of the weighted arithmetical mean complex judgement matrix

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The weighted arithmetical mean complex judgement matrix (WAMCJM) is the most common method for aggregating group opinions, but it has a shortcoming: the WAMCJM of perfectly consistent judgement matrices given by experts cannot guarantee its own perfect consistency. An upper bound on the WAMCJM's consistency is presented. Simultaneously, a compatibility index for judging the aggregating extent of group opinions is also introduced. The WAMCJM is proved to be of acceptable consistency provided the compatibilities of all judgement matrices given by experts are smaller than the threshold value of acceptable consistency. These conclusions are important to group decision making.
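
    The abstract does not spell out the compatibility index, so the sketch below pairs the weighted arithmetic mean aggregation with one simple stand-in measure, the mean absolute element-wise deviation between an expert's matrix and the group mean; the matrices, weights and threshold are invented.

        import numpy as np

        def weighted_mean_matrix(matrices, weights):
            """WAMCJM: element-wise weighted arithmetic mean of judgement matrices."""
            return sum(w * m for w, m in zip(weights, matrices))

        def compatibility(a, b):
            """Simple compatibility stand-in: mean absolute deviation of entries
            (the paper's exact index is not given in the abstract)."""
            return float(np.mean(np.abs(a - b)))

        m1 = np.array([[0.5, 0.7], [0.3, 0.5]])
        m2 = np.array([[0.5, 0.6], [0.4, 0.5]])
        group = weighted_mean_matrix([m1, m2], [0.5, 0.5])
        threshold = 0.1  # acceptable-consistency threshold, application-specific
        for i, m in enumerate([m1, m2], 1):
            c = compatibility(m, group)
            print(f"expert {i}: compatibility {c:.3f}", "ok" if c <= threshold else "reject")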

  18. On multidimensional consistent systems of asymmetric quad-equations

    CERN Document Server

    Boll, Raphael

    2012-01-01

    Multidimensional consistency is becoming more and more important in the theory of discrete integrable systems. Recently, we gave a classification of all 3D consistent 6-tuples of equations with the tetrahedron property, in which several novel asymmetric systems were found. In the present paper we discuss higher-dimensional consistency for the 3D consistent systems arising from this classification. In addition, we give a classification of certain 4D consistent systems of quad-equations. The results of this paper allow for a proof of Bianchi permutability, among other applications.

  19. Overcoming complexities for consistent, continental-scale flood mapping

    Science.gov (United States)

    Smith, Helen; Zaidman, Maxine; Davison, Charlotte

    2013-04-01

    The EU Floods Directive requires all member states to produce flood hazard maps by 2013. Although flood mapping practices are well developed in Europe, there are huge variations in the scale and resolution of the maps between individual countries. Since extreme flood events are rarely confined to a single country, this is problematic, particularly for the re/insurance industry, whose exposures often extend beyond country boundaries. Here, we discuss the challenges of large-scale hydrological and hydraulic modelling, using our experience of developing a 12-country model and set of maps to illustrate how consistent, high-resolution river flood maps across Europe can be produced. The main challenges addressed include: data acquisition; manipulating the vast quantities of high-resolution data; and computational resources. Our starting point was to develop robust flood-frequency models that are suitable for estimating peak flows for a range of design flood return periods. We used the index flood approach, based on a statistical analysis of historic river flow data pooled on the basis of catchment characteristics. Historical flow data were therefore sourced for each country and collated into a large pan-European database. After a lengthy validation these data were collated into 21 separate analysis zones or regions, grouping smaller river basins according to their physical and climatic characteristics. The very large continental-scale basins were each modelled separately on account of their size (e.g. Danube, Elbe, Drava and Rhine). Our methodology allows the design flood hydrograph to be predicted at any point on the river network for a range of return periods. Using JFlow+, JBA's proprietary 2D hydrodynamic model, the calculated out-of-bank flows for all watercourses with an upstream drainage area exceeding 50 km2 were routed across two different Digital Terrain Models in order to map the extent and depth of floodplain inundation. This generated modelling for…

  20. Incompatible multiple consistent sets of histories and measures of quantumness

    Science.gov (United States)

    Halliwell, J. J.

    2017-07-01

    In the consistent histories approach to quantum theory probabilities are assigned to histories subject to a consistency condition of negligible interference. The approach has the feature that a given physical situation admits multiple sets of consistent histories that cannot in general be united into a single consistent set, leading to a number of counterintuitive or contrary properties if propositions from different consistent sets are combined indiscriminately. An alternative viewpoint is proposed in which multiple consistent sets are classified according to whether or not there exists any unifying probability for combinations of incompatible sets which replicates the consistent histories result when restricted to a single consistent set. A number of examples are exhibited in which this classification can be made, in some cases with the assistance of the Bell, Clauser-Horne-Shimony-Holt, or Leggett-Garg inequalities together with Fine's theorem. When a unifying probability exists logical deductions in different consistent sets can in fact be combined, an extension of the "single framework rule." It is argued that this classification coincides with intuitive notions of the boundary between classical and quantum regimes and in particular, the absence of a unifying probability for certain combinations of consistent sets is regarded as a measure of the "quantumness" of the system. The proposed approach and results are closely related to recent work on the classification of quasiprobabilities and this connection is discussed.

  1. GPS Space Service Volume: Ensuring Consistent Utility Across GPS Design Builds for Space Users

    Science.gov (United States)

    Bauer, Frank H.; Parker, Joel Jefferson Konkl; Valdez, Jennifer Ellen

    2015-01-01

    GPS availability and signal strength were originally specified for users on or near the surface of the Earth, with transmitted power levels specified at the edge of the Earth, 14.3 degrees off nadir. Prior to the SSV specification, on-orbit performance of GPS varied from block build to block build (IIA, IIR-M, IIF) due to antenna gain and beam width variances. Unstable on-orbit performance results in significant risk to space users. Side-lobe signals, although not specified, were expected to significantly boost GPS signal availability for users above the constellation. During GPS III Phase A, NASA noted significant discrepancies between the power levels specified in GPS III specification documents and measured on-orbit performance. To stabilize the signal for high-altitude space users, a NASA/DoD team in 2003-2005 led the creation of the new Space Service Volume (SSV) definition and specifications.

  2. Integrable Heisenberg Ferromagnet Equations with self-consistent potentials

    CERN Document Server

    Zhunussova, Zh Kh; Tungushbaeva, D I; Mamyrbekova, G K; Nugmanova, G N; Myrzakulov, R

    2013-01-01

    In this paper, we consider some integrable Heisenberg Ferromagnet Equations with self-consistent potentials. We study their Lax representations. In particular we give their equivalent counterparts which are nonlinear Schrödinger type equations. We present the integrable reductions of the Heisenberg Ferromagnet Equations with self-consistent potentials. These integrable Heisenberg Ferromagnet Equations with self-consistent potentials describe nonlinear waves in ferromagnets with magnetic fields.

  3. Behavioural consistency and life history of Rana dalmatina tadpoles

    OpenAIRE

    Urszán, Tamás Janós; Török, János; Hettyey, Attila; Garamszegi, László Z; Herczeg, Gábor

    2015-01-01

    The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, w...

  4. Students’ conceptual understanding consistency of heat and temperature

    Science.gov (United States)

    Slamet Budiarti, Indah; Suparmi; Sarwanto; Harjana

    2017-01-01

    The aims of the research were to explore and to describe the consistency of students’ understanding of heat and temperature concept. The sample that was taken using purposive random sampling technique consisted of 99 high school students from 3 senior high schools in Jayapura city. The descriptive qualitative method was employed in this study. The data were collected using tests and interviews regarding the subject matters of Heat and Temperature. Based on the results of data analysis, it was concluded that 3.03% of the students was the consistency of right answer, 79.80% of the students was consistency but wrong answer and 17.17% of the students was inconsistency.

  5. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.

  6. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database is temporarily inconsistent. Traditionally, when a transaction has been committed and completed, the execution has the consistency property. This definition of the consistency property is not useful in distributed databases with relaxed ACID properties, because such a database is almost always inconsistent. In the following, we will therefore use the concept of consistency in a relaxed sense.

  7. The role of interactive control systems in obtaining internal consistency in the management control system package

    DEFF Research Database (Denmark)

    Toldbod, Thomas; Israelsen, Poul

    2014-01-01

    Companies rely on multiple Management Control Systems (MCSs) to obtain their short- and long-term objectives. When applying a multifaceted perspective on Management Control Systems, the concept of internal consistency has been found to be important in obtaining goal congruency in the company. However, to date we know little about how managers maintain internal consistency when individual MCSs change and do not fit with the other MCSs. This study focuses specifically on changes to administrative controls that are not internally consistent with the current cybernetic controls. Based on a case study in a global Danish manufacturing company, this study finds that it is necessary to distinguish between the design characteristics and the use of MCSs when analyzing internal consistency in the MCS package, and it examines how managers obtain internal consistency in the new MCS package when an MCS change occurs.

  8. DC Brushless Motor Control Design and Preliminary Testing for Independent 4-Wheel Drive Rev-11 Robotic Platform

    Directory of Open Access Journals (Sweden)

    Roni Permana Saputra

    2012-03-01

    Full Text Available This paper discusses the design of a control system for a brushless DC motor using the ATMega16 microcontroller, to be applied to the independent 4-wheel-drive Mobile Robot LIPI version 2 (REV-11). The control system consists of two parts: a brushless DC motor control module and a supervisory control module that coordinates the desired commands to the motor control modules. To control the REV-11 platform, the supervisory control transmits reference data for speed and direction to control the speed and direction of each actuator on the platform. From the test results it is concluded that the designed control system works properly, coordinating and controlling the speed and direction of motion of the actuator motors of the REV-11 platform.

  9. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787

  10. Video Design Games

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte; Christensen, Kasper Skov; Iversen, Ole Sejer

    2016-01-01

    We introduce Video Design Games to train educators in teaching design. The Video Design Game is a workshop format consisting of three rounds in which participants observe, reflect and generalize based on video snippets from their own practice. The paper reports on a Video Design Game workshop...

  12. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Directory of Open Access Journals (Sweden)

    Antonio Bernabe-Ortiz

    Full Text Available BACKGROUND: Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence of and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. METHODS AND FINDINGS: Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95% CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95% CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05, 95% CI 1.28-3.27, and jungle OR = 4.86, 95% CI 3.05-7.74, compared to the coastal region) and inversely associated with age at sexual debut (OR = 0.90, 95% CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34, 95% CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. CONCLUSION: Residence in highland or jungle cities is associated with higher anti-HBc prevalence, whereas increasing age at sexual debut is associated with lower prevalence. Consistent condom use was associated with a decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention.

  13. Affective-cognitive consistency and thought-induced attitude polarization.

    Science.gov (United States)

    Chaiken, S; Yates, S

    1985-12-01

    Subjects whose preexperimental attitudes toward either capital punishment or censorship were high or low in affective-cognitive consistency were identified. These four groups thought about their attitudes by writing two essays, one on the topic for which consistency had been assessed (relevant essay) and one on the unassessed topic (distractor essay). In accord with the hypothesis that thought-induced attitude polarization requires the presence of a well-developed knowledge structure, high-consistency subjects evidenced greater polarization than low-consistency subjects only on the relevant topic after writing the relevant essay. Content analyses of subjects' relevant essays yielded additional data confirming Tesser's ideas regarding mediation: High (vs. low) consistency subjects expressed a greater proportion of cognitions that were evaluatively consistent with their prior affect toward the attitude object and a smaller proportion of evaluatively inconsistent and neutral cognitions. Moreover, although high-and low-consistency subjects did not differ in the amount of attitudinally relevant information they possessed or their awareness of inconsistent cognitions, their method of dealing with discrepant information diverged: High-consistency subjects evidenced a greater tendency to assimilate discrepant information by generating refutational thoughts that discredited or minimized the importance of inconsistent information.

  14. Philosophical and Methodological Problem of Consistency of Mathematical Theories

    Directory of Open Access Journals (Sweden)

    Michailova N. V.

    2013-01-01

    Full Text Available The increased abstraction of modern mathematical theories has revived interest in the traditional philosophical and methodological problem of an internally consistent system of axioms, one from which mutually contradictory statements cannot be deduced. If we are talking about axioms describing a well-known area of mathematical objects, then from the standpoint of local consistency this problem does not appear especially relevant. These problems are, however, associated with the various attempts of formalists to explain mathematical existence through consistency. For example, the problem of establishing the consistency of mathematical analysis, whose solution would clarify the fate of Hilbert's proof theory, has not yet been solved, nor has the problem of the consistency of axiomatic set theory. It can therefore be assumed that the criterion of consistency, despite its essential role in axiomatic systems of both formal and substantive nature, is as much an auxiliary logical criterion as mathematical provability is. An adequate solution of the problem of the consistency of mathematics can be achieved in the area of methodological and substantive arguments revealing the mechanism by which contradictions appear in a mathematical theory. The paper shows that, from a systemic point of view, in the context of a philosophical and methodological synthesis of the various approaches to the justification of modern mathematics, one cannot insist on the rationale of consistency alone for mathematical theories.

  15. MANUFACTURE OF THE FERMENTED SAUSAGES WITH THE SMEARED CONSISTENCE

    Directory of Open Access Journals (Sweden)

    Nesterenko A. A.

    2014-10-01

    Full Text Available In foreign practice there is great demand for smoked sausage products with a smeared consistence. The article presents the basic aspects of manufacturing smoked sausages with a smeared consistence: the choice of spices, starter cultures and the method of preparing the forcemeat.

  16. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when the time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series.

  17. Delimiting Coefficient α from Internal Consistency and Unidimensionality

    Science.gov (United States)

    Sijtsma, Klaas

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015), in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…

  18. Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.

    Science.gov (United States)

    Edwards, H. P.; And Others

    1982-01-01

    Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)

  20. The Self-Consistency Model of Subjective Confidence

    Science.gov (United States)

    Koriat, Asher

    2012-01-01

    How do people monitor the correctness of their answers? A self-consistency model is proposed for the process underlying confidence judgments and their accuracy. In answering a 2-alternative question, participants are assumed to retrieve a sample of representations of the question and base their confidence on the consistency with which the chosen…

  1. Dynamic Consistency between Value and Coordination Models - Research Issues.

    NARCIS (Netherlands)

    Bodenstaff, L.; Wombacher, Andreas; Reichert, M.U.; Meersman, R.; Tari, Z.; Herrero, P.

    Inter-organizational business cooperations can be described from different viewpoints, each fulfilling a specific purpose. Since all viewpoints describe the same system, they must not contradict each other and thus must be consistent. Consistency can be checked based on common semantic concepts of the…

  2. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, A.

    2006-01-01

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which utilizes…

  3. Logical consistency and sum-constrained linear models

    NARCIS (Netherlands)

    van Perlo -ten Kleij, Frederieke; Steerneman, A.G.M.; Koning, Ruud H.

    2006-01-01

    A topic that has received quite some attention in the seventies and eighties is logical consistency of sum-constrained linear models. Loosely defined, a sum-constrained model is logically consistent if the restrictions on the parameters and explanatory variables are such that the sum constraint is a

  4. Consistent Sets and Contrary Inferences Reply to Griffiths and Hartle

    CERN Document Server

    Kent, A

    1998-01-01

    It was pointed out recently [A. Kent, Phys. Rev. Lett. 78 (1997) 2874] that the consistent histories approach allows contrary inferences to be made from the same data, corresponding to commuting orthogonal projections in different consistent sets. To many, this seems undesirable in a theory of physical inferences. It also raises a specific problem for the consistent histories formalism, since that formalism is set up so as to eliminate contradictory inferences, yet there seems to be no sensible physical distinction between contradictory and contrary inferences. It seems particularly hard to defend this asymmetry, since (i) there is a well-defined quantum histories formalism which admits both contradictory and contrary inferences, and (ii) there is also a well-defined formalism, based on ordered consistent sets of histories, which excludes both. In a recent comment, Griffiths and Hartle, while accepting the validity of the examples given in the above paper, restate their own preference for the consistent histories approach…

  5. The construction and combined operation for fuzzy consistent matrixes

    Institute of Scientific and Technical Information of China (English)

    YAO Min; SHEN Bin; LUO Jian-hua

    2005-01-01

    Fuzziness is one of the general characteristics of human thinking and of objective things. Introducing fuzzy techniques into decision-making yields very good results. The fuzzy consistent matrix has many excellent characteristics, especially center-division transitivity, which conforms to the reality of the human thinking process in decision-making. This paper presents a new approach for creating a fuzzy consistent matrix from a mutual supplementary matrix in fuzzy decision-making. At the same time, based on the distance between an individual fuzzy consistent matrix and the average fuzzy consistent matrix, a combined operation for several fuzzy consistent matrixes is presented which reflects the majority opinion of experienced experts. Finally, a practical example further shows its flexibility and practicability.
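
    A widely cited construction of this kind derives the fuzzy consistent matrix from the row sums of a fuzzy complementary (mutual supplementary) matrix F, in which f_ij + f_ji = 1, via e_ij = (r_i - r_j)/(2n) + 0.5. Whether this is exactly the paper's formula is an assumption here; the sketch below implements that construction and checks center-division transitivity.

        import numpy as np

        def to_fuzzy_consistent(F):
            """Transform a fuzzy complementary matrix (f_ij + f_ji = 1) into a
            fuzzy consistent matrix via row sums: e_ij = (r_i - r_j)/(2n) + 0.5."""
            n = F.shape[0]
            r = F.sum(axis=1)
            return (r[:, None] - r[None, :]) / (2 * n) + 0.5

        F = np.array([[0.5, 0.8, 0.6],
                      [0.2, 0.5, 0.7],
                      [0.4, 0.3, 0.5]])
        E = to_fuzzy_consistent(F)
        print(np.round(E, 3))
        # Center-division transitivity: e_ij - 0.5 == (e_ik - 0.5) + (e_kj - 0.5).
        i, j, k = 0, 2, 1
        print(np.isclose(E[i, j] - 0.5, (E[i, k] - 0.5) + (E[k, j] - 0.5)))  # True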

  7. Personality and Situation Predictors of Consistent Eating Patterns.

    Science.gov (United States)

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K

    2015-01-01

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  8. Variationally consistent discretization schemes and numerical algorithms for contact problems

    Science.gov (United States)

    Wohlmuth, Barbara

    We consider variationally consistent discretization schemes for mechanical contact problems. Most of the results can also be applied to other variational inequalities, such as those for phase transition problems in porous media, for plasticity or for option pricing applications from finance. The starting point is to weakly incorporate the constraint into the setting and to reformulate the inequality in the displacement in terms of a saddle-point problem. Here, the Lagrange multiplier represents the surface forces, and the constraints are restricted to the boundary of the simulation domain. Having a uniform inf-sup bound, one can then establish optimal low-order a priori convergence rates for the discretization error in the primal and dual variables. In addition to the abstract framework of linear saddle-point theory, complementarity terms have to be taken into account. The resulting inequality system is solved by rewriting it equivalently by means of the non-linear complementarity function as a system of equations. Although it is not differentiable in the classical sense, semi-smooth Newton methods, yielding super-linear convergence rates, can be applied and easily implemented in terms of a primal-dual active set strategy. Quite often the solution of contact problems has a low regularity, and the efficiency of the approach can be improved by using adaptive refinement techniques. Different standard types, such as residual- and equilibrated-based a posteriori error estimators, can be designed based on the interpretation of the dual variable as Neumann boundary condition. For the fully dynamic setting it is of interest to apply energy-preserving time-integration schemes. However, the differential algebraic character of the system can result in high oscillations if standard methods are applied. A possible remedy is to modify the fully discretized system by a local redistribution of the mass. Numerical results in two and three dimensions illustrate the wide range of
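
    As a concrete, much-simplified instance of the primal-dual active set idea described above, the sketch below solves a one-dimensional discrete obstacle problem A u = f + lam, u >= psi, lam >= 0, lam*(u - psi) = 0, updating the active set from the complementarity function lam + c*(psi - u). The load, obstacle and mesh data are invented.

        import numpy as np

        def pdas_obstacle(n=100, c=1.0, max_iter=50):
            """Primal-dual active set iteration for a 1D obstacle problem with the
            finite-difference Laplacian and homogeneous Dirichlet conditions."""
            h = 1.0 / (n + 1)
            x = np.linspace(h, 1.0 - h, n)
            A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
            f = -10.0 * np.ones(n)                    # downward load (invented)
            psi = -0.05 - 0.3 * (x - 0.5) ** 2        # hypothetical obstacle
            u, lam = np.zeros(n), np.zeros(n)
            active = np.zeros(n, dtype=bool)
            for it in range(max_iter):
                new_active = lam + c * (psi - u) > 0  # NCP-based active-set update
                if it > 0 and np.array_equal(new_active, active):
                    break                             # active set settled: solved
                active = new_active
                M, rhs = A.copy(), f.copy()
                for i in np.where(active)[0]:         # enforce u_i = psi_i on the set
                    M[i, :] = 0.0
                    M[i, i] = 1.0
                    rhs[i] = psi[i]
                u = np.linalg.solve(M, rhs)
                lam = A @ u - f                       # multiplier from the residual
                lam[~active] = 0.0
            return x, u, psi

        x, u, psi = pdas_obstacle()
        print("contact nodes:", int((np.abs(u - psi) < 1e-10).sum()))

    Each active-set update corresponds to one semi-smooth Newton step, which is why the iteration typically terminates after a handful of sweeps.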

  9. Concurrent materials and process selection in conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Kleban, Stephen D.; Knorovsky, Gerald A.

    2000-08-16

    A method for concurrent selection of materials and a joining process based on product requirements using a knowledge-based, constraint satisfaction approach facilitates the product design and manufacturing process. Using a Windows-based computer video display and a data base of materials and their properties, the designer can ascertain the preferred composition of two parts based on various operating/environmental constraints such as load, temperature, lifetime, etc. Optimum joinder of the two parts may simultaneously be determined using a joining process data base based upon the selected composition of the components as well as the operating/environmental constraints.
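
    In the same spirit, a constraint-satisfaction selection over a small database can be sketched in a few lines of Python; the materials, property values and joining rules below are invented placeholders, not the source's knowledge base.

        # Toy constraint-satisfaction sketch of concurrent material/process selection.
        MATERIALS = [
            {"name": "Al 6061",   "max_temp_C": 200, "yield_MPa": 276},
            {"name": "304 SS",    "max_temp_C": 870, "yield_MPa": 215},
            {"name": "Ti-6Al-4V", "max_temp_C": 400, "yield_MPa": 880},
        ]
        JOINING = [  # (process, set of materials it is assumed to join well)
            ("TIG welding",   {"Al 6061", "304 SS", "Ti-6Al-4V"}),
            ("brazing",       {"Al 6061", "304 SS"}),
            ("laser welding", {"304 SS", "Ti-6Al-4V"}),
        ]

        def select(load_MPa, temp_C):
            """Return (material A, material B, process) triples that satisfy the
            operating constraints for both parts and the joining rules."""
            ok = [m["name"] for m in MATERIALS
                  if m["yield_MPa"] >= load_MPa and m["max_temp_C"] >= temp_C]
            return [(a, b, proc) for proc, joinable in JOINING
                    for a in ok for b in ok if {a, b} <= joinable]

        for combo in select(load_MPa=250, temp_C=350):
            print(combo)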

  10. The internal consistency and validity of the Self-assessment Parkinson's Disease Disability Scale.

    NARCIS (Netherlands)

    Biemans, M.A.J.E.; Dekker, J.; Woude, L.H.V. van der

    2001-01-01

    OBJECTIVE: To test the consistency and validity of the Self-assessment Parkinson's Disease Disability Scale in patients with Parkinson's disease living at home. DESIGN: Patients with Parkinson's disease responded to a set of questionnaires. In addition, an observation of the performance of daily activities was carried out.

  13. Consistent assignment of nurse aides: association with turnover and absenteeism.

    Science.gov (United States)

    Castle, Nicholas G

    2013-01-01

    Consistent assignment refers to the same caregivers consistently caring for the same residents almost every time the caregivers are on duty. This article examines the association of consistent assignment of nurse aides with turnover and absenteeism. Data came from a survey of nursing home administrators, the Online Survey Certification and Reporting data, and the Area Resource File. The measures were from 2007 and came from 3,941 nursing homes. Multivariate logistic regression models were used to examine turnover and absenteeism. An average of 68% of nursing homes reported using consistent assignment, with 28% of nursing homes using nurse aide consistent assignment at the often recommended level of 85% (or more). Nursing homes using recommended levels of consistent assignment had significantly lower rates of turnover and of absenteeism. In the multivariate analyses, consistent assignment was significantly associated with both lower turnover and lower absenteeism. Consistent assignment is a practice recommended by many policy makers, government agencies, and industry advocates. The findings presented here provide some evidence that the use of this staffing practice can be beneficial.

  14. THE CONSISTENCY OF STATISTICAL ESTIMATES OF THURSTONE-MOSTELLER

    Directory of Open Access Journals (Sweden)

    Y. V. Bugaev

    2015-01-01

    Full Text Available The traditional analysis of collective choice procedures involves three different approaches: investigating the voting operator against characteristic conditions, investigating the properties of the choice function, and analyzing the possibility of manipulation (verifying the stability of the voting process under negative influence from voters or the organizer). The research team of the ITMU VSUET department proposed and implemented a fourth approach, which is to study the probabilistic characteristics of the results of such procedures (the bias of the estimate of the usefulness of a specific alternative from its true value, the standard deviation of the estimate of the usefulness of an alternative from its true value, the probability of correctly ranking the alternatives at the output of the choice procedure, etc.). This article is dedicated to analyzing the consistency of the usefulness estimates of the compared alternatives obtained at the output of the traditional Thurstone-Mosteller procedure and its generalizations created by the authors. In general, consistency of a statistical estimator means that the estimation error tends to zero as the sample size increases. Depending on the interpretation of "estimation error", the following main types of consistency are distinguished: weak consistency, based on convergence in probability of the random quantity; strong consistency, based on convergence with probability one; and consistency in mean square, in which the variance of the estimate tends to zero. This article provides a proof of a theorem according to which, under assumptions of a rather general nature, the usefulness estimates of the ranked alternatives obtained using the Thurstone-Mosteller procedure are consistent in mean square.

  15. The Bilevel Design Problem for Communication Networks on Trains: Model, Algorithm, and Verification

    Directory of Open Access Journals (Sweden)

    Yin Tian

    2014-01-01

    Full Text Available This paper proposes a novel method to solve the problem of train communication network design. Firstly, we put forward a general description of the problem. Then, taking advantage of bilevel programming theory, we created the cost-reliability-delay (CRD) model, which consists of two parts: the physical topology part, aimed at obtaining the network with the maximum reliability under a cost constraint, and the logical topology part, focused on the communication paths yielding minimum delay based on the physical topology delivered from the upper level. We also suggest a method to solve the CRD model, which combines a genetic algorithm with the Floyd-Warshall algorithm. Finally, we use a practical example to verify the accuracy and effectiveness of the CRD model, and further apply the novel method to a train with six carriages.
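
    The lower (logical topology) level can be illustrated in isolation: given a fixed physical topology with per-link delays, Floyd-Warshall yields the minimum end-to-end delay between every pair of nodes, and the worst pair is a natural objective value to return to the upper level. The link delays below are invented, not the paper's data.

        INF = float("inf")

        def floyd_warshall(delay):
            """All-pairs minimum end-to-end delay; delay[i][j] is the direct link
            delay (INF if no link). Returns the shortest-path delay matrix."""
            n = len(delay)
            d = [row[:] for row in delay]
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        if d[i][k] + d[k][j] < d[i][j]:
                            d[i][j] = d[i][k] + d[k][j]
            return d

        # Hypothetical 4-node train backbone (delays in ms).
        links = [[0, 2, INF, INF],
                 [2, 0, 3, 8],
                 [INF, 3, 0, 1],
                 [INF, 8, 1, 0]]
        d = floyd_warshall(links)
        worst = max(d[i][j] for i in range(4) for j in range(4))
        print("worst-case end-to-end delay:", worst, "ms")  # lower-level objective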

  16. The Consistent Preferences Approach to Deductive Reasoning in Games

    CERN Document Server

    Asheim, Geir B

    2006-01-01

    "The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif

  17. On the Consistency of ZFn in ZFn+3

    Institute of Scientific and Technical Information of China (English)

    李旭华

    1993-01-01

    By restricting the common replacement axiom schema of ZF to Σn-formulae, Professor Zhang Jinwen constructed a series of subsystems of Zermelo-Fraenkel set theory ZF, which he called ZFn. Zhao Xishun showed that the consistency of ZFn can be deduced from ZF. Professor Zhang Jinwen raised the question whether the consistency of ZFn can be deduced from ZFn+m(n) for some m(n) ≥ 1. In this paper, we give a positive solution to Professor Zhang's problem. Moreover, we show that the consistency of ZFn can be deduced from ZFn+3.

  18. Model Checking Data Consistency for Cache Coherence Protocols

    Institute of Scientific and Technical Information of China (English)

    Hong Pan; Hui-Min Lin; Yi Lv

    2006-01-01

    A method for automatic verification of cache coherence protocols is presented, in which cache coherence protocols are modeled as concurrent value-passing processes, and control and data consistency requirements are described as formulas in first-order μ-calculus. A model checker is employed to check whether the protocol under investigation satisfies the required properties. Using this method, a data consistency error has been revealed in a well-known cache coherence protocol. The error has been corrected, and the revised protocol has been shown to be free from data consistency errors for any data domain size, by appealing to the data independence technique.

  19. Consistency of assertive, aggressive, and submissive behavior for children.

    Science.gov (United States)

    Deluty, R H

    1985-10-01

    The interpersonal behavior of 50 third- through fifth-grade children was assessed over an 8-month period in a wide variety of naturally occurring school activities. The consistency of the children's behavior was found to vary as a function of the child's sex, the class of behavior examined, and the similarity/dissimilarity of the contexts in which the behaviors occurred. Boys demonstrated remarkable consistency in their aggressive expression; 46 of 105 intercorrelations for the aggressiveness dimensions were statistically significant. In general, the consistency of assertive behavior for both boys and girls was unexpectedly high.

  20. On exact triangles consisting of stable vector bundles on tori

    CERN Document Server

    Kobayashi, Kazushi

    2016-01-01

    In this paper, we consider the exact triangles consisting of stable holomorphic vector bundles on one-dimensional complex tori, and discuss their relations with the corresponding Fukaya category via the homological mirror symmetry.

  1. A new insight into the consistency of smoothed particle hydrodynamics

    CERN Document Server

    Sigalotti, Leonardo Di G; Klapp, Jaime; Vargas, Carlos A; Campos, Kilver

    2016-01-01

    In this paper the problem of consistency of smoothed particle hydrodynamics (SPH) is solved. A novel error analysis is developed in $n$-dimensional space using the Poisson summation formula, which enables the treatment of the kernel and particle approximation errors in combined fashion. New consistency integral relations are derived for the particle approximation which correspond to the cosine Fourier transform of the classically known consistency conditions for the kernel approximation. The functional dependence of the error bounds on the SPH interpolation parameters, namely the smoothing length $h$ and the number of particles within the kernel support ${\cal N}$, is demonstrated explicitly, from which consistency conditions are seen to follow naturally. As ${\cal N}\to\infty$, the particle approximation converges to the kernel approximation independently of $h$ provided that the particle mass scales with $h$ as $m\propto h^{\beta}$, with $\beta > n$. This implies that as $h\to 0$, the joint limit $m\to 0$, …
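
    The lowest-order of these consistency conditions is easy to check numerically: for the particle approximation, kernel-weighted particle volumes should sum to one, $\sum_j (m_j/\rho_j)\, W(x - x_j, h) \approx 1$. The sketch below verifies this in 1D with the standard cubic spline kernel on a uniform particle distribution; all parameters are invented.

        import numpy as np

        def cubic_spline_1d(q):
            """Standard 1D cubic spline kernel shape; the caller applies the
            1D normalization constant 2/(3h)."""
            return np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))

        n, h_over_dx = 400, 1.2        # particle count and smoothing ratio (invented)
        dx = 1.0 / n
        x = (np.arange(n) + 0.5) * dx  # uniform particles on (0, 1)
        h = h_over_dx * dx
        m, rho = dx, 1.0               # unit density, so m_j / rho_j = dx

        # Zeroth-order particle consistency at an interior point:
        q = np.abs(0.5 - x) / h
        total = np.sum((m / rho) * (2.0 / (3.0 * h)) * cubic_spline_1d(q))
        print(f"partition of unity: {total:.4f}")   # ~0.999 for these parameters

    Increasing the number of neighbours within the support (larger h/dx) drives the sum closer to one, in line with the paper's observation that the particle approximation converges only as ${\cal N}\to\infty$.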

  2. Island of Stability for Consistent Deformations of Einstein's Gravity

    DEFF Research Database (Denmark)

    Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan;

    2012-01-01

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...

  3. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
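
    The underspecification problem is easy to reproduce in simulation. The toy sketch below (ours, not the paper's code or data-generating process) fits a one-break trend model by least-squares grid search to a series that truly contains two similar, same-signed trend breaks:

    ```python
    # Toy illustration: single-break least-squares estimation on a series
    # that actually contains two positive trend breaks.
    import numpy as np

    rng = np.random.default_rng(0)
    T, b1, b2 = 300, 100, 200                 # sample size and true break dates
    t = np.arange(T)
    y = (0.02 * t + 0.04 * np.maximum(t - b1, 0)
         + 0.04 * np.maximum(t - b2, 0) + rng.normal(scale=1.0, size=T))

    def ssr_single_break(k):
        """SSR of a fit with intercept, linear trend, and one slope shift at k."""
        X = np.column_stack([np.ones(T), t, np.maximum(t - k, 0)])
        resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        return resid @ resid

    k_hat = min(range(15, T - 15), key=ssr_single_break)   # trimmed grid search
    print(f"true breaks: {b1}, {b2}; single-break estimate: {k_hat}")
    # With similar, same-signed breaks the estimate tends to land between the
    # two true dates rather than converging to either one.
    ```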

  4. Consistency in experiments on multistable driven delay systems

    Science.gov (United States)

    Oliver, Neus; Larger, Laurent; Fischer, Ingo

    2016-10-01

    We investigate the consistency properties in the responses of a nonlinear delay optoelectronic intensity oscillator subject to different drives, in particular, harmonic and self-generated waveforms. This system, an implementation of the Ikeda oscillator, is operating in a closed-loop configuration, exhibiting its autonomous dynamics while the drive signals are additionally introduced. Applying the same drive multiple times, we compare the dynamical responses of the optoelectronic oscillator and quantify the degree of consistency among them via their correlation. Our results show that consistency is not restricted to conditions close to the first Hopf bifurcation but can be found in a broad range of dynamical regimes, even in the presence of multistability. Finally, we discuss the dependence of consistency on the nature of the drive signal.
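
    The consistency measure used here, correlation between responses to repeated presentations of the same drive, can be illustrated in a few lines (our own sketch with synthetic response data, not the optoelectronic experiment):

    ```python
    # Consistency as the mean pairwise correlation between responses of a
    # (simulated) driven system across repeated presentations of one drive.
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)
    n_trials, n_samples = 5, 2000
    drive_induced = rng.normal(size=n_samples)          # shared, drive-locked part
    responses = [0.9 * drive_induced + 0.45 * rng.normal(size=n_samples)
                 for _ in range(n_trials)]              # trial-to-trial variability

    corrs = [np.corrcoef(a, b)[0, 1] for a, b in combinations(responses, 2)]
    print(f"consistency (mean pairwise correlation): {np.mean(corrs):.3f}")
    ```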

  5. Energy-Consistent Multiscale Algorithms for Granular Flows

    Science.gov (United States)

    2014-08-07

    Final report, 01-MAY-2011 to 30-APR-2014, AFOSR YIP "Energy-Consistent Multiscale Algorithms for Granular Flows". This report documents the achievements made as a result of this Young Investigator Program (YIP) project. We worked on the development of multiscale energy-consistent algorithms to simulate and capture flow phenomena in granular

  6. Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images

    OpenAIRE

    Li, Gang; Nie, Jingxin; Shen, Dinggang

    2011-01-01

    Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying subtle morphological changes of the cerebral cortex. This paper presents a new deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstructed ...

  7. Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images

    OpenAIRE

    Li, Gang; Nie, Jingxin; Wu, Guorong; Wang, Yaping; Shen, Dinggang

    2011-01-01

    Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying longitudinal subtle change of the cerebral cortex. This paper presents a novel deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal brain MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstr...

  8. On the consistency of coset space dimensional reduction

    Energy Technology Data Exchange (ETDEWEB)

    Chatzistavrakidis, A. [Institute of Nuclear Physics, NCSR DEMOKRITOS, GR-15310 Athens (Greece); Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: cthan@mail.ntua.gr; Manousselis, P. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece); Department of Engineering Sciences, University of Patras, GR-26110 Patras (Greece)], E-mail: pman@central.ntua.gr; Prezas, N. [CERN PH-TH, 1211 Geneva (Switzerland)], E-mail: nikolaos.prezas@cern.ch; Zoupanos, G. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: george.zoupanos@cern.ch

    2007-11-15

    In this Letter we consider higher-dimensional Yang-Mills theories and examine their consistent coset space dimensional reduction. Utilizing a suitable ansatz and imposing a simple set of constraints we determine the four-dimensional gauge theory obtained from the reduction of both the higher-dimensional Lagrangian and the corresponding equations of motion. The two reductions yield equivalent results and hence they constitute an example of a consistent truncation.

  9. Truncations driven by constraints: consistency and conditions for correct upliftings

    CERN Document Server

    Pons, Josep M.; Talavera, Pere

    2004-01-01

    We discuss the mechanism of truncations driven by the imposition of constraints. We show how the consistency of such truncations is controlled, and give general theorems that establish conditions for the correct uplifting of solutions. We show in some particular examples how one can get correct upliftings from 7d supergravities to 10d type IIB supergravity, even in cases when the truncation is not initially consistent on its own.

  10. S Matrix Proof of Consistency Condition Derived from Mixed Anomaly

    Science.gov (United States)

    Bhansali, Vineer

    For a confining quantum field theory with conserved current J and stress tensor T, the ⟨JJJ⟩ and ⟨JTT⟩ anomalies computed in terms of elementary quanta must be precisely equal to the same anomalies computed in terms of the exact physical spectrum if the conservation law corresponding to J is unbroken. These conditions strongly constrain the allowed representations of the low energy spectrum. We present a proof of the latter consistency condition based on the proof by Coleman and Grossman of the former consistency condition.

  11. Consistent histories, quantum truth functionals, and hidden variables

    Science.gov (United States)

    Griffiths, Robert B.

    2000-01-01

    A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of 'truth functionals' defined on a Boolean algebra of classical or quantum properties.

  12. Consistent histories, quantum truth functionals, and hidden variables

    CERN Document Server

    Griffiths, R B

    1999-01-01

    A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of "truth functionals" defined on a Boolean algebra of classical or quantum properties.

  13. Behavioural consistency and life history of Rana dalmatina tadpoles.

    Science.gov (United States)

    Urszán, Tamás János; Török, János; Hettyey, Attila; Garamszegi, László Zsolt; Herczeg, Gábor

    2015-05-01

    The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, we tested if behavioural consistency and POLS could be detected during the early ontogenesis of this amphibian. We targeted two ontogenetic stages and measured activity, exploration and risk-taking in a common garden experiment, assessing both individual behavioural type and intra-individual behavioural variation. We observed that activity was consistent in all tadpoles, exploration only became consistent with advancing age and risk-taking only became consistent in tadpoles that had been tested, and thus disturbed, earlier. Only previously tested tadpoles showed trends indicative of behavioural syndromes. We found an activity-age at metamorphosis POLS in the previously untested tadpoles irrespective of age. Relative growth rate correlated positively with the intra-individual variation of activity of the previously untested older tadpoles. In previously tested older tadpoles, intra-individual variation of exploration correlated negatively and intra-individual variation of risk-taking correlated positively with relative growth rate. We provide evidence for behavioural consistency and POLS in predator- and conspecific-naive tadpoles. Intra-individual behavioural variation was also correlated to life history, suggesting its relevance for the POLS theory. The strong effect of moderate disturbance related to standard behavioural testing on later behaviour draws attention to the pitfalls embedded in repeated testing.

  14. TWO APPROACHES TO IMPROVING THE CONSISTENCY OF COMPLEMENTARY JUDGEMENT MATRIX

    Institute of Scientific and Technical Information of China (English)

    Xu Zeshui

    2002-01-01

    Using the transformation relations between complementary judgement matrices and reciprocal judgement matrices, this paper proposes two methods for improving the consistency of a complementary judgement matrix and gives two simple practical iterative algorithms. These two algorithms are easy to implement on a computer, and the modified complementary judgement matrices retain most of the information that the original matrix contains. Thus the methods supplement and develop the theory and methodology for improving the consistency of complementary judgement matrices.
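
    For illustration, one transformation commonly used in this literature maps a complementary judgement matrix B (with b_ij + b_ji = 1) to a reciprocal matrix A via a_ij = b_ij/b_ji; whether this is the exact transformation used in the paper is an assumption, and the paper's iterative improvement algorithms are not reproduced here. A minimal sketch, with Saaty's consistency index computed on the result:

    ```python
    # Hedged sketch: complementary -> reciprocal transformation and a
    # consistency index for the resulting reciprocal judgement matrix.
    import numpy as np

    B = np.array([[0.5, 0.6, 0.8],
                  [0.4, 0.5, 0.7],
                  [0.2, 0.3, 0.5]])          # complementary: B + B.T == 1

    A = B / B.T                               # one common choice: a_ij = b_ij / b_ji
    assert np.allclose(A * A.T, 1.0)          # reciprocity holds by construction

    n = A.shape[0]
    lam_max = np.linalg.eigvals(A).real.max()
    CI = (lam_max - n) / (n - 1)              # Saaty's consistency index
    print(f"lambda_max = {lam_max:.4f}, CI = {CI:.4f}")
    ```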

  15. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 1960s) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...... Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishing and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching...... the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers....

  16. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.

  17. Consistency analysis of accelerated degradation mechanism based on gray theory

    Institute of Scientific and Technical Information of China (English)

    Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang

    2014-01-01

    A fundamental premise of accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during an accelerated test. A new consistency analysis method based on gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed, with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process of determining the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure for applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement test is conducted to demonstrate and validate the proposed method.
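
    The gray prediction models mentioned build on the basic GM(1,1) scheme. A minimal textbook-style GM(1,1) sketch follows (our own, with made-up degradation readings; the paper's gray dynamic and NIED variants are not reproduced):

    ```python
    # Minimal GM(1,1) gray prediction model: accumulate, fit the gray
    # parameters by least squares, evaluate the time response, de-accumulate.
    import numpy as np

    def gm11_forecast(x0, horizon):
        x1 = np.cumsum(x0)                                # accumulated series (AGO)
        z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
        B = np.column_stack([-z1, np.ones(len(z1))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # gray parameters
        k = np.arange(len(x0) + horizon)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a # time-response function
        return np.diff(x1_hat, prepend=0.0)               # inverse accumulation

    x0 = np.array([2.87, 3.28, 3.34, 3.73, 3.87, 4.15])  # e.g. degradation data
    print(gm11_forecast(x0, horizon=3).round(3))
    ```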

  18. Self-consistency in Bicultural Persons: Dialectical Self-beliefs Mediate the Relation between Identity Integration and Self-consistency

    Science.gov (United States)

    Zhang, Rui; Noels, Kimberly A.; Lalonde, Richard N.; Salas, S. J.

    2017-01-01

    Prior research differentiates dialectical (e.g., East Asian) from non-dialectical cultures (e.g., North American and Latino) and attributes cultural differences in self-concept consistency to naïve dialecticism. In this research, we explored the effects of managing two cultural identities on consistency within the bicultural self-concept via the role of dialectical beliefs. Because the challenge of integrating more than one culture within the self is common to biculturals of various heritage backgrounds, the effects of bicultural identity integration should not depend on whether the heritage culture is dialectical or not. In four studies across diverse groups of bicultural Canadians, we showed that having an integrated bicultural identity was associated with being more consistent across roles (Studies 1–3) and making less ambiguous self-evaluations (Study 4). Furthermore, dialectical self-beliefs mediated the effect of bicultural identity integration on self-consistency (Studies 2–4). Finally, Latino biculturals reported being more consistent across roles than did East Asian biculturals (Study 2), revealing the ethnic heritage difference between the two groups. We conclude that both the content of heritage culture and the process of integrating cultural identities influence the extent of self-consistency among biculturals. Thus, consistency within the bicultural self-concept can be understood, in part, to be a unique psychological product of bicultural experience. PMID:28326052

  19. Asynchronous sequential machine design and analysis

    CERN Document Server

    Tinder, Richard

    2009-01-01

    Asynchronous Sequential Machine Design and Analysis provides a lucid, in-depth treatment of asynchronous state machine design and analysis presented in two parts: Part I on the background fundamentals related to asynchronous sequential logic circuits generally, and Part II on self-timed systems, high-performance asynchronous programmable sequencers, and arbiters.Part I provides a detailed review of the background fundamentals for the design and analysis of asynchronous finite state machines (FSMs). Included are the basic models, use of fully documented state diagrams, and the design and charac

  20. Does object view influence the scene consistency effect?

    Science.gov (United States)

    Sastyin, Gergo; Niimi, Ryosuke; Yokosawa, Kazuhiko

    2015-04-01

    Traditional research on the scene consistency effect only used clearly recognizable object stimuli to show mutually interactive context effects for both the object and background components on scene perception (Davenport & Potter in Psychological Science, 15, 559-564, 2004). However, in real environments, objects are viewed from multiple viewpoints, including an accidental, hard-to-recognize one. When the observers named target objects in scenes (Experiments 1a and 1b, object recognition task), we replicated the scene consistency effect (i.e., there was higher accuracy for the objects with consistent backgrounds). However, there was a significant interaction effect between consistency and object viewpoint, which indicated that the scene consistency effect was more important for identifying objects in the accidental view condition than in the canonical view condition. Therefore, the object recognition system may rely more on the scene context when the object is difficult to recognize. In Experiment 2, the observers identified the background (background recognition task) while the scene consistency and object views were manipulated. The results showed that object viewpoint had no effect, while the scene consistency effect was observed. More specifically, the canonical and accidental views both equally provided contextual information for scene perception. These findings suggested that the mechanism for conscious recognition of objects could be dissociated from the mechanism for visual analysis of object images that were part of a scene. The "context" that the object images provided may have been derived from its view-invariant, relatively low-level visual features (e.g., color), rather than its semantic information.

  1. Inter-laboratory consistency of gait analysis measurements.

    Science.gov (United States)

    Benedetti, M G; Merlo, A; Leardini, A

    2013-09-01

    The dissemination of gait analysis as a clinical assessment tool requires the results to be consistent, irrespective of the laboratory. In this work a baseline assessment of between-site consistency of one healthy subject examined at 7 different laboratories is presented. Anthropometric and spatio-temporal parameters, pelvis and lower limb joint rotations, joint sagittal moments and powers, and ground reaction forces were compared. The consistency between laboratories for single parameters was assessed by the median absolute deviation and maximum difference, for curves by linear regression. Twenty-one lab-to-lab comparisons were performed and averaged. Large differences were found between the characteristics of the laboratories (i.e. motion capture systems and protocols). Different values for the anthropometric parameters were found, with the largest variability for a pelvis measurement. The spatio-temporal parameters were in general consistent. Segment and joint kinematics consistency was in general high (R2>0.90), except for hip and knee joint rotations. The main difference among curves was a vertical shift associated with the corresponding value in the static position. The consistency between joint sagittal moments ranged from R2=0.90 at the ankle to R2=0.66 at the hip; the latter increased when comparing separately laboratories using the same protocol. Pattern similarity was good for ankle power but not satisfactory for knee and hip power. The force was found to be the most consistent, as expected. The differences found were in general lower than the established minimum detectable changes for gait kinematics and kinetics for healthy adults.
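
    The consistency measures named above (median absolute deviation and maximum difference for single parameters, linear-regression R2 for curves) are straightforward to compute; a minimal sketch with invented numbers:

    ```python
    # Between-laboratory consistency measures: MAD and max difference for a
    # scalar gait parameter, and R^2 for a lab-to-lab curve comparison.
    import numpy as np

    # a scalar parameter (e.g. step length, m) measured at 7 laboratories
    param = np.array([0.71, 0.73, 0.70, 0.74, 0.72, 0.69, 0.73])
    mad = np.median(np.abs(param - np.median(param)))
    max_diff = param.max() - param.min()

    # one joint-angle curve per lab; labs differ mostly by a vertical shift
    t = np.linspace(0, 1, 101)                 # fraction of the gait cycle
    lab_a = 20 * np.sin(2 * np.pi * t) + 5
    lab_b = 0.95 * lab_a + 1.5
    r2 = np.corrcoef(lab_a, lab_b)[0, 1] ** 2

    print(f"MAD = {mad:.3f} m, max diff = {max_diff:.3f} m, curve R2 = {r2:.3f}")
    ```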

  2. Designing Material Materialising Design

    DEFF Research Database (Denmark)

    Nicholas, Paul

    2013-01-01

    Designing Material Materialising Design documents five projects developed at the Centre for Information Technology and Architecture (CITA) at the Royal Danish Academy of Fine Arts, School of Architecture. These projects explore the idea that new designed materials might require new design methods....... Focusing on fibre reinforced composites, this book sustains an exploration into the design and making of elastically tailored architectural structures that rely on the use of computational design to predict sensitive interdependencies between geometry and behaviour. Developing novel concepts...

  3. Near-resonant absorption in the time-dependent self-consistent field and multiconfigurational self-consistent field approximations

    DEFF Research Database (Denmark)

    Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa;

    2001-01-01

    Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown...

  4. Martial arts striking hand peak acceleration, accuracy and consistency.

    Science.gov (United States)

    Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A

    2013-01-01

    The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
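
    The accuracy and consistency definitions in this abstract translate directly into code; the sketch below (ours, with fabricated strike coordinates in arbitrary units) computes both:

    ```python
    # Accuracy: radial distance from the strike centroid to the target.
    # Consistency: root mean squared distance of strikes from their centroid.
    import numpy as np

    target = np.array([0.0, 0.0])
    strikes = np.array([[1.2, -0.8], [0.9,  0.1], [1.5, -0.3], [0.7, -0.9],
                        [1.1,  0.2], [1.3, -0.5], [0.8, -0.2], [1.0, -0.6],
                        [1.4,  0.0], [0.6, -0.4], [1.2, -0.1], [0.9, -0.7]])

    centroid = strikes.mean(axis=0)
    accuracy = np.linalg.norm(centroid - target)
    consistency = np.sqrt(np.mean(np.sum((strikes - centroid) ** 2, axis=1)))
    print(f"accuracy = {accuracy:.2f}, consistency = {consistency:.2f} (a.u.)")
    ```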

  5. Cognitive consistency and math-gender stereotypes in Singaporean children.

    Science.gov (United States)

    Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu

    2014-01-01

    In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in the thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  7. GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi [Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics, Stanford University, Stanford, CA 94305 (United States); Busha, Michael T. [Institute for Theoretical Physics, University of Zurich, CH-8006 Zurich (Switzerland); Klypin, Anatoly A. [Astronomy Department, New Mexico State University, Las Cruces, NM 88003 (United States); Primack, Joel R., E-mail: behroozi@stanford.edu, E-mail: rwechsler@stanford.edu [Department of Physics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States)

    2013-01-20

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  8. Self-consistent generalized Langevin equation for colloidal mixtures.

    Science.gov (United States)

    Chávez-Rojo, Marco Antonio; Medina-Noyola, Magdaleno

    2005-09-01

    A self-consistent theory of collective and tracer diffusion in colloidal mixtures is presented. This theory is based on exact results for the partial intermediate scattering functions derived within the framework of the generalized Langevin equation formalism, plus a number of conceptually simple and sensible approximations. The first of these consists of a Vineyard-like approximation between collective and tracer diffusion, which writes the collective dynamics in terms of the memory function related to tracer diffusion. The second consists of interpolating this only unknown memory function between its two exact limits at small and large wave vectors; for this, a phenomenologically determined, but not arbitrary, interpolating function is introduced: a Lorentzian with its inflection point located at the first minimum of the partial static structure factor. The small wave-vector exact limit involves a time-dependent friction function, for which we take a general approximate result, previously derived within the generalized Langevin equation formalism. This general result expresses the time-dependent friction function in terms of the partial intermediate scattering functions, thus closing the system of equations into a fully self-consistent scheme. This extends to mixtures a recently proposed self-consistent theory developed for monodisperse suspensions [Yeomans-Reyna and Medina-Noyola, Phys. Rev. E 64, 066114 (2001)]. As an illustration of its quantitative accuracy, its application to a simple model of a binary dispersion in the absence of hydrodynamic interactions is reported.

  9. Pulsed laser photoacoustic monitoring of paper pulp consistency

    Science.gov (United States)

    Zhao, Zuomin; Törmänen, Matti; Myllylä, Risto

    2008-06-01

    This study involves measurements of pulp consistency in a cuvette and with an online apparatus, using an innovative scattering photoacoustic (SPA) method. The theoretical aspects are described first. Then, several kinds of wood fiber suspensions with consistencies from 0.5% to 5% were studied in a cuvette. After that, a pilot online apparatus was built to measure suspensions with fiber consistency lower than 1% and filler content up to 3%. The results showed that, although many fiber flocs in the cuvette strongly affected the accuracy of the consistency measurements, the apparatus can distinguish fiber types with different optical and acoustic properties. The measurement accuracy is greatly improved in the online apparatus by pumping the suspension through a circulating system to improve its homogeneity. The results demonstrated that wood fibers cause larger attenuation of acoustic waves whereas fillers do not. On the other hand, fillers scatter the incident light more strongly. Therefore, our SPA apparatus can potentially determine fiber and filler fractions simultaneously in pulp suspensions with consistencies up to 5%.

  10. Consistency of Scalar Potentials from Quantum de Sitter Space

    CERN Document Server

    Espinosa, José R; Trépanier, Maxime

    2015-01-01

    We derive constraints on the scalar potential of a quantum field theory in de Sitter space. The constraints, which we argue should be understood as consistency conditions for quantum field theories in dS space, originate from a consistent interpretation of quantum de Sitter space through its Coleman-De Luccia tunneling rate. Indeed, consistency of de Sitter space as a quantum theory of gravity with a finite number of degrees of freedom suggests the tunneling rates to vacua with negative cosmological constants be interpreted as Poincaré recurrences. Demanding the tunneling rate to be a Poincaré recurrence imposes two constraints, or consistency conditions, on the scalar potential. Although the exact consistency conditions depend on the shape of the scalar potential, generically they correspond to: the distance in field space between the de Sitter vacuum and any other vacuum with negative cosmological constant must be of the order of the reduced Planck mass or larger; and the fourth root of the vacuum energ...

  11. Gravitationally Consistent Halo Catalogs and Merger Trees for Precision Cosmology

    Science.gov (United States)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  12. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  13. Designing the Urban Microclimate. A framework for a design-decision support tool for the dissemination of knowledge on the urban microclimate to the urban design process

    Directory of Open Access Journals (Sweden)

    Marjolein Pijpers-van Esch

    2015-06-01

    Full Text Available This doctoral thesis presents research on the integration and transfer of knowledge from the specialized field of urban microclimatology into the generic field of urban design. Both fields are studied in order to identify crosslinks and reveal gaps. The main research question is: How can the design of urban neighbourhoods contribute to microclimates that support physical well-being, and what kind of information and form of presentation does the urban designer need in order to make design decisions regarding such urban microclimates? This question consists of two parts, which are addressed separately in the first two parts of the dissertation. Part 1 concerns an assessment of relevant knowledge on urban design by literature review, followed by a field study into the use of expert information in the urban design process. Part 2 discusses the influence of the urban environment on its microclimate and, consequently, the living quality of its inhabitants, both by means of literature review. Combined, Parts 1 and 2 serve as a basis for a framework for a design-decision support tool, which is discussed in Part 3. This tool is proposed as a means to integrate knowledge of the urban microclimate into the urban design process, bridging an observed gap. Urban design is concerned with shaping the physical environment to facilitate urban life in all its aspects. This is a complex task, which requires the integration and translation of different stakeholder interests into a proposition for the realization of physical-spatial constructs in the urban environment. Such a proposition comprises different planning elements in the following categories: spatial-functional organization, city plan, public space design and rules for architecture. During the design process, the urban designer has to deal with incomplete, often contradictory and/or changing constraints and quality demands as well as other uncertainties. He/she handles this complexity by

  14. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    Science.gov (United States)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  15. A Dynamical Mechanism for Large Volumes with Consistent Couplings

    CERN Document Server

    Abel, Steven

    2016-01-01

    A mechanism for addressing the 'decompactification problem' is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks and allow one to follow soft-terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is nett Bose-Fermi degeneracy in the massless sector. In such cases, because th...

  16. Consistent group selection in high-dimensional linear regression

    CERN Document Server

    Wei, Fengrong; 10.3150/10-BEJ252

    2010-01-01

    In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selects a model whose dimension is comparable with the underlying model with high probability and is estimation consistent. However, the group Lasso is, in general, not selection consistent and also tends to select groups that are not important in the model. To improve the selection results, we propose an adaptive group Lasso method which is a generalization of the adaptive Lasso and requires an initial estimator. We show that the adaptive group Lasso is consistent in group selection under certain conditions if the group Lasso is used as the initial estimator.
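
    As a rough illustration of the two-step idea (our own sketch, not the authors' estimator or regularity conditions), the following fits a group Lasso by proximal gradient with block soft-thresholding, then refits with data-driven group weights in the spirit of the adaptive group Lasso:

    ```python
    # Group Lasso by proximal gradient, plus an adaptive second pass whose
    # group weights are the inverse norms of the initial estimate.
    import numpy as np

    def group_lasso(X, y, groups, lam, weights=None, iters=3000):
        n, p = X.shape
        w = np.ones(len(groups)) if weights is None else weights
        beta = np.zeros(p)
        step = 1.0 / np.linalg.norm(X, 2) ** 2        # conservative step size
        for _ in range(iters):
            z = beta - step * (X.T @ (X @ beta - y) / n)
            for g, idx in enumerate(groups):          # blockwise soft-threshold
                norm_g = np.linalg.norm(z[idx])
                shrink = max(0.0, 1.0 - step * lam * w[g] / norm_g) if norm_g else 0.0
                beta[idx] = shrink * z[idx]
        return beta

    rng = np.random.default_rng(0)
    n, groups = 100, [range(0, 3), range(3, 6), range(6, 9)]
    X = rng.normal(size=(n, 9))
    beta_true = np.r_[1.0, -1.0, 0.5, np.zeros(6)]    # only group 0 is relevant
    y = X @ beta_true + 0.3 * rng.normal(size=n)

    beta0 = group_lasso(X, y, groups, lam=0.1)        # initial group Lasso fit
    w = 1.0 / np.maximum([np.linalg.norm(beta0[idx]) for idx in groups], 1e-8)
    beta_ad = group_lasso(X, y, groups, lam=0.1, weights=w)
    print("selected groups:", [g for g, idx in enumerate(groups)
                               if np.linalg.norm(beta_ad[idx]) > 1e-6])
    ```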

  17. Lightness constancy through transparency: internal consistency in layered surface representations.

    Science.gov (United States)

    Singh, Manish

    2004-01-01

    Asymmetric lightness matching was employed to measure how the visual system assigns lightness to surface patches seen through partially-transmissive surfaces. Observers adjusted the luminance of a comparison patch seen through transparency, in order to match the lightness of a standard patch seen in plain view. Plots of matched-to-standard luminance were linear, and their slopes were consistent with Metelli's alpha. A control experiment confirmed that these matches were indeed transparency based. Consistent with recent results, however, when observers directly matched the transmittance of transparent surfaces, their matches deviated strongly and systematically from Metelli's alpha. Although the two sets of results appear to be contradictory, formal analysis reveals a deeper mutual consistency in the representation of the two layers. A ratio-of-contrasts model is shown to explain both the success of Metelli's model in predicting lightness through transparency, and its failure to predict perceived transmittance--and hence is seen to play the primary role in perceptual transparency.

  18. A Consistent Semantics of Self-Adjusting Computation

    CERN Document Server

    Acar, Umut A; Donham, Jacob

    2011-01-01

    This paper presents a semantics of self-adjusting computation and proves that the semantics is correct and consistent. The semantics integrates change propagation with the classic idea of memoization to enable reuse of computations under mutation to memory. During evaluation, reuse of a computation via memoization triggers a change propagation that adjusts the reused computation to reflect the mutated memory. Since the semantics integrates memoization and change propagation, it involves both non-determinism (due to memoization) and mutation (due to change propagation). Our consistency theorem states that the non-determinism is not harmful: any two evaluations of the same program starting at the same state yield the same result. Our correctness theorem states that mutation is not harmful: self-adjusting programs are consistent with purely functional programming. We formalize the semantics and its meta-theory in the LF logical framework and machine-check our proofs using Twelf.

  19. Consistency and Reconciliation Model In Regional Development Planning

    Directory of Open Access Journals (Sweden)

    Dina Suryawati

    2016-10-01

    Full Text Available The aim of this study was to identify the problems and determine the conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models that were successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that is well-integrated with central planning and inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves a technocratic system, that is, both top-down and bottom-up systems of participation. Both must be balanced, and must not overlap or dominate each other. Keywords: regional, development, planning, consistency, reconciliation.

  20. The consistency approach for the quality control of vaccines.

    Science.gov (United States)

    Hendriksen, Coenraad; Arciniega, Juan L; Bruckner, Lukas; Chevalier, Michel; Coppens, Emmanuelle; Descamps, Johan; Duchêne, Michel; Dusek, David Michael; Halder, Marlies; Kreeftenberg, Hans; Maes, Alexandrine; Redhead, Keith; Ravetkar, Satish D; Spieser, Jean-Marc; Swam, Hanny

    2008-01-01

    Current lot release testing of conventional vaccines emphasizes quality control of the final product and is characterized by its extensive use of laboratory animals. This report, which is based on the outcome of an ECVAM (European Centre for Validation of Alternative Methods, Institute for Health and Consumer Protection, European Commission Joint Research Centre, Ispra, Italy) workshop, discusses the concept of consistency testing as an alternative approach for lot release testing. The consistency approach for the routine release of vaccines is based upon the principle that the quality of vaccines is a consequence of a quality system and of consistent production of lots with similar characteristics to those lots that have been shown to be safe and effective in humans or the target species. The report indicates why and under which circumstances this approach can be applied, the role of the different stakeholders, and the need for international harmonization. It also gives recommendations for its implementation.

  1. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges...

  2. Model-Consistent Sparse Estimation through the Bootstrap

    CERN Document Server

    Bach, Francis

    2009-01-01

    We consider the least-square linear regression problem with regularization by the $\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
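
    The bootstrap-intersection procedure is easy to sketch. The toy version below (ours, using scikit-learn's Lasso with an arbitrary fixed penalty rather than the paper's rate conditions) keeps only the variables selected in every bootstrap replication:

    ```python
    # Bolasso-style variable selection: intersect Lasso supports over
    # bootstrap resamples of the data.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.normal(size=(n, p))
    beta = np.r_[2.0, -1.5, 1.0, np.zeros(p - 3)]     # 3 relevant variables
    y = X @ beta + rng.normal(size=n)

    support = set(range(p))
    for _ in range(64):                                # bootstrap replications
        idx = rng.integers(0, n, size=n)
        fit = Lasso(alpha=0.1).fit(X[idx], y[idx])
        support &= {j for j in range(p) if abs(fit.coef_[j]) > 1e-10}

    print("Bolasso support:", sorted(support))         # expected: [0, 1, 2]
    ```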

  3. Consistency of the group Lasso and multiple kernel learning

    CERN Document Server

    Bach, Francis

    2007-01-01

    We consider the least-square regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm where all spaces have dimension one, where it is commonly referred to as the Lasso. In this paper, we study the asymptotic model consistency of the group Lasso. We derive necessary and sufficient conditions for the consistency of group Lasso under practical assumptions, such as model misspecification. When the linear predictors and Euclidean norms are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for non-linear variable selection. Using tools from functional analysis, and in particular covariance operators, we extend the consistency results to this infinite dimensional case and also propose an adaptive scheme to obt...

  4. The consistent histories approach to loop quantum cosmology

    CERN Document Server

    Craig, David A

    2016-01-01

    We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on "measurements" to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories -- as determined by the system's decoherence functional -- that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with ...

  5. One-particle-irreducible consistency relations for cosmological perturbations

    CERN Document Server

    Goldberger, Walter D; Nicolis, Alberto

    2013-01-01

    We derive consistency relations for correlators of scalar cosmological perturbations which hold in the "squeezed limit" in which one or more of the external momenta become soft. Our results are formulated as relations between suitably defined one-particle irreducible N-point and (N-1)-point functions that follow from residual spatial conformal diffeomorphisms of the unitary gauge Lagrangian. As such, some of these relations are exact to all orders in perturbation theory, and do not rely on approximate de Sitter invariance or other dynamical assumptions (e.g., properties of the operator product expansion or the behavior of modes at horizon crossing). The consistency relations apply model-independently to cosmological scenarios where the time evolution is driven by a single scalar field. Besides reproducing the known results for single-field inflation in the slow roll limit, we verify that our consistency relations hold more generally, for instance in ghost condensate models in flat space. We comment on possible...

  6. Multiscale Parameter Regionalization for consistent global water resources modelling

    Science.gov (United States)

    Wanders, Niko; Wood, Eric; Pan, Ming; Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Sutanudjaja, Edwin; van Beek, Rens; Bierkens, Marc F. P.

    2017-04-01

    Due to an increasing demand for high- and hyper-resolution water resources information, it has become increasingly important to ensure consistency in model simulations across scales. This consistency can be ensured by scale independent parameterization of the land surface processes, even after calibration of the water resource model. Here, we use the Multiscale Parameter Regionalization technique (MPR, Samaniego et al. 2010, WRR) to allow for a novel, spatially consistent, scale independent parameterization of the global water resource model PCR-GLOBWB. The implementation of MPR in PCR-GLOBWB allows for calibration at coarse resolutions and subsequent parameter transfer to the hyper-resolution. In this study, the model was calibrated at 50 km resolution over Europe and validation carried out at resolutions of 50 km, 10 km and 1 km. MPR allows for a direct transfer of the calibrated transfer function parameters across scales and we find that we can maintain consistent land-atmosphere fluxes across scales. Here we focus on the 2003 European drought and show that the new parameterization allows for high-resolution calibrated simulations of water resources during the drought. For example, we find a reduction from 29% to 9.4% in the percentile difference in the annual evaporative flux across scales when compared against default simulations. Soil moisture errors are reduced from 25% to 6.9%, clearly indicating the benefits of the MPR implementation. This new parameterization allows us to show more spatial detail in water resources simulations that are consistent across scales and also allow validation of discharge for smaller catchments, even with calibrations at a coarse 50 km resolution. The implementation of MPR allows for novel high-resolution calibrated simulations of a global water resources model, providing calibrated high-resolution model simulations with transferred parameter sets from coarse resolutions. The applied methodology can be transferred to other
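
    The core MPR idea, regionalize at the finest attribute resolution through transfer functions with global coefficients and then upscale the resulting parameter fields to whatever model grid is needed, can be caricatured in a few lines. Everything below (the log-linear pedotransfer function, its coefficients, the grids) is an invented illustration, not PCR-GLOBWB or mHM code:

    ```python
    # Schematic MPR-style parameterization: transfer function on the fine
    # grid first, then block-average upscaling to the model resolution.
    import numpy as np

    rng = np.random.default_rng(0)
    clay = rng.uniform(10, 60, size=(64, 64))      # fine-grid attribute, e.g. clay %

    def transfer(attr, a, b):
        """Hypothetical pedotransfer function: attribute -> porosity-like parameter."""
        return a + b * np.log(attr)

    def upscale(field, factor):
        """Arithmetic-mean block averaging onto a coarser model grid."""
        h, w = field.shape
        return field.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

    a, b = 0.2, 0.05                                # global coefficients, calibrated once
    param_fine = transfer(clay, a, b)               # parameterize at the fine scale...
    param_coarse = upscale(param_fine, 8)           # ...then upscale to the model grid
    print(param_fine.mean(), param_coarse.mean())   # identical means across scales
    ```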

  7. Neighborhood consistency in mental arithmetic: Behavioral and ERP evidence

    Directory of Open Access Journals (Sweden)

    Verguts Tom

    2007-12-01

    Full Text Available Abstract Background: Recent cognitive and computational models (e.g. the Interacting Neighbors Model) state that in simple multiplication the decade and unit digits of the candidate answers (including the correct result) are represented separately. Thus, these models challenge holistic views of number representation as well as traditional accounts of the classical problem size effect in simple arithmetic (i.e. the finding that large problems are answered more slowly and less accurately than small problems). Empirical data supporting this view are still scarce. Methods: Data of 24 participants who performed a multiplication verification task with Arabic digits (e.g. 8 × 4 = 36 - true or false?) are reported. Behavioral (i.e. RT and errors) and EEG (i.e. ERP) measures were recorded in parallel. Results: We provide evidence for neighborhood-consistency effects in the verification of simple multiplication problems (e.g. 8 × 4). Behaviorally, we find that decade-consistent lures, which share their decade digit with the correct result (e.g. 36), are harder to reject than matched inconsistent lures, which differ in both digits from the correct result (e.g. 28). This neighborhood consistency effect in product verification is similar to recent observations in the production of multiplication results. With respect to event-related potentials we find significant differences for consistent compared to inconsistent lures in the N400 (increased negativity) and the Late Positive Component (reduced positivity). In this respect, consistency effects in our paradigm resemble lexico-semantic effects earlier found in simple arithmetic and in orthographic input processing. Conclusion: Our data suggest that neighborhood consistency effects in simple multiplication stem at least partly from central ('lexico-semantic') stages of processing. These results are compatible with current models of the representation of simple multiplication facts, in particular with the Interacting Neighbors Model.

  8. Agent-Based Context Consistency Management in Smart Space Environments

    Science.gov (United States)

    Jih, Wan-Rong; Hsu, Jane Yung-Jen; Chang, Han-Wen

    Context-aware systems in smart space environments must be aware of the context of their surroundings and adapt to changes in highly dynamic environments. Data management of contextual information differs from traditional approaches because contextual information is dynamic, transient, and fallible in nature. Consequently, the capability to detect context inconsistency and the capability to maintain consistent contextual information are two key issues for context management. We propose an ontology-based model for representing, deducing, and managing consistent contextual information. In addition, we use ontology reasoning to detect and resolve context inconsistency problems, which we illustrate in a Smart Alarm Clock scenario.

  9. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)]

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. This method is then applied to examine consistency among the measured decay heat powers of {sup 232}Th, {sup 233}U, {sup 235}U, {sup 238}U and {sup 239}Pu at YAYOI. The consistency among the measured values is found to be satisfactory for the {beta} component and fairly good for the {gamma} component, except for cooling times longer than 4000 s. (author)

  10. The consistency service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2011-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  11. The Consistency Service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2010-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures, is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  12. Towards consistent nuclear models and comprehensive nuclear data evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, O [Los Alamos National Laboratory]; Hale, G M [Los Alamos National Laboratory]; Lynn, J E [Los Alamos National Laboratory]; Talou, P [Los Alamos National Laboratory]; Bernard, D [FRANCE]; Litaize, O [FRANCE]; Noguere, G [FRANCE]; De Saint Jean, C [FRANCE]; Serot, O [FRANCE]

    2010-01-01

    The essence of this paper is to highlight the consistency achieved nowadays in nuclear data and uncertainty assessments in terms of compound nucleus reaction theory, from the neutron separation energy to the continuum. By establishing continuity among the theories used in the resolved resonance (R-matrix theory), unresolved resonance (average R-matrix theory) and continuum (optical model) ranges through a generalization of the so-called SPRT method, consistent average parameters are extracted from observed measurements, and associated covariances are therefore calculated over the whole energy range. This paper recalls, in particular, recent advances in fission cross section calculations and suggests some hints for future developments.

  13. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens;

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....

  14. Remark on the Consistent Gauge Anomaly in Supersymmetric Theories

    CERN Document Server

    Ohshima, Y; Suzuki, H; Yasuta, H; Ohshima, Yoshihisa; Okuyama, Kiyoshi; Suzuki, Hiroshi; Yasuta, Hirofumi

    1999-01-01

    We present a direct field theoretical calculation of the consistent gauge anomaly in the superfield formalism, on the basis of a definition of the effective action through the covariant gauge current. The scheme is conceptually and technically simple and the gauge covariance in intermediate steps reduces calculational labors considerably. The resultant superfield anomaly, being proportional to the anomaly $d^{abc}=\mathrm{tr}\,T^a\{T^b,T^c\}$, is minimal even without supplementing any counterterms. Our anomaly coincides with the anomaly obtained by Marinković as the solution of the Wess-Zumino consistency condition.

  15. A Van Atta reflector consisting of half-wave dipoles

    DEFF Research Database (Denmark)

    Appel-Hansen, Jørgen

    1966-01-01

    The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations...... of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...

  16. HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES

    Institute of Scientific and Technical Information of China (English)

    Chunhui Zhang; Menghua Qin

    2004-01-01

    The mechanical and chemical effect on the pulping properties of the old newsprint was studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and the flotation conditions such as velocity of air flow, air pressure and flotation time were also discussed with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide and deinking agent are the key factors in the chemical effect, and pulping consistency is more important than pulping time and rotation speed in the mechanical effect during the high consistency pulping of the ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.

  17. Island of Stability for Consistent Deformations of Einstein's Gravity

    CERN Document Server

    Berkhahn, Felix; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin

    2011-01-01

    We construct explicitly deformations of Einstein's theory of gravity that are consistent and phenomenologically viable since they respect, in particular, cosmological backgrounds. We show that these deformations have unique symmetries in accordance with unitarity requirements, and give rise to a curvature induced self-stabilizing mechanism. As a consequence, any nonlinear completed deformation must incorporate self-stabilization on generic spacetimes already at lowest order in perturbation theory. Furthermore, our findings include the possibility of consistent and phenomenologically viable deformations of general relativity that are solely operative on curved spacetime geometries, reducing to Einstein's theory on the Minkowski background.

  18. Quantum monadology: a consistent world model for consciousness and physics.

    Science.gov (United States)

    Nakagomi, Teruaki

    2003-04-01

    The NL world model presented in the previous paper is embodied by use of relativistic quantum mechanics, which reveals the significance of the reduction of quantum states and the relativity principle, and locates consciousness and the concept of flowing time consistently in physics. This model provides a consistent framework to solve apparent incompatibilities between consciousness (as our interior experience) and matter (as described by quantum mechanics and relativity theory). Does matter have an inside? What is the flowing time now? Does physics allow the indeterminism by volition? The problem of quantum measurement is also resolved in this model.

  19. The cluster bootstrap consistency in generalized estimating equations

    KAUST Repository

    Cheng, Guang

    2013-03-01

    The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
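
    A minimal sketch of the resampling step may help fix ideas: whole clusters are drawn with replacement and the estimator is refit on the stacked rows, so the within-cluster dependence is preserved. Ordinary least squares stands in below for the GEE fit, and the data names are hypothetical:

        import numpy as np

        rng = np.random.default_rng(0)

        def cluster_bootstrap_se(X, y, cluster_ids, n_boot=1000):
            # Resample whole clusters, not individual rows, then refit the
            # (stand-in) estimating equation and collect the slope estimates.
            clusters = np.unique(cluster_ids)
            slopes = np.empty(n_boot)
            for b in range(n_boot):
                picked = rng.choice(clusters, size=clusters.size, replace=True)
                rows = np.concatenate(
                    [np.flatnonzero(cluster_ids == c) for c in picked])
                beta, *_ = np.linalg.lstsq(X[rows], y[rows], rcond=None)
                slopes[b] = beta[1]  # X is assumed to carry an intercept column first
            return slopes.std(ddof=1)  # bootstrap standard error of the slope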

  20. Consistent Deformed Bosonic Algebra in Noncommutative Quantum Mechanics

    CERN Document Server

    Zhang, Jian-Zu

    2009-01-01

    In two-dimensional noncommutative space for the case of both position - position and momentum - momentum noncommuting, the consistent deformed bosonic algebra at the non-perturbation level described by the deformed annihilation and creation operators is investigated. A general relation between noncommutative parameters is fixed from the consistency of the deformed Heisenberg - Weyl algebra with the deformed bosonic algebra. A Fock space is found, in which all calculations can be similarly developed as if in commutative space and all effects of spatial noncommutativity are simply represented by parameters.

  1. Consistent Deformed Bosonic Algebra in Noncommutative Quantum Mechanics

    Science.gov (United States)

    Zhang, Jian-Zu

    In two-dimensional noncommutative space for the case of both position-position and momentum-momentum noncommuting, the consistent deformed bosonic algebra at the nonperturbation level described by the deformed annihilation and creation operators is investigated. A general relation between noncommutative parameters is fixed from the consistency of the deformed Heisenberg-Weyl algebra with the deformed bosonic algebra. A Fock space is found, in which all calculations can be similarly developed as if in commutative space and all effects of spatial noncommutativity are simply represented by parameters.

  2. HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES

    Institute of Scientific and Technical Information of China (English)

    Chunhui Zhang; Menghua Qin

    2004-01-01

    The mechanical and chemical effect on the pulping properties of the old newsprint was studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and the flotation conditions such as velocity of air flow, air pressure and flotation time were also discussed with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide and deinking agent are the key factors in the chemical effect, and pulping consistency is more important than pulping time and rotation speed in the mechanical effect during the high consistency pulping of the ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.

  3. Dynamically Consistent Nonlinear Evaluations with Their Generating Functions in Lp

    Institute of Scientific and Technical Information of China (English)

    Feng HU

    2013-01-01

    In this paper, we study dynamically consistent nonlinear evaluations in Lp (1 < p < 2). One of our aims is to obtain the following result: under a domination condition, an Ft-consistent evaluation is an ℰg-evaluation in Lp. Furthermore, without the assumption that the generating function g(t, ω, y, z) is continuous with respect to t, we provide some useful characterizations of an ℰg-evaluation by g and give some applications. These results include and extend some existing results.

  4. Consistency of Social Sensing Signatures Across Major US Cities

    CERN Document Server

    Soliman, Aiman; Padmanabhan, Anand; Wang, Shaowen

    2016-01-01

    Previous studies have shown that Twitter users have biases to tweet from certain locations (locational bias) and during certain hours (temporal bias). We used three years of geolocated Twitter data to quantify these biases and to test our central hypothesis that Twitter users' biases are consistent across US cities. Our results suggest that the temporal and locational biases of Twitter users are inconsistent between three US metropolitan cities. We derive conclusions about the role of the complexity of the underlying data-producing process on its consistency and argue for a potential research avenue for Geospatial Data Science to test and quantify these inconsistencies in the class of organically evolved Big Data.

  5. Designing Material Materialising Design

    DEFF Research Database (Denmark)

    Nicholas, Paul

    2013-01-01

    Designing Material Materialising Design documents five projects developed at the Centre for Information Technology and Architecture (CITA) at the Royal Danish Academy of Fine Arts, School of Architecture. These projects explore the idea that new designed materials might require new design methods...

  6. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…

  7. Consistency Within Diversity: Guidelines for Programs to Honor Exemplary Teaching.

    Science.gov (United States)

    Svinicki, Marilla D.; Menges, Robert J.

    1996-01-01

    Good programs for recognizing exemplary college teaching are consistent with institutional mission and values, are grounded in research-based competencies and practices, recognize all significant facets of instruction, reward both collaborative and individual achievements, neither preclude nor replace the institutional reward system, call on those…

  8. Weakly time consistent concave valuations and their dual representations

    NARCIS (Netherlands)

    Roorda, Berend; Schumacher, Johannes M.

    2016-01-01

    We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal representations.

  9. A Consistent Procedure for Pseudo-Component Delumping

    DEFF Research Database (Denmark)

    Leibovici, Claude; Stenby, Erling Halfdan; Knudsen, Kim

    1996-01-01

    Thereby infinite dilution K-values can be obtained exactly without any further computation. Based on these results, a consistent procedure for the estimation of equilibrium constants in the more classical cases of finite dilution has been developed. It can be used when moderate binary interaction parameters

  10. Weakly time consistent concave valuations and their dual representations

    NARCIS (Netherlands)

    Roorda, B.; Schumacher, Hans

    2016-01-01

    We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal representations.

  11. Hippocampography Guides Consistent Mesial Resections in Neocortical Temporal Lobe Epilepsy

    Directory of Open Access Journals (Sweden)

    Marcus C. Ng

    2016-01-01

    Full Text Available Background. The optimal surgery in lesional neocortical temporal lobe epilepsy is unknown. Hippocampal electrocorticography maximizes seizure freedom by identifying normal-appearing epileptogenic tissue for resection and minimizes neuropsychological deficit by limiting resection to demonstrably epileptogenic tissue. We examined whether standardized hippocampal electrocorticography (hippocampography) guides resection for more consistent hippocampectomy than unguided resection in conventional electrocorticography focused on the lesion. Methods. Retrospective chart review of any kind of electrocorticography (including hippocampography) as part of combined lesionectomy, anterolateral temporal lobectomy, and hippocampectomy over 8 years. Patients were divided into mesial (i.e., hippocampography) and lateral electrocorticography groups. Primary outcome was deviation from mean hippocampectomy length. Results. Of 26 patients, fourteen underwent hippocampography-guided mesial temporal resection. Hippocampography was associated with 2.6 times more consistent resection. The range of hippocampal resection was 0.7 cm in the mesial group and 1.8 cm in the lateral group (p=0.01). 86% of mesial group versus 42% of lateral group patients achieved seizure freedom (p=0.02). Conclusions. By rationally tailoring excision to demonstrably epileptogenic tissue, hippocampography significantly reduces resection variability for more consistent hippocampectomy than unguided resection in conventional electrocorticography. More consistent hippocampal resection may avoid overresection, which poses greater neuropsychological risk, and underresection, which jeopardizes postoperative seizure freedom.

  12. Discrete anomalies in supergravity and consistency of string backgrounds

    Science.gov (United States)

    Minasian, Ruben; Sasmal, Soumya; Savelli, Raffaele

    2017-02-01

    We examine SL(2, ℤ) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important rôle in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.

  13. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  14. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    arose originally from the geneticists' need to filter their input data from erroneous information, and is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even in the presence of three alleles. Several other results...

  15. An algebraic method for constructing stable and consistent autoregressive filters

    Energy Technology Data Exchange (ETDEWEB)

    Harlim, John, E-mail: jharlim@psu.edu [Department of Mathematics, the Pennsylvania State University, University Park, PA 16802 (United States); Department of Meteorology, the Pennsylvania State University, University Park, PA 16802 (United States); Hong, Hoon, E-mail: hong@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Robbins, Jacob L., E-mail: jlrobbi3@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States)

    2015-02-15

    In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR-models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
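
    The classical AR stability condition the authors invoke is easy to state concretely: an AR(p) model y_t = a_1 y_{t-1} + ... + a_p y_{t-p} + noise is stable when all roots of its characteristic polynomial lie strictly inside the unit circle. A small check under that textbook convention (the paper's order-two consistency constraints are not reproduced here) might look as follows:

        import numpy as np

        def ar_is_stable(a):
            # Roots of z^p - a_1 z^(p-1) - ... - a_p; stable iff all roots
            # lie strictly inside the unit circle.
            char_poly = np.concatenate(([1.0], -np.asarray(a, dtype=float)))
            return bool(np.all(np.abs(np.roots(char_poly)) < 1.0))

        print(ar_is_stable([0.5, 0.3]))  # True: stationary AR(2)
        print(ar_is_stable([1.2, 0.1]))  # False: explosive AR(2)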

  16. Consistency of the Takens estimator for the correlation dimension

    NARCIS (Netherlands)

    Borovkova, S; Burton, R; Dehling, H

    1999-01-01

    Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We

  17. Usability problem reports for comparative studies: consistency and inspectability

    NARCIS (Netherlands)

    Vermeeren, A.P.O.S.; Attema, J.; Akar, E.; De Ridder, H.; Van Doorn, A.J.; Erburg, Ç.; Berkman, A.E.; Maguire, M.

    2008-01-01

    This study explores issues of consistency and inspectability in usability test data analysis processes and reports. Problem reports resulting from usability tests performed by three professional usability labs in three different countries are compared. Each of the labs conducted a usability test on

  18. Body saccades of Drosophila consist of stereotyped banked turns

    NARCIS (Netherlands)

    Muijres, F.T.; Elzinga, M.J.; Iwasaki, N.A.; Dickinson, M.H.

    2015-01-01

    The flight pattern of many fly species consists of straight flight segments interspersed with rapid turns called body saccades, a strategy that is thought to minimize motion blur. We analyzed the body saccades of fruit flies (Drosophila hydei), using high-speed 3D videography to track body and wing

  19. Assessing atmospheric bias correction for dynamical consistency using potential vorticity

    Science.gov (United States)

    Rocheta, Eytan; Evans, Jason P.; Sharma, Ashish

    2014-12-01

    Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases, as dynamical consistency between the ‘corrected’ fields is not maintained. Use of these bias corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency—the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques—an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exist between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications.
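
    For concreteness, a widely used bias-correction kernel is empirical quantile mapping. Applied variable-by-variable it corresponds to the first approach discussed above; applied to the PV field itself it corresponds to the second. The sketch below is a generic illustration of the kernel, not the authors' implementation:

        import numpy as np

        def quantile_map(model_ref, obs_ref, values):
            # Map model values onto the observed distribution by matching
            # empirical quantiles estimated from the reference periods.
            q = np.linspace(0.0, 1.0, 101)
            model_quantiles = np.quantile(model_ref, q)
            obs_quantiles = np.quantile(obs_ref, q)
            return np.interp(values, model_quantiles, obs_quantiles)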

  20. Discrete anomalies in supergravity and consistency of string backgrounds

    CERN Document Server

    Minasian, Ruben; Savelli, Raffaele

    2016-01-01

    We examine SL(2, Z) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important role in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.

  1. New sequential quadratic programming algorithm with consistent subproblems

    Institute of Scientific and Technical Information of China (English)

    贺国平; 高自友; 赖炎连

    1997-01-01

    One of the most interesting topics related to sequential quadratic programming (SQP) algorithms is how to guarantee the consistency of all quadratic programming subproblems. In this decade, much work has been done on changing the form of the constraints to obtain consistency of the subproblems. The method proposed by De O. Pantoja and coworkers solves the consistency problem of the SQP method and is the best to the authors' knowledge. However, the scale and complexity of the subproblems in De O. Pantoja's work increase greatly, since all equality constraints have to be changed into absolute-value form. A new sequential quadratic programming type algorithm is presented by means of a special ε-active set scheme and a special penalty function. The subproblems of the new algorithm are all consistent, and the form of the constraints of the subproblems is as simple as that of general SQP type algorithms. It can be proved that the new method retains global convergence and local superlinear convergence.

  2. Personalities in great tits, Parus major : stability and consistency

    NARCIS (Netherlands)

    Carere, C; Drent, Piet J.; Privitera, Lucia; Koolhaas, Jaap M.; Groothuis, TGG

    2005-01-01

    We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations

  3. Self-Consistence of Semi-Classical Gravity

    CERN Document Server

    Suen, W M

    1992-01-01

    Simon argued that the semi-classical theory of gravity, unless with some of its solutions excluded, is unacceptable for reasons of both self-consistency and experiment, and that it has to be replaced by a constrained semi-classical theory. We examined whether the evidence is conclusive.

  4. SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.

    Science.gov (United States)

    MORSE, STANLEY J.; GERGEN, KENNETH J.

    TO DISCOVER HOW A PERSON'S (P) SELF-CONCEPT IS AFFECTED BY THE CHARACTERISTICS OF ANOTHER (O) WHO SUDDENLY APPEARS IN THE SAME SOCIAL ENVIRONMENT, SEVERAL QUESTIONNAIRES, INCLUDING THE GERGEN-MORSE (1967) SELF-CONSISTENCY SCALE AND HALF THE COOPERSMITH SELF-ESTEEM INVENTORY, WERE ADMINISTERED TO 78 UNDERGRADUATE MEN WHO HAD ANSWERED AN AD FOR WORK…

  5. Plant functional traits have globally consistent effects on competition

    NARCIS (Netherlands)

    Kunstler, Georges; Falster, Daniel; Coomes, David A.; Poorter, Lourens

    2016-01-01

    Phenotypic traits and their associated trade-offs have been shown to have globally consistent effects on individual plant physiological functions, but how these effects scale up to influence competition, a key driver of community assembly in terrestrial vegetation, has remained unclear. Here we

  6. Fully self-consistent GW calculations for molecules

    DEFF Research Database (Denmark)

    Rostgaard, Carsten; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer

    2010-01-01

    We calculate single-particle excitation energies for a series of 34 molecules using fully self-consistent GW, one-shot G0W0, Hartree-Fock (HF), and hybrid density-functional theory (DFT). All calculations are performed within the projector-augmented wave method using a basis set of Wannier...

  7. On ZRP wind input term consistency in Hasselmann equation

    CERN Document Server

    Zakharov, Vladimir; Pushkarev, Andrei

    2016-01-01

    The new ZRP wind input source term (Zakharov et al. 2012) is checked for its consistency via numerical simulation of Hasselmann equation. The results are compared to field experimental data, collected at different sites around the world, and theoretical predictions of self-similarity analysis. Good agreement is obtained for limited fetch and time domain statements

  8. Consistency in behavior of the CEO regarding corporate social responsibility

    NARCIS (Netherlands)

    Elving, W.J.L.; Kartal, D.

    2012-01-01

    Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was

  9. An Intuitionistic Epistemic Logic for Sequential Consistency on Shared Memory

    Science.gov (United States)

    Hirai, Yoichi

    In the celebrated Gödel Prize winning papers, Herlihy, Shavit, Saks and Zaharoglou gave topological characterization of waitfree computation. In this paper, we characterize waitfree communication logically. First, we give an intuitionistic epistemic logic k∨ for asynchronous communication. The semantics for the logic k∨ is an abstraction of Herlihy and Shavit's topological model. In the same way Kripke model for intuitionistic logic informally describes an agent increasing its knowledge over time, the semantics of k∨ describes multiple agents passing proofs around and developing their knowledge together. On top of the logic k∨, we give an axiom type that characterizes sequential consistency on shared memory. The advantage of intuitionistic logic over classical logic then becomes apparent as the axioms for sequential consistency are meaningless for classical logic because they are classical tautologies. The axioms are similar to the axiom type for prelinearity (ϕ ⊃ ψ) ∨ (ψ ⊃ ϕ). This similarity reflects the analogy between sequential consistency for shared memory scheduling and linearity for Kripke frames: both require total order on schedules or models. Finally, under sequential consistency, we give soundness and completeness between a set of logical formulas called waitfree assertions and a set of models called waitfree schedule models.

  10. Noncommuting Electric Fields and Algebraic Consistency in Noncommutative Gauge theories

    CERN Document Server

    Banerjee, R

    2003-01-01

    We show that noncommuting electric fields occur naturally in noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a hamiltonian generalisation of the Seiberg-Witten Map, the algebraic consistency in the lagrangian and hamiltonian formulations of these theories, is established. The stability of the Poisson algebra, under this generalised map, is studied.

  11. Efficient self-consistent quantum transport simulator for quantum devices

    Energy Technology Data Exchange (ETDEWEB)

    Gao, X., E-mail: xngao@sandia.gov; Mamaluy, D.; Nielsen, E.; Young, R. W.; Lilly, M. P.; Bishop, N. C.; Carroll, M. S.; Muller, R. P. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); Shirkhorshidian, A. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); University of New Mexico, Albuquerque, New Mexico 87131 (United States)

    2014-04-07

    We present a self-consistent one-dimensional (1D) quantum transport simulator based on the Contact Block Reduction (CBR) method, aiming for very fast and robust transport simulation of 1D quantum devices. Applying the general CBR approach to 1D open systems results in a set of very simple equations that are derived and given in detail for the first time. The charge self-consistency of the coupled CBR-Poisson equations is achieved by using the predictor-corrector iteration scheme with the optional Anderson acceleration. In addition, we introduce a new way to convert an equilibrium electrostatic barrier potential calculated from an external simulator to an effective doping profile, which is then used by the CBR-Poisson code for transport simulation of the barrier under non-zero biases. The code has been applied to simulate the quantum transport in a double barrier structure and across a tunnel barrier in a silicon double quantum dot. Extremely fast self-consistent 1D simulations of the differential conductance across a tunnel barrier in the quantum dot show better qualitative agreement with experiment than non-self-consistent simulations.
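
    The charge self-consistency loop described above is, at its core, a damped fixed-point iteration between the transport (predictor) and Poisson (corrector) solves. A generic sketch of such a loop with linear mixing follows; the solver callables, mixing factor, and tolerances are placeholders, and the optional Anderson acceleration mentioned in the abstract is omitted for brevity:

        import numpy as np

        def self_consistent(density_from_potential, potential_from_density,
                            phi0, mix=0.3, tol=1e-8, max_iter=500):
            # Damped fixed-point iteration: transport solve (predictor)
            # followed by a Poisson solve (corrector), with linear mixing
            # of the potential to stabilize convergence.
            phi = phi0
            for it in range(max_iter):
                rho = density_from_potential(phi)
                phi_new = potential_from_density(rho)
                if np.max(np.abs(phi_new - phi)) < tol:
                    return phi_new, it
                phi = (1.0 - mix) * phi + mix * phi_new
            raise RuntimeError("no self-consistency within max_iter")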

  12. Context-dependent individual behavioral consistency in Daphnia

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe

    2017-01-01

    , whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...

  13. FINITE DEFORMATION ELASTO-PLASTIC THEORY AND CONSISTENT ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Liu Xuejun; Li Mingrui; Huang Wenbin

    2001-01-01

    By using the logarithmic strain, the finite deformation plastic theory, corresponding to the infinitesimal plastic theory, is established successively. The plastic consistent algorithm with first order accuracy for the finite element method (FEM) is developed. Numerical examples are presented to illustrate the validity of the theory and effectiveness of the algorithm.

  14. A Nonparametric Approach to Estimate Classification Accuracy and Consistency

    Science.gov (United States)

    Lathrop, Quinn N.; Cheng, Ying

    2014-01-01

    When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…

  15. Evaluating Reflective Writing for Appropriateness, Fairness, and Consistency.

    Science.gov (United States)

    Kennison, Monica Metrick; Misselwitz, Shirley

    2002-01-01

    Samples from 17 reflective journals of nursing students were evaluated by 6 faculty. Results indicate a lack of consistency in grading reflective writing, lack of consensus regarding evaluation, and differences among faculty regarding their view of such exercises. (Contains 26 references.) (JOW)

  16. Improving consistency in student evaluation at affiliated family practice centers.

    Science.gov (United States)

    Rabinowitz, H K

    1986-01-01

    The Department of Family Medicine at Jefferson Medical College has since 1974 been successful in administering a required third-year family medicine clerkship, providing students with a structured, didactic, and experiential curriculum in six affiliated family practice centers. Prior analysis (1976-1981) had indicated, however, that variation existed in evaluating similar students, depending on the clerkship training site, i.e., three sites graded students in a significantly different fashion than the three other sites. Utilizing these data to focus on the evaluation process, a comprehensive and specific six-point plan was developed to improve consistency in evaluations at the different training sites. This plan consisted of a yearly meeting of affiliate faculty, assigning predoctoral training administrative responsibility to one faculty member at each training site, increased telephone communication, affiliate-faculty attendance at the university site evaluation session, faculty rotation to spend time at other training sites, and financial reimbursement to the affiliate training sites. After intervention, analysis (1981-1983) indicated that five of the six clerkship sites now grade students in a consistent fashion, with only one affiliate using different grading standards. The intervention was therefore judged to be successful for five of the six training sites, allowing for better communication and more critical and consistent evaluation of medical students.

  17. [Consistent presentation of medical images based on CPI integration profile].

    Science.gov (United States)

    Jiang, Tao; An, Ji-ye; Chen, Zhong-yong; Lu, Xu-dong; Duan, Hui-long

    2007-11-01

    Because of different display parameters and other factors, digital medical images present different display states in different departments of a hospital. Based on the CPI integration profile of IHE, this paper implements consistent presentation of medical images, which helps doctors carry out medical treatment as a team.

  18. Measures of Consistency for Holland-Type Codes.

    Science.gov (United States)

    Strahan, Robert F.

    1987-01-01

    Describes two new measures of consistency which refer to the extent to which more closely related scale types are found together in Holland's Self-Directed Search sort. One measure is based on the hexagonal model for use with three-point codes. The other is based on conditional probabilities for use with two-point codes. (Author/ABL)

  19. Vlasov - Maxwell, Self-consistent Electromagnetic Wave Emission Simulations in the Solar Corona

    Science.gov (United States)

    Tsiklauri, David

    2010-12-01

    1.5D Vlasov–Maxwell simulations are employed to model electromagnetic emission generation in a fully self-consistent plasma kinetic model for the first time in the context of solar physics. The simulations mimic the plasma emission mechanism and Larmor-drift instability in a plasma thread that connects the Sun to Earth with the spatial scales compressed appropriately. The effects of spatial density gradients on the generation of electromagnetic radiation are investigated. It is shown that a 1.5D inhomogeneous plasma with a uniform background magnetic field directed transverse to the density gradient is aperiodically unstable to the Larmor-drift instability. The latter results in a novel effect of generation of electromagnetic emission at the plasma frequency. The generated perturbations consist of two parts: i) non-escaping (trapped) Langmuir-type oscillations, which are localised in the regions of density inhomogeneity and are highly filamentary, with the period of appearance of the filaments close to the electron plasma frequency in the dense regions; and ii) escaping electromagnetic radiation with phase speeds close to the speed of light. When the density gradient is removed (i.e., when the plasma becomes stable to the Larmor-drift instability) and a low-density, super-thermal, hot beam is injected along the domain, in the direction perpendicular to the magnetic field, the plasma emission mechanism generates non-escaping Langmuir-type oscillations, which in turn generate escaping electromagnetic radiation. It is found that in the spatial location where the beam is injected, standing waves, oscillating at the plasma frequency, are excited. These can be used to interpret the horizontal strips (the narrow-band line emission) observed in some dynamical spectra. Predictions of quasilinear theory are: i) the electron free streaming and ii) the long relaxation time of the beam, in accord with the analytic expressions. These are corroborated via direct, fully-kinetic simulation

  20. A quantum physical design flow using ILP and graph drawing

    Science.gov (United States)

    Yazdani, Maryam; Saheb Zamani, Morteza; Sedighi, Mehdi

    2013-10-01

    Implementing large-scale quantum circuits is one of the challenges of quantum computing. One of the central challenges of accurately modeling the architecture of these circuits is to schedule a quantum application and generate the layout while taking into account the cost of communications and classical resources as well as the maximum exploitable parallelism. In this paper, we present and evaluate a design flow for arbitrary quantum circuits in ion trap technology. Our design flow consists of two parts. First, a scheduler takes a description of a circuit and finds the best order for the execution of its quantum gates using integer linear programming regarding the classical resources (qubits) and instruction dependencies. Then a layout generator receives the schedule produced by the scheduler and generates a layout for this circuit using a graph-drawing algorithm. Our experimental results show that the proposed flow decreases the average latency of quantum circuits by about 11 % for a set of attempted benchmarks and by about 9 % for another set of benchmarks compared with the best in literature.
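
    A toy version of the scheduling step can be written as an integer linear program. The sketch below, using the PuLP modeling library, encodes only unit-duration gates and precedence constraints and minimizes the makespan; the actual flow described above additionally models qubit/resource exclusivity and communication costs, and the gate list here is invented:

        from pulp import LpMinimize, LpProblem, LpVariable

        gates = ["H0", "CX01", "T1", "CX12"]
        deps = [("H0", "CX01"), ("CX01", "T1"), ("CX01", "CX12")]

        prob = LpProblem("gate_schedule", LpMinimize)
        t = {g: LpVariable("t_" + g, lowBound=0, cat="Integer") for g in gates}
        makespan = LpVariable("makespan", lowBound=0, cat="Integer")

        for a, b in deps:      # a gate starts only after its predecessors end
            prob += t[b] >= t[a] + 1
        for g in gates:        # the makespan bounds every finish time
            prob += makespan >= t[g] + 1
        prob += makespan       # objective: minimize total circuit latency

        prob.solve()
        print({g: int(t[g].value()) for g in gates}, int(makespan.value()))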

  1. Consistent Data Assimilation of Isotopes: 242Pu and 105Pd

    Energy Technology Data Exchange (ETDEWEB)

    G. Palmiotti; H. Hiruta; M. Salvatores

    2012-09-01

    In this annual report we illustrate the methodology of consistent data assimilation, which allows one to use the information coming from integral experiments to improve the basic nuclear parameters used in cross section evaluation. A series of integral experiments are analyzed using the EMPIRE evaluated files for 242Pu and 105Pd. In particular, irradiation experiments (PROFIL-1 and -2, TRAPU-1, -2 and -3) provide information about capture cross sections, and a critical configuration, COSMO, where fission spectral indexes were measured, provides information about the fission cross section. The observed discrepancies between calculated and experimental results are used in conjunction with the computed sensitivity coefficients and the covariance matrix for nuclear parameters in a consistent data assimilation. The results obtained by the consistent data assimilation indicate that modest modifications of some key identified nuclear parameters suffice to obtain reasonable C/E. However, for some parameters such variations fall outside the range of 1σ of their initial standard deviation. This can indicate a possible conflict between differential measurements (used to calculate the initial standard deviations) and the integral measurements used in the statistical data adjustment. Moreover, an inconsistency between the C/E of two sets of irradiation experiments (PROFIL and TRAPU) is observed for 242Pu. This is the end of this project funded by the Nuclear Physics Program of the DOE Office of Science. We can indicate that a proof of principle has been demonstrated for a few isotopes for this innovative methodology. However, we are still far from having explored all the possibilities and made this methodology proven and robust. In particular, many issues are worth further investigation: • Non-linear effects • Flexibility of nuclear parameters in describing cross sections • Multi-isotope consistent assimilation • Consistency between differential and integral
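
    The adjustment underlying such an assimilation is typically a generalized linear least-squares (Kalman-type) update of the parameters from the C/E discrepancies, the sensitivity coefficients, and the two covariance matrices. The following is a textbook sketch of that update, not necessarily the exact scheme used in the report:

        import numpy as np

        def assimilate(p, P, S, d, V):
            # p: prior nuclear parameters;   P: prior parameter covariance
            # S: sensitivities dR/dp of the integral responses R
            # d: discrepancies C - E;        V: experimental covariance
            K = P @ S.T @ np.linalg.inv(S @ P @ S.T + V)  # gain matrix
            p_post = p - K @ d      # shift parameters to reduce C - E
            P_post = P - K @ S @ P  # posterior (reduced) covariance
            return p_post, P_post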

  2. Performance and consistency of indicator groups in two biodiversity hotspots.

    Directory of Open Access Journals (Sweden)

    Joaquim Trindade-Filho

    Full Text Available BACKGROUND: In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. METHODOLOGY/PRINCIPAL FINDINGS: We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent those indicator groups whose representation of target species was not statistically different between the BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. CONCLUSIONS/SIGNIFICANCE: We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest in order to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures a high diversity of endemic and endangered species.
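
    Finding "the best sets of sites able to maximize the representation" of a species group is a classic reserve-selection problem, and a greedy complementarity heuristic conveys the flavor of such algorithms. The exact optimization used in the study may differ, and the site/species data below are invented:

        def greedy_sites(site_species, targets):
            # Repeatedly pick the site that adds the most still-unrepresented
            # target species, until every target species is covered.
            chosen, covered = [], set()
            remaining = dict(site_species)
            while not targets <= covered:
                best = max(remaining,
                           key=lambda s: len(remaining[s] & (targets - covered)))
                gain = remaining.pop(best) & (targets - covered)
                if not gain:
                    raise ValueError("some target species occur in no site")
                chosen.append(best)
                covered |= gain
            return chosen

        sites = {"A": {"sp1", "sp2"}, "B": {"sp2", "sp3"}, "C": {"sp4"}}
        print(greedy_sites(sites, {"sp1", "sp3", "sp4"}))  # ['A', 'B', 'C']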

  3. The internal consistency of the North Sea carbonate system

    Science.gov (United States)

    Salt, Lesley A.; Thomas, Helmuth; Bozec, Yann; Borges, Alberto V.; de Baar, Hein J. W.

    2016-05-01

    In 2002 (February) and 2005 (August), the full suite of carbonate system parameters (total alkalinity (AT), dissolved inorganic carbon (DIC), pH, and partial pressure of CO2 (pCO2)) was measured on two re-occupations of the entire North Sea basin, with three parameters (AT, DIC, pCO2) measured on four additional re-occupations, covering all four seasons, allowing an assessment of the internal consistency of the carbonate system. For most of the year, there is a similar level of internal consistency, with AT being calculated to within ±6 μmol kg⁻¹ using DIC and pH, DIC to ±6 μmol kg⁻¹ using AT and pH, pH to ±0.008 using AT and pCO2, and pCO2 to ±8 μatm using DIC and pH, with the dissociation constants of Millero et al. (2006). In spring, however, we observe a significant decline in the ability to accurately calculate the carbonate system. Lower consistency is observed with an increasing fraction of Baltic Sea water, caused by the high contribution of organic alkalinity in this water mass, which is not accounted for in the carbonate system calculations. Attempts to improve the internal consistency by accounting for the unconventional salinity-borate relationships in freshwater and the Baltic Sea, and through application of the new North Atlantic salinity-boron relationship (Lee et al., 2010), resulted in no significant difference in the internal consistency.

  4. Consistency of accuracy assessment indices for soft classification: Simulation analysis

    Science.gov (United States)

    Chen, Jin; Zhu, Xiaolin; Imura, Hidefumi; Chen, Xuehong

    Accuracy assessment plays a crucial role in the implementation of soft classification. Even though many indices of accuracy assessment for soft classification have been proposed, the consistencies among these indices are not clear, and the impact of sample size on these consistencies has not been investigated. This paper examines two kinds of indices: map-level indices, including root mean square error (rmse), kappa, and overall accuracy (oa) from the sub-pixel confusion matrix (SCM); and category-level indices, including crmse, user accuracy (ua) and producer accuracy (pa). A careful simulation was conducted to investigate the consistency of these indices and the effect of sample size. The major findings were as follows: (1) The map-level indices are highly consistent with each other, whereas the category-level indices are not. (2) The consistency among map-level and category-level indices becomes weaker when the sample size decreases. (3) The rmse is more affected by error distribution among classes than are kappa and oa. Based on these results, we recommend that rmse can be used for map-level accuracy due to its simplicity, although kappa and oa may be better alternatives when the sample size is limited because the two indices are affected less by the error distribution among classes. We also suggest that crmse should be provided when map users are not concerned about the error source, whereas ua and pa are more useful when the complete information about different errors is required. The results of this study will be of benefit to the development and application of soft classifiers.
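
    The map-level indices compared above follow standard definitions that can be stated compactly in code. This sketch assumes a square (sub-pixel) confusion matrix and per-class fraction maps as inputs:

        import numpy as np

        def oa_kappa(cm):
            # Overall accuracy and kappa from a (sub-pixel) confusion matrix.
            cm = np.asarray(cm, dtype=float)
            total = cm.sum()
            po = np.trace(cm) / total  # observed agreement (oa)
            pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2  # chance
            return po, (po - pe) / (1.0 - pe)

        def rmse(pred_fractions, true_fractions):
            # Root mean square error between predicted and reference fractions.
            diff = np.asarray(pred_fractions) - np.asarray(true_fractions)
            return np.sqrt(np.mean(diff ** 2))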

  5. Consistent SPH Simulations of Protostellar Collapse and Fragmentation

    Science.gov (United States)

    Gabbasov, Ruslan; Sigalotti, Leonardo Di G.; Cruz, Fidel; Klapp, Jaime; Ramírez-Velasquez, José M.

    2017-02-01

    We study the consistency and convergence of smoothed particle hydrodynamics (SPH) as a function of the interpolation parameters, namely the number of particles N, the number of neighbors n, and the smoothing length h, using simulations of the collapse and fragmentation of protostellar rotating cores. The calculations are made using a modified version of the GADGET-2 code that employs an improved scheme for the artificial viscosity and power-law dependences of n and h on N, as was recently proposed by Zhu et al., which comply with the combined limit N → ∞, h → 0, and n → ∞ with n/N → 0 for full SPH consistency as the domain resolution is increased. We apply this realization to the “standard isothermal test case” in the variant calculated by Burkert & Bodenheimer and the Gaussian cloud model of Boss to investigate the response of the method to adaptive smoothing lengths in the presence of large density and pressure gradients. The degree of consistency is measured by tracking how well the estimates of the consistency integral relations reproduce their continuous counterparts. In particular, C⁰ and C¹ particle consistency is demonstrated, meaning that the calculations are close to second-order accuracy. As long as n is increased with N, mass resolution also improves as the minimum resolvable mass M_min ∼ n⁻¹. This aspect allows proper calculation of small-scale structures in the flow associated with the formation and instability of protostellar disks around the growing fragments, which are seen to develop a spiral structure and fragment into close binary/multiple systems as supported by recent observations.

  6. Self-repairable polymeric networks: Synthesis and network design

    Science.gov (United States)

    Ghosh, Biswajit

    This dissertation describes the design, synthesis and development of a new class of polymeric networks that exhibit self-repairing properties under UV exposure. It consists of two parts: (a) modification and synthesis of oxetane (OXE) and oxolane (OXO) substituted chitosan (CHI) macromonomers, and (b) design and synthesis of self-repairing polyurethane (PUR) networks consisting of modified chitosan. Unmodified CHI consisting of acetamide (-NHCOCH3), primary hydroxyl (-OH), and amine (-NH2) functional groups was reacted with OXE or OXO compounds under basic conditions in order to substitute the 1° -OH groups and, at the same time, convert -NHCOCH3 functionalities into -NH2 groups, while maintaining their un-reacted form, to generate OXE/OXO-substituted CHI macromonomer. These substituted CHI macromonomers were incorporated within the PUR backbone by reacting with trifunctional isocyanate in the presence of polyethylene glycol (PEG) and dibutyl tin dilaurate (DBTDL) catalyst. Utilizing spectroscopic analysis combined with optical microscopy, these studies showed that the kinetics of self-repair depend on the stoichiometry of the individual entities, and that the time required for self-repairing to occur decreases with increasing OXE quantity within the network. Internal reflection infrared imaging (IRIRI) of OXE/OXO-CHI-PUR networks as well as Raman and Fourier transform IR (FT-IR) studies of OXE/OXO-CHI macromonomers revealed that cationic OXE/OXO ring opening, free-radical polyurea (PUA)-to-PUR conversion, along with chair-to-boat conformational changes of the CHI backbone, are responsible for repairing the damaged network. The network remodeling process, investigated by utilizing a micro-thermal analyzer (μTA), revealed that mechanical damage generates small fragments or oligomers within the scratch, therefore the glass transition temperature (Tg) decreases, and under UV exposure cross-linking reactions propagate from the bottom of the scratch to the top resulting in

  7. Visual Design Principles: An Empirical Study of Design Lore

    Science.gov (United States)

    Kimball, Miles A.

    2013-01-01

    Many books, designers, and design educators talk about visual design principles such as balance, contrast, and alignment, but with little consistency. This study uses empirical methods to explore the lore surrounding design principles. The study took the form of two stages: a quantitative literature review to determine what design principles are…

  9. Stable functional networks exhibit consistent timing in the human brain.

    Science.gov (United States)

    Chapeton, Julio I; Inati, Sara K; Zaghloul, Kareem A

    2017-03-01

    Despite many advances in the study of large-scale human functional networks, the question of timing, stability, and direction of communication between cortical regions has not been fully addressed. At the cellular level, neuronal communication occurs through axons and dendrites, and the time required for such communication is well defined and preserved. At larger spatial scales, however, the relationship between timing, direction, and communication between brain regions is less clear. Here, we use a measure of effective connectivity to identify connections between brain regions that exhibit communication with consistent timing. We hypothesized that if two brain regions are communicating, then knowledge of the activity in one region should allow an external observer to better predict activity in the other region, and that such communication involves a consistent time delay. We examine this question using intracranial electroencephalography captured from nine human participants with medically refractory epilepsy. We use a coupling measure based on time-lagged mutual information to identify effective connections between brain regions that exhibit a statistically significant increase in average mutual information at a consistent time delay. These identified connections result in sparse, directed functional networks that are stable over minutes, hours, and days. Notably, the time delays associated with these connections are also highly preserved over multiple time scales. We characterize the anatomic locations of these connections, and find that the propagation of activity exhibits a preferred posterior to anterior temporal lobe direction, consistent across participants. Moreover, networks constructed from connections that reliably exhibit consistent timing between anatomic regions demonstrate features of a small-world architecture, with many reliable connections between anatomically neighbouring regions and few long range connections. Together, our results demonstrate
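
    The coupling measure described here is straightforward to sketch. Below is a minimal, hedged illustration of time-lagged mutual information between two signals using a plain histogram estimator; it is not the authors' pipeline (which adds significance testing and multi-scale stability checks), and all names and parameter values are illustrative.

        import numpy as np

        def lagged_mutual_information(x, y, lag, bins=16):
            """Histogram estimate of I(x(t); y(t + lag)) in bits (illustrative only)."""
            if lag > 0:
                x, y = x[:-lag], y[lag:]
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of x
            py = pxy.sum(axis=0, keepdims=True)   # marginal of y
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        # A connection with "consistent timing" shows up as a stable peak of the
        # mutual information at one delay; here y lags x by 25 samples.
        rng = np.random.default_rng(0)
        x = rng.standard_normal(5000)
        y = np.roll(x, 25) + 0.5 * rng.standard_normal(5000)
        mi = [lagged_mutual_information(x, y, lag) for lag in range(60)]
        print("estimated delay:", int(np.argmax(mi)))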

  10. Clustering pattern consistency of corn cultivars (Consistência do padrão de agrupamento de cultivares de milho)

    Directory of Open Access Journals (Sweden)

    Alberto Cargnelutti Filho

    2011-09-01

    Full Text Available The objective of this work was to evaluate the consistency of the clustering pattern obtained from the combination of two dissimilarity measures and four clustering methods, in scenarios formed by combinations of number of cultivars and number of variables, with real data from corn cultivars (Zea mays L.) and with simulated data. Real data from five variables measured in 69 corn cultivar competition trials were used, with the number of cultivars evaluated ranging between 9 and 40. In order to investigate the results with larger numbers of cultivars and variables, 1,000 experiments were simulated under the standard normal distribution for each of the 54 scenarios formed by combining the number of cultivars (20, 30, 40, 50, 60, 70, 80, 90, and 100) and the number of variables (5, 6, 7, 8, 9, and 10). Correlation, multicollinearity diagnostic, and clustering analyses were performed. The consistency of the clustering pattern was evaluated by means of the cophenetic correlation coefficient. The consistency of the clustering pattern decreases as the number of cultivars and the number of variables increase. The Euclidean distance provides greater consistency in the clustering pattern than the Manhattan distance. The consistency of the clustering pattern among the methods increases in the following order: Ward, complete linkage, single linkage, and average linkage between groups.

  11. Integrated communications: From one look to normative consistency

    DEFF Research Database (Denmark)

    Torp, Simon

    2009-01-01

    Design/methodology/approach: The paper is based on a critical and thematic reading of the integrated marketing ... and conceptual development in relation to the range and scope of integrated communication. Findings: The ideal of integration in connection with marketing communication is not new. The analysis shows that the IMC field is marked by great diversity and disagreement. The ideal scope of integration has expanded ... In ambitious interpretations of the concept, the integration endeavour extends from the external integration of visual design to the internal integration of the organization's culture and "soul".

  12. Violation of consistency relations and the protoinflationary transition

    CERN Document Server

    Giovannini, Massimo

    2014-01-01

    If we posit the validity of the consistency relations, the tensor spectral index and the relative amplitude of the scalar and tensor power spectra are both fixed by a single slow-roll parameter. The physics of the protoinflationary transition can explicitly break the consistency relations, causing a reduction of the inflationary curvature scale in comparison with the conventional lore. After a critical scrutiny, we argue that the inflationary curvature scale, the total number of inflationary e-folds and, ultimately, the excursion of the inflaton across its Planckian boundary are all characterized by a computable theoretical error. While these considerations ease some of the tensions between the BICEP2 data and the other satellite observations, they also demand an improved understanding of the protoinflationary transition whose physical features may be assessed, in the future, through a complete analysis of the spectral properties of the B-mode autocorrelations.

  13. Non-trivial checks of novel consistency relations

    Energy Technology Data Exchange (ETDEWEB)

    Berezhiani, Lasha; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Wang, Junpu, E-mail: lashaber@gmail.com, E-mail: jkhoury@sas.upenn.edu, E-mail: jwang217@jhu.edu [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States)

    2014-06-01

    Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.

  14. Non-Trivial Checks of Novel Consistency Relations

    CERN Document Server

    Berezhiani, Lasha; Wang, Junpu

    2014-01-01

    Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of slow-roll single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.

  15. Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation

    DEFF Research Database (Denmark)

    Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans

    1995-01-01

    We present multiconfigurational self-consistent reaction field theory and an implementation for solvent effects on a solute molecular system that is not in equilibrium with the outer solvent. The approach incorporates two different polarization vectors for studying the influence of the solvent ... states influenced by the two types of polarization vectors. The general treatment of the correlation problem through the use of complete and restricted active space methodologies makes the present multiconfigurational self-consistent reaction field approach general in that it can handle any type of state ..., open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromatic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity

  16. Branch dependence in the "consistent histories" approach to quantum mechanics

    CERN Document Server

    Müller, T

    2005-01-01

    In the consistent histories formalism one specifies a family of histories as an exhaustive set of pairwise exclusive descriptions of the dynamics of a quantum system. We define branching families of histories, which strike a middle ground between the two available mathematically precise definitions of families of histories, viz., product families and Isham's history projector operator formalism. The former are too narrow for applications, and the latter's generality comes at a certain cost, barring an intuitive reading of the ``histories''. Branching families retain the intuitiveness of product families, they allow for the interpretation of a history's weight as a probability, and they allow one to distinguish two kinds of coarse-graining. It is shown that for branching families, the ``consistency condition'' is not a precondition for assigning probabilities, but for a specific kind of coarse-graining.

  17. Structures, profile consistency, and transport scaling in electrostatic convection

    DEFF Research Database (Denmark)

    Bian, N.H.; Garcia, O.E.

    2005-01-01

    It is shown that for interchange modes, profile consistency is in fact due to mixing by persistent large-scale convective cells. This mechanism is not a turbulent diffusion, cannot occur in collisionless systems, and is the analog of the well-known laminar "magnetic flux expulsion" in magnetohydrodynamics. This expulsion process ... involves a "pinch" across closed streamlines and further results in the formation of pressure fingers along the separatrix of the convective cells. By nature, these coherent structures are dissipative because the mixing process that leads to their formation relies on a finite amount of collisional ... diffusion. Numerical simulations of two-dimensional interchange modes confirm the role of laminar expulsion by convective cells, for profile consistency and structure formation. They also show that the fingerlike pressure structures ultimately control the rate of heat transport across the plasma layer...

  18. Turbulent MHD transport coefficients - An attempt at self-consistency

    Science.gov (United States)

    Chen, H.; Montgomery, D.

    1987-01-01

    In this paper, some multiple scale perturbation calculations of turbulent MHD transport coefficients begun in earlier papers are first completed. These generalize 'alpha effect' calculations by treating the velocity field and magnetic field on the same footing. Then the problem of rendering such calculations self-consistent is addressed, generalizing an eddy-viscosity hypothesis similar to that of Heisenberg for the Navier-Stokes case. The method also borrows from Kraichnan's direct interaction approximation. The output is a set of integral equations relating the spectra and the turbulent transport coefficients. Previous 'alpha effect' and 'beta effect' coefficients emerge as limiting cases. A treatment of the inertial range can also be given, consistent with a -5/3 energy spectrum power law. In the Navier-Stokes limit, a value of 1.72 is extracted for the Kolmogorov constant. Further applications to MHD are possible.

  19. Consistent return mapping algorithm for plane stress elastoplasticity

    Energy Technology Data Exchange (ETDEWEB)

    Simo, J.C.; Taylor, R.L.

    1985-05-01

    An unconditionally stable algorithm for plane stress elastoplasticity is developed, based upon the notion of an elastic predictor and a return mapping (plastic corrector). Enforcement of the consistency condition is shown to reduce to the solution of a simple nonlinear equation. Consistent elastoplastic tangent moduli are obtained by exact linearization of the algorithm. Use of these moduli is essential in order to preserve the asymptotic rate of quadratic convergence of Newton methods. An exact solution for constant strain rate over the typical time step is derived. On the basis of this solution the accuracy of the algorithm is assessed by means of iso-error maps. The excellent performance of the algorithm for large time steps is illustrated in numerical experiments.
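
    As an illustration of the elastic predictor/plastic corrector structure (not the plane stress algorithm itself, whose consistency condition is a nonlinear scalar equation), here is a minimal one-dimensional return mapping with linear isotropic hardening; the material constants are assumed values.

        def return_map_1d(eps, eps_p, alpha, E=200.0e3, H=10.0e3, sigma_y=250.0):
            """One strain-driven step of 1D return mapping with linear isotropic
            hardening. Returns (stress, plastic strain, hardening variable, tangent)."""
            sigma_trial = E * (eps - eps_p)                     # elastic predictor
            f_trial = abs(sigma_trial) - (sigma_y + H * alpha)  # trial yield function
            if f_trial <= 0.0:
                return sigma_trial, eps_p, alpha, E             # elastic step
            sign = 1.0 if sigma_trial >= 0.0 else -1.0
            dgamma = f_trial / (E + H)                          # enforce consistency f = 0
            sigma = sigma_trial - E * dgamma * sign             # plastic corrector
            return (sigma, eps_p + dgamma * sign, alpha + dgamma,
                    E * H / (E + H))                            # consistent tangent

    The last return value is the consistent (algorithmic) tangent obtained by linearizing the discrete update, which in this 1D setting reduces to EH/(E+H); using it rather than the continuum modulus is what preserves the quadratic convergence of Newton iterations mentioned in the abstract.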

  20. Strong Consistency of the Empirical Martingale Simulation Option Price Estimator

    Institute of Scientific and Technical Information of China (English)

    Zhu-shun Yuan; Ge-mai Chen

    2009-01-01

    A simulation technique known as empirical martingale simulation (EMS) was proposed to improve simulation accuracy. By an adjustment to the standard Monte Carlo simulation, EMS ensures that the simulated price satisfies the rational option pricing bounds and that the estimated derivative contract price is strongly consistent for payoffs that satisfy a Lipschitz condition. However, for some currently used contracts such as self-quanto options and asymmetric or symmetric power options, it remains open whether the above asymptotic result holds. In this paper, we prove that the strong consistency of the EMS option price estimator holds for a wider class of univariate payoffs than those restricted by the Lipschitz condition. Numerical experiments demonstrate that EMS can also substantially increase simulation accuracy in the extended setting.
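
    A sketch of the EMS idea for a single-step European payoff under geometric Brownian motion: rescale the simulated terminal prices so that their discounted sample mean equals the initial price exactly, then price off the adjusted sample. Parameter values are illustrative.

        import numpy as np

        def ems_call_price(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                           n=100_000, seed=1):
            rng = np.random.default_rng(seed)
            Z = rng.standard_normal(n)
            ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
            # EMS adjustment: force the discounted sample mean back to S0, so the
            # empirical measure satisfies the martingale (rational pricing) bounds.
            ST_ems = ST * S0 / (np.exp(-r * T) * ST.mean())
            price = lambda S: np.exp(-r * T) * np.maximum(S - K, 0.0).mean()
            return price(ST), price(ST_ems)

        print(ems_call_price())  # (crude Monte Carlo estimate, EMS-adjusted estimate)

    The call payoff max(S - K, 0) is Lipschitz, so it lies inside the originally proven class; the paper's point is that payoffs such as powers of S, which are not Lipschitz, still yield strong consistency.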

  1. A correlation consistency based multivariate alarm thresholds optimization approach.

    Science.gov (United States)

    Gao, Huihui; Liu, Feifei; Zhu, Qunxiong

    2016-11-01

    Different alarm thresholds can generate different alarm data, resulting in different correlations. A new multivariate alarm thresholds optimization methodology based on the correlation consistency between process data and alarm data is proposed in this paper. Interpretative structural modeling is adopted to select the key variables. For the key variables, the correlation coefficients of the process data are calculated by Pearson correlation analysis, while the correlation coefficients of the alarm data are calculated by kernel density estimation. To ensure correlation consistency, the objective function is established as the sum of the absolute differences between these two types of correlations. The optimal thresholds are obtained using the particle swarm optimization algorithm. A case study of the Tennessee Eastman process is given to demonstrate the effectiveness of the proposed method.
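
    A hedged sketch of the central objective only: the mismatch between process-data correlations and alarm-data correlations as a function of the thresholds. The published method computes the alarm-side correlations by kernel density estimation and optimizes with particle swarm; here plain Pearson correlations on the binary alarm series and a random search stand in for both, and all data and names are illustrative.

        import numpy as np

        def correlation_consistency_loss(thresholds, X):
            """Sum over variable pairs of |corr(process data) - corr(alarm data)|.
            X: (samples, variables); thresholds: one high-alarm limit per variable,
            assumed to lie inside the data range so both alarm states occur."""
            alarms = (X > thresholds).astype(float)
            C_proc = np.corrcoef(X, rowvar=False)
            C_alarm = np.corrcoef(alarms, rowvar=False)
            iu = np.triu_indices(X.shape[1], k=1)
            return float(np.abs(C_proc[iu] - C_alarm[iu]).sum())

        rng = np.random.default_rng(0)
        X = rng.standard_normal((2000, 3)) @ np.array([[1.0, 0.8, 0.2],
                                                       [0.0, 0.6, 0.4],
                                                       [0.0, 0.0, 0.7]])
        # Crude stand-in for the particle swarm step: random search between the
        # 20th and 80th percentiles of each variable.
        cand = rng.uniform(np.percentile(X, 20, axis=0),
                           np.percentile(X, 80, axis=0), size=(500, 3))
        best = min(cand, key=lambda t: correlation_consistency_loss(t, X))
        print("selected thresholds:", np.round(best, 3))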

  2. Current Status Of Velocity Field Surveys: A Consistency Check

    CERN Document Server

    Sarkar, D; Watkins, R; Sarkar, Devdeep; Feldman, Hume A.

    2006-01-01

    We present a statistical analysis comparing the bulk-flow measurements for six recent peculiar velocity surveys, namely ENEAR, SFI, RFGC, SBF, and the Mark III singles and group catalogs. We study whether the bulk-flow estimates are consistent with each other and construct the full three-dimensional bulk-flow vectors. The method we discuss could be used to test the consistency of all velocity field surveys. We show that although these surveys differ in their geometry and measurement errors, their bulk-flow vectors are expected to be highly correlated and in fact show impressive agreement in all cases. Our results suggest that even though the surveys we study target galaxies of different morphology and use different distance measures, they all reliably reflect the same underlying large-scale flow.

  3. Stochastic multi-configurational self-consistent field theory

    CERN Document Server

    Thomas, Robert E; Alavi, Ali; Booth, George H

    2015-01-01

    The multi-configurational self-consistent field theory is considered the standard starting point for almost all multireference approaches required for strongly-correlated molecular problems. The limitation of the approach is generally given by the number of strongly-correlated orbitals in the molecule, as its cost will grow exponentially with this number. We present a new multi-configurational self-consistent field approach, wherein linear determinant coefficients of a multi-configurational wavefunction are optimized via the stochastic full configuration interaction quantum Monte Carlo technique at greatly reduced computational cost, with non-linear orbital rotation parameters updated variationally based on this sampled wavefunction. This extends this approach to strongly-correlated systems with far larger active spaces than it is possible to treat by conventional means. By comparison with this traditional approach, we demonstrate that the introduction of stochastic noise in both the determinant amplitudes an...

  4. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include: NASA Procedural Requirements 8705.2B, which identifies human rating standards and requirements; draft health and medical standards for human rating; what's been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of government and non-governmental human factors standards.

  5. Exceptional generalised geometry for massive IIA and consistent reductions

    CERN Document Server

    Cassani, Davide; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel

    2016-01-01

    We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p,7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d=4,3,2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.

  6. Exceptional generalised geometry for massive IIA and consistent reductions

    Science.gov (United States)

    Cassani, Davide; de Felice, Oscar; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel

    2016-08-01

    We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p, 7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d = 4, 3, 2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.

  7. Bolasso: model consistent Lasso estimation through the bootstrap

    CERN Document Server

    Bach, Francis

    2008-01-01

    We consider the least-square linear regression problem with regularization by the l1-norm, a problem usually referred to as the Lasso. In this paper, we present a detailed asymptotic analysis of model consistency of the Lasso. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection). For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection algorithm, referred to as the Bolasso, is compared favorably to other linear regression methods on synthetic data and datasets from the UCI machine learning rep...
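
    The Bolasso procedure itself is compact. A hedged scikit-learn sketch follows; the regularization level, the number of replications, and the final refit are illustrative choices, not prescriptions from the paper.

        import numpy as np
        from sklearn.linear_model import Lasso, LinearRegression

        def bolasso_support(X, y, n_boot=128, alpha=0.05, seed=0):
            """Intersect Lasso supports across bootstrap replications of (X, y)."""
            rng = np.random.default_rng(seed)
            keep = np.ones(X.shape[1], dtype=bool)
            for _ in range(n_boot):
                idx = rng.integers(0, X.shape[0], size=X.shape[0])  # bootstrap sample
                coef = Lasso(alpha=alpha).fit(X[idx], y[idx]).coef_
                keep &= np.abs(coef) > 1e-10     # variable must survive every run
            return np.flatnonzero(keep)

        # Typical use: select the support, then fit unregularized least squares on it.
        # support = bolasso_support(X, y)
        # model = LinearRegression().fit(X[:, support], y)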

  8. Consistency Relations for an Implicit n-dimensional Regularization Scheme

    CERN Document Server

    Scarpelli, A P B; Nemes, M C

    2001-01-01

    We extend an implicit regularization scheme to be applicable in the $n$-dimensional space-time. Within this scheme divergences involving parity violating objects can be consistently treated without recourse to dimensional continuation. Special attention is paid to differences between integrals of the same degree of divergence, typical of one-loop calculations, which are in principle undetermined. We show how to use symmetries in order to fix these quantities consistently. We illustrate with examples in which regularization plays a delicate role in order to both corroborate and elucidate the results in the literature for the case of CPT violation in extended $QED_4$, topological mass generation in 3-dimensional gauge theories and the Schwinger Model and its chiral version.

  9. Collisional decoherence of tunneling molecules: a consistent histories treatment

    CERN Document Server

    Coles, Patrick J; Griffiths, Robert B

    2012-01-01

    The decoherence of a two-state tunneling molecule, such as a chiral molecule or ammonia, due to collisions with a buffer gas is analyzed in terms of a succession of quantum states of the molecule satisfying the conditions for a consistent family of histories. With $\hbar\omega$ the separation in energy of the levels in the isolated molecule and $\gamma$ a decoherence rate proportional to the rate of collisions, we find for $\gamma \gg \omega$ (strong decoherence) a consistent family in which the molecule flips randomly back and forth between the left- and right-handed chiral states in a stationary Markov process. For $\gamma \sim \omega$ and for $\gamma < \omega$ ... In addition we relate the speed with which chiral information is transferred to the environment to the rate of decrease of complementary types of information (e.g., parity information) remaining in the molecule itself.

  10. A New Heuristic for Feature Selection by Consistent Biclustering

    CERN Document Server

    Mucherino, Antonio

    2010-01-01

    Given a set of data, biclustering aims at finding simultaneous partitions in biclusters of its samples and of the features which are used for representing the samples. Consistent biclusterings allow one to obtain correct classifications of the samples from the known classification of the features, and vice versa, and they are very useful for performing supervised classifications. The problem of finding consistent biclusterings can be seen as a feature selection problem, where the features that are not relevant for classification purposes are removed from the set of data, while the total number of features is maximized in order to preserve information. This feature selection problem can be formulated as a linear fractional 0-1 optimization problem. We propose a reformulation of this problem as a bilevel optimization problem, and we present a heuristic algorithm for an efficient solution of the reformulated problem. Computational experiments show that the presented algorithm is able to find better solutions with re...

  11. Viscoelastic models with consistent hypoelasticity for fluids undergoing finite deformations

    Science.gov (United States)

    Altmeyer, Guillaume; Rouhaud, Emmanuelle; Panicaud, Benoit; Roos, Arjen; Kerner, Richard; Wang, Mingchuan

    2015-08-01

    Constitutive models of viscoelastic fluids are written with rate-form equations when considering finite deformations. Trying to extend the approach used to model these effects from an infinitesimal deformation to a finite transformation framework, one has to ensure that the tensors and their rates are indifferent with respect to the change of observer and to the superposition with rigid body motions. Frame-indifference problems can be solved with the use of an objective stress transport, but the choice of such an operator is not obvious and the use of certain transports usually leads to physically inconsistent formulation of hypoelasticity. The aim of this paper is to present a consistent formulation of hypoelasticity and to combine it with a viscosity model to construct a consistent viscoelastic model. In particular, the hypoelastic model is reversible.

  12. Sparse motion segmentation using multiple six-point consistencies

    CERN Document Server

    Zografos, Vasileios; Ellis, Liam

    2010-01-01

    We present a method for segmenting an arbitrary number of moving objects in image sequences, using the geometry of 6 points in 2D to infer motion consistency. The method has been evaluated on the Hopkins 155 database and surpasses current state-of-the-art methods such as SSC, both in terms of overall performance on two and three motions and in terms of maximum errors. The method works by finding initial clusters in the spatial domain, and then classifying each remaining point as belonging to the cluster that minimizes a motion consistency score. In contrast to most other motion segmentation methods that are based on an affine camera model, the proposed method is fully projective.

  13. Consistency and axiomatization of a natural extensional combinatory logic

    Institute of Scientific and Technical Information of China (English)

    蒋颖

    1996-01-01

    In the light of a question of J. L. Krivine about the consistency of an extensional λ-theory, an extensional combinatory logic ECL+U(G)+RU_∞+ is established, with its consistency model proved theoretically, and it is shown that it is not equivalent to any system of universal axioms. It is expressed by the theory in first order logic that, for every given group G of order n, there simultaneously exist infinitely many universal retractions and a surjective n-tuple notion, such that each element of G acts as a permutation of the components of the n-tuple, and as an Ap-automorphism of the model; further, each of the universal retractions is invariant under the action of the Ap-automorphisms induced by G. The difference between this theory and that of Krivine is that G need not be a symmetric group.

  14. A minimal model of self-consistent partial synchrony

    Science.gov (United States)

    Clusella, Pau; Politi, Antonio; Rosenblum, Michael

    2016-09-01

    We show that self-consistent partial synchrony in globally coupled oscillatory ensembles is a general phenomenon. We analyze in detail appearance and stability properties of this state in possibly the simplest setup of a biharmonic Kuramoto-Daido phase model as well as demonstrate the effect in limit-cycle relaxational Rayleigh oscillators. Such a regime extends the notion of splay state from a uniform distribution of phases to an oscillating one. Suitable collective observables such as the Kuramoto order parameter allow detecting the presence of an inhomogeneous distribution. The characteristic and most peculiar property of self-consistent partial synchrony is the difference between the frequency of single units and that of the macroscopic field.
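
    The collective observable named here is one line to compute; a minimal sketch, with theta an array of oscillator phases in radians:

        import numpy as np

        def kuramoto_order_parameter(theta):
            """Return (R, psi): modulus and argument of the mean phase vector."""
            z = np.exp(1j * np.asarray(theta)).mean()
            return np.abs(z), np.angle(z)

    In a splay state R stays near zero and in full synchrony near one; the signature of self-consistent partial synchrony described above is an R(t) that oscillates, reflecting an inhomogeneous, time-dependent phase distribution whose macroscopic frequency differs from that of the individual units.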

  15. Planck 2013 results. XXXI. Consistency of the Planck data

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Arnaud, M.; Ashdown, M.

    2014-01-01

    In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse foreground emission. We find agreement (measured by deviation of the ratio from unity) between 70 and 100 GHz power spectra averaged over 70 ≤ ℓ ≤ 390 at the 0.8% level, and agreement between 143 and 100 GHz power spectra of 0.4% over the same ℓ range. These values are within and consistent with the overall uncertainties in calibration given in the Planck 2013 ... 143/100 ratio. Correcting for this, the 70, 100, and 143 GHz power spectra agree to 0.4% over the first two acoustic peaks. The likelihood analysis that produced the 2013 cosmological parameters incorporated uncertainties larger than this. We show explicitly that correction of the missing near sidelobe power

  16. Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2012-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  17. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati

  18. Non-autonomous discrete Boussinesq equation: Solutions and consistency

    Science.gov (United States)

    Nong, Li-Juan; Zhang, Da-Juan

    2014-07-01

    A non-autonomous 3-component discrete Boussinesq equation is discussed. Its spacing parameters pn and qm are related to the independent variables n and m, respectively. We derive its bilinear form and solutions in Casoratian form. The plane wave factor is defined through the cubic roots of unity. The plane wave factor also leads to an extended non-autonomous discrete Boussinesq equation which contains a parameter δ. Three-dimensional consistency and the Lax pair of the obtained equation are discussed.

  19. An Algebraic Characterization of Inductive Soundness in Proof by Consistency

    Institute of Scientific and Technical Information of China (English)

    邵志清; 宋国新

    1995-01-01

    Kapur and Musser studied the theoretical basis for proof by consistency and obtained an inductive completeness result: p = q is provable if and only if p = q is true in every inductive model. However, there is a loophole in their proof of the soundness part: that p = q provable implies p = q is true in every inductive model. The aim of this paper is to give a correct characterization of inductive soundness from an algebraic view by introducing strong inductive models.

  20. Consistency analysis of a nonbirefringent Lorentz-violating planar model

    CERN Document Server

    Casana, Rodolfo; Moreira, Roemir P M

    2011-01-01

    In this work we analyze the physical consistency of a nonbirefringent Lorentz-violating planar model via the analysis of the pole structure of its Feynman propagators. The nonbirefringent planar model, obtained from the dimensional reduction of the CPT-even gauge sector of the standard model extension, is composed of a gauge field and a scalar field, being affected by Lorentz-violating (LIV) coefficients encoded in the symmetric tensor $\kappa_{\mu\

  1. Incomplete Lineage Sorting: Consistent Phylogeny Estimation From Multiple Loci

    CERN Document Server

    Mossel, Elchanan

    2008-01-01

    We introduce a simple algorithm for reconstructing phylogenies from multiple gene trees in the presence of incomplete lineage sorting, that is, when the topology of the gene trees may differ from that of the species tree. We show that our technique is statistically consistent under standard stochastic assumptions, that is, it returns the correct tree given sufficiently many unlinked loci. We also show that it can tolerate moderate estimation errors.

  2. Consistent 4D Brain Extraction of Serial Brain MR Images

    OpenAIRE

    Wang, Yaping; Li, Gang; Nie, Jingxin; Yap, Pew-Thian; Guo, Lei; Shen, Dinggang

    2013-01-01

    Accurate and consistent skull stripping of serial brain MR images is of great importance in longitudinal studies that aim to detect subtle brain morphological changes. To avoid inconsistency and the potential bias introduced by independently performing skull-stripping for each time-point image, we propose an effective method that is capable of skull-stripping serial brain MR images simultaneously. Specifically, all serial images of the same subject are first affine aligned in a groupwise mann...

  3. Consistency argument and classification problem in λ-calculus

    Institute of Scientific and Technical Information of China (English)

    王驹; 赵希顺; 黄且圆; 蒋颖

    1999-01-01

    Inspired by Mal'cev's theorem in universal algebra, a new criterion for consistency arguments in λ-calculus is introduced. It is equivalent to those of Jacopini and Baeten-Boerboom, but more convenient to use. Based on the new criterion, an enhanced technique is used to show a few results which provide deeper insight into the classification problem of λ-terms with no normal forms.

  4. Security Policy:Consistency,Adjustments and Restraining Factors

    Institute of Scientific and Technical Information of China (English)

    Yang Jiemian

    2004-01-01

    In the 2004 U.S. presidential election, despite well-divided domestic opinions and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. It is obvious that, based on the author's analysis, the security agenda, such as counter-terrorism and the Iraq issue, contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will basically be consistent.

  5. Measuring Consistent Poverty in Ireland with EU SILC Data

    OpenAIRE

    Whelan, Christopher T.; Nolan, Brian; Maitre, Bertrand

    2006-01-01

    In this paper we seek to make use of the newly available Irish component of the European Union Statistics on Income and Living Conditions (EU-SILC) in order to develop a measure of consistent poverty that overcomes some of the difficulties associated with the original indicators employed as targets in the Irish National Anti-Poverty Strategy. Our analysis leads us to propose a set of economic strain indicators that cover a broader range than the original basic deprivation set. The accumulated...

  6. Personalities in great tits, Parus major: stability and consistency

    OpenAIRE

    Carere, C; Drent, PJ; Privitera, L; Koolhaas, JM; Groothuis, TGG; Drent, Piet J; Koolhaas, Jaap M.; Groothuis, Ton G.G.

    2005-01-01

    We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations involving exploratory and sociosexual behaviour. Exploratory behaviour was assessed both in the juvenile phase and in adulthood (2-3-year interval) by means of a novel object test and an open field...

  7. Noncommuting electric fields and algebraic consistency in noncommutative gauge theories

    Science.gov (United States)

    Banerjee, Rabin

    2003-05-01

    We show that noncommuting electric fields occur naturally in θ-expanded noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a Hamiltonian generalization of the Seiberg-Witten map, the algebraic consistency in the Lagrangian and Hamiltonian formulations of these theories is established. A comparison of results in different descriptions shows that this generalized map acts as a canonical transformation in the physical subspace only. Finally, we apply the Hamiltonian formulation to derive the gauge symmetries of the action.

  8. Identification of consistency in rating curve data: Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.

    2016-04-01

    Before calculating rating curve discharges, it is crucial to identify possible interruptions in data consistency. In this research, a methodology to perform this preliminary analysis is developed and validated. This methodology, called Bidirectional Reach (BReach), evaluates at each data point the results of a rating curve model with randomly sampled parameter sets. The combination of a parameter set and a data point is classified as non-acceptable if the deviation between the accompanying model result and the measurement exceeds the observational uncertainty. Moreover, a tolerance degree that defines satisfactory behavior of a sequence of model results is chosen; it equals the percentage of observations that are allowed to have non-acceptable model results. Subsequently, the results of this classification are used to assess the maximum left and right reach for each data point of a chronologically sorted time series. The maximum left and right reach of a gauging point represent the data points, in the direction of the previous and the following observations respectively, beyond which no sampled parameter set is both satisfactory and results in an acceptable deviation. This analysis is repeated for a variety of tolerance degrees. Plotting the results for all data points and all tolerance degrees in a combined BReach plot enables the detection of changes in data consistency; if consistent periods are detected, the limits of these periods can be derived. The methodology is validated with various synthetic stage-discharge data sets and proves to be a robust technique to investigate the temporal consistency of rating curve data. It provides satisfying results despite low data availability, large errors in the estimated observational uncertainty, and a rating curve model that is known to cover only a limited part of the observations.
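
    A schematic rendering of the reach computation, under simplifying assumptions: ok[i, j] marks whether sampled parameter set i gives an acceptable deviation at (chronologically sorted) data point j, and tol is the tolerance degree. The acceptance bookkeeping of the published method is richer; this only shows the scanning idea.

        import numpy as np

        def right_reach(ok, start, tol):
            """Furthest index >= start that some parameter set can cover while at
            most a fraction `tol` of the points in the window are non-acceptable."""
            n_sets, n_pts = ok.shape
            best = start
            for i in range(n_sets):
                fails = 0
                for j in range(start, n_pts):
                    fails += 0 if ok[i, j] else 1
                    if fails > tol * (j - start + 1):
                        break
                    best = max(best, j)
            return best

        # The left reach is the mirror image (scan toward earlier observations);
        # plotting both for every data point over a range of tolerance degrees
        # gives the combined BReach plot used to spot consistency breaks.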

  9. Consistent Algorithm for Multi-value Constraint with Continuous Variables

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Mature algorithms already exist for the Constraint Satisfaction Problem (CSP) with binary constraints over discrete variables. For multi-value constraints with continuous variables, the approach is quite different and the difficulty increases considerably. This paper presents an algorithm for realizing global consistency of continuous variables, which can be applied to multi-value constraints.

  10. Influence of Sensor Ingestion Timing on Consistency of Temperature Measures

    Science.gov (United States)

    2009-01-01

    Influence of Sensor Ingestion Timing on Consistency of Temperature Measures. Med. Sci. Sports Exerc., Vol. 41, No. 3, pp. 597-602, 2009. Purpose: The validity and the reliability of using intestinal temperature (Tint) via ingestible temperature sensors (ITS) to measure core body temperature have been demonstrated. However

  11. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.;

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions about the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.

  12. GW method with the self-consistent Sternheimer equation

    OpenAIRE

    2010-01-01

    We propose a novel approach to quasiparticle GW calculations which does not require the computation of unoccupied electronic states. In our approach the screened Coulomb interaction is evaluated by solving self-consistent linear-response Sternheimer equations, and the noninteracting Green's function is evaluated by solving inhomogeneous linear systems. The frequency-dependence of the screened Coulomb interaction is explicitly taken into account. In order to avoid the singularities of the scre...

  13. Internal consistency reliability is a poor predictor of responsiveness

    Directory of Open Access Journals (Sweden)

    Heels-Ansdell Diane

    2005-05-01

    Full Text Available Abstract. Background: Whether responsiveness represents a measurement property of health-related quality of life (HRQL) instruments that is distinct from reliability and validity is an issue of debate. We addressed the claims of a recent study, which suggested that investigators could rely on internal consistency to reflect instrument responsiveness. Methods: 516 patients with chronic obstructive pulmonary disease or knee injury participating in four longitudinal studies completed generic and disease-specific HRQL questionnaires before and after an intervention that impacted on HRQL. We used Pearson correlation coefficients and linear regression to assess the relationship between internal consistency reliability (expressed as Cronbach's alpha), instrument type (generic and disease-specific) and responsiveness (expressed as the standardised response mean, SRM). Results: Mean Cronbach's alpha was 0.83 (SD 0.08) and mean SRM was 0.59 (SD 0.33). The correlation between Cronbach's alpha and SRMs was 0.10 (95% CI -0.12 to 0.32) across all studies. Cronbach's alpha alone did not explain variability in SRMs (p = 0.59, r2 = 0.01) whereas the type of instrument was a strong predictor of the SRM (p = 0.012, r2 = 0.37). In multivariable models applied to individual studies Cronbach's alpha consistently failed to predict SRMs (regression coefficients between -0.45 and 1.58, p-values between 0.15 and 0.98) whereas the type of instrument did predict SRMs (regression coefficients between -0.25 and -0.59, p-values between ... Conclusion: Investigators must look to data other than internal consistency reliability to select a responsive instrument for use as an outcome in clinical trials.
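
    Both statistics in this record are short to compute; a minimal sketch, with items a (subjects x items) score matrix and pre/post the instrument totals before and after the intervention:

        import numpy as np

        def cronbach_alpha(items):
            """Internal consistency: alpha = k/(k-1) * (1 - sum(item var)/total var)."""
            k = items.shape[1]
            return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                                  / items.sum(axis=1).var(ddof=1))

        def standardised_response_mean(pre, post):
            """Responsiveness: mean change divided by the SD of the change."""
            change = np.asarray(post) - np.asarray(pre)
            return change.mean() / change.std(ddof=1)

    The study's point is precisely that the first quantity carries almost no information about the second.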

  14. Consistent deniable lying : privacy in mobile social networks

    OpenAIRE

    Belle, Sebastian Kay; Waldvogel, Marcel

    2008-01-01

    Social networking is moving to mobile phones. This not only means continuous access, but also allows to link virtual and physical neighbourhood in novel ways. To make such systems useful, personal data such as lists of friends and interests need to be shared with more and frequently unknown people, posing a risk to your privacy. In this paper, we present our approach to social networking, Consistent Deniable Lying (CDL). Using easy-to-understand mechanisms and tuned to this environment, i...

  15. Design of Dependable Control System Using a Component Based Approach

    DEFF Research Database (Denmark)

    Blanke, M.

    1995-01-01

    Design of fault handling in control systems is discussed and a consistent method for design is presented.

  16. On the scalar consistency relation away from slow roll

    CERN Document Server

    Sreenath, V; Sriramkumar, L

    2014-01-01

    As is well known, the non-Gaussianity parameter $f_{_{\\rm NL}}$, which is often used to characterize the amplitude of the scalar bi-spectrum, can be expressed completely in terms of the scalar spectral index $n_{\\rm s}$ in the squeezed limit, a relation that is referred to as the consistency condition. This relation, while it is largely discussed in the context of slow roll inflation, is actually expected to hold in any single field model of inflation, irrespective of the dynamics of the underlying model. In this work, we explicitly examine the validity of the consistency relation, analytically as well as numerically, away from slow roll. Analytically, we first arrive at the relation in the simple case of power law inflation. We also consider the non-trivial example of the Starobinsky model involving a linear potential with a sudden change in its slope (which leads to a brief period of fast roll), and establish the condition completely analytically. We then numerically examine the validity of the consistency ...

  17. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches in computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching, based on a novel scheme called double topological relationship consistency (DCTR). The combination of double topological configurations includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, overcoming many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras are located in very different orientations. Also, the epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. By this method, we can obtain correspondences with high precision on wide-baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on the image pairs.

  18. Consistency Property of Finite FC-Normal Logic Programs

    Institute of Scientific and Technical Information of China (English)

    Yi-Song Wang; Ming-Yi Zhang; Yu-Ping Shen

    2007-01-01

    Marek's forward-chaining construction is one of the important techniques for investigating non-monotonic reasoning. By introducing a consistency property over a logic program, Marek et al. proposed a class of logic programs, FC-normal programs, each of which has at least one stable model. However, it is not clear how to choose one appropriate consistency property for deciding whether or not a logic program is FC-normal. In this paper, we first discover that, for any finite logic program ∏, there exists the least consistency property LCon(∏) over ∏, which depends just on ∏ itself, such that ∏ is FC-normal if and only if ∏ is FC-normal with respect to (w.r.t.) LCon(∏). Actually, in order to determine the FC-normality of a logic program, it is sufficient to check the monotonic closed sets in LCon(∏) for all non-monotonic rules, that is, LFC(∏). Second, we present an algorithm for computing LFC(∏). Finally, we reveal that the brave reasoning task and the cautious reasoning task for FC-normal logic programs are of the same difficulty as those for normal logic programs.

  19. On the kernel and particle consistency in smoothed particle hydrodynamics

    CERN Document Server

    Sigalotti, Leonardo Di G; Rendón, Otto; Vargas, Carlos A; Peña-Polo, Franklin

    2016-01-01

    The problem of consistency of smoothed particle hydrodynamics (SPH) has demanded considerable attention in the past few years due to the ever increasing number of applications of the method in many areas of science and engineering. A loss of consistency leads to an inevitable loss of approximation accuracy. In this paper, we revisit the issue of SPH kernel and particle consistency and demonstrate that SPH has a limiting second-order convergence rate. Numerical experiments with suitably chosen test functions validate this conclusion. In particular, we find that when using the root mean square error as a model evaluation statistic, well-known corrective SPH schemes, which were thought to converge to second, or even higher, order are actually first-order accurate, or at best close to second order. We also find that observing the joint limit when $N\to\infty$, $h\to 0$, and $n\to\infty$, as was recently proposed by Zhu et al., where $N$ is the total number of particles, $h$ is the smoothing length, and $n$ is th...

  20. Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control

    Directory of Open Access Journals (Sweden)

    Y.A. Ahmed

    2015-09-01

    Full Text Available In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. Such consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution output. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After getting acceptable percentages of success, the trained networks are implemented in the free-running experiment system to judge the network’s real-time response for the Esso Osaka 3-m model ship. The network’s behaviour during such experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance is also discussed for the final alignment of the ship with the actual pier.

  1. A dynamical mechanism for large volumes with consistent couplings

    Science.gov (United States)

    Abel, Steven

    2016-11-01

    A mechanism for addressing the "decompactification problem" is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft-terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.

  2. The consistent histories approach to loop quantum cosmology

    Science.gov (United States)

    Craig, David A.

    2016-06-01

    We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on “measurements” to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories — as determined by the system’s decoherence functional — that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with scalar matter. We show how the theory may be used to make definite physical predictions in the absence of “observers”. As an application, we demonstrate how the theory predicts that loop quantum models “bounce” from large volume to large volume, while conventional “Wheeler-DeWitt”-quantized universes are invariably singular. We also briefly indicate the relation to other work.

  3. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [ORNL; Mosher, Scott W [ORNL; Evans, Thomas M [ORNL

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS, Consistent Adjoint Driven Importance Sampling. This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution, and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
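
    For reference, the standard space/energy CADIS relations, in the usual notation with $\phi^\dagger$ the deterministic adjoint flux, $q$ the true source, and $R$ the adjoint-weighted source strength (the extension described above adds the direction variable $\hat{\Omega}$ to the same expressions):

        R = \int \phi^\dagger(\mathbf{r}, E)\, q(\mathbf{r}, E)\, d\mathbf{r}\, dE, \qquad
        \hat{q}(\mathbf{r}, E) = \frac{\phi^\dagger(\mathbf{r}, E)\, q(\mathbf{r}, E)}{R}, \qquad
        \bar{w}(\mathbf{r}, E) = \frac{R}{\phi^\dagger(\mathbf{r}, E)}

    Sampling particles from the biased source $\hat{q}$ with birth weights matching the window targets $\bar{w}$ is what makes the source biasing and the weight windows consistent with each other.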

  4. Mechanism of Consistent Gyrus Formation: an Experimental and Computational Study

    Science.gov (United States)

    Zhang, Tuo; Razavi, Mir Jalil; Li, Xiao; Chen, Hanbo; Liu, Tianming; Wang, Xianqiao

    2016-11-01

    As a significant type of cerebral cortical convolution pattern, the gyrus is widely preserved across species. Although many hypotheses have been proposed to study the underlying mechanisms of gyrus formation, it is still far from clear which factors contribute to the regulation of consistent gyrus formation. In this paper, we employ a joint analysis scheme of experimental data and computational modeling to investigate the fundamental mechanism of gyrus formation. Experimental data on mature human brains and fetal brains show that thicker cortices are consistently found in gyral regions and that gyral cortices have higher growth rates. We hypothesize that gyral convolution patterns might stem from heterogeneous regional growth in the cortex. Our computational simulations show that gyral convex patterns may occur in locations where the cortical plate grows faster than the cortex of the brain. Global differential growth can only produce a random gyrification pattern; it cannot guarantee gyrus formation at certain locations. Based on extensive computational modeling and simulations, it is suggested that a special area in the cerebral cortex with a relatively faster growth speed could consistently engender gyri.

  5. The Different Varieties of the Suyama-Yamaguchi Consistency Relation

    CERN Document Server

    Rodriguez, Yeinzon

    2013-01-01

    We present the different consistency relations that can be seen as variations of the well-known Suyama-Yamaguchi (SY) consistency relation \tau_{NL} \geqslant (\frac{6}{5} f_{NL})^2, the latter involving the levels of non-gaussianity f_{NL} and \tau_{NL} in the primordial curvature perturbation \zeta, both taken to be scale-invariant. We explicitly state under which conditions the SY consistency relation has been claimed to hold in the different varieties (implicitly) presented in the literature since its inception back in 2008; as a result, we show for the first time that the variety \tau_{NL}(\mathbf{k}_1, \mathbf{k}_1) \geqslant (\frac{6}{5} f_{NL}(\mathbf{k}_1))^2, which we call "the fifth variety", is always satisfied as long as statistical homogeneity holds, even when there is strong scale dependence and a high level of statistical anisotropy: thus, an observed violation of this specific variety would preclude the comparison between theory and observation, thereby shaking the foundations of cosmology as a science.
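
    The bound itself follows from a Cauchy-Schwarz argument in the collapsed limit (a standard schematic, not this paper's derivation): with \sigma_{\mathbf{q}} denoting the long-wavelength part of \zeta^2,

        \langle \zeta_{\mathbf{q}} \, \sigma_{-\mathbf{q}} \rangle^2 \;\le\; \langle |\zeta_{\mathbf{q}}|^2 \rangle \, \langle |\sigma_{\mathbf{q}}|^2 \rangle
        \;\Longrightarrow\;
        \tau_{NL}(\mathbf{k}_1, \mathbf{k}_1) \geqslant \Big( \tfrac{6}{5} f_{NL}(\mathbf{k}_1) \Big)^2 ,

    since f_{NL} is read off from the \zeta-\sigma cross-correlation and \tau_{NL} from the \sigma auto-correlation; positivity of this quadratic form requires only statistical homogeneity.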

  6. Temporal and contextual consistency of leadership in homing pigeon flocks.

    Directory of Open Access Journals (Sweden)

    Carlos D Santos

    Full Text Available Organized flight of homing pigeons (Columba livia was previously shown to rely on simple leadership rules between flock mates, yet the stability of this social structuring over time and across different contexts remains unclear. We quantified the repeatability of leadership-based flock structures within a flight and across multiple flights conducted with the same animals. We compared two contexts of flock composition: flocks of birds of the same age and flight experience; and, flocks of birds of different ages and flight experience. All flocks displayed consistent leadership-based structures over time, showing that individuals have stable roles in the navigational decisions of the flock. However, flocks of balanced age and flight experience exhibited reduced leadership stability, indicating that these factors promote flock structuring. Our study empirically demonstrates that leadership and followership are consistent behaviours in homing pigeon flocks, but such consistency is affected by the heterogeneity of individual flight experiences and/or age. Similar evidence from other species suggests leadership as an important mechanism for coordinated motion in small groups of animals with strong social bonds.

  7. Dynamic self-consistent field theory for unentangled homopolymer fluids

    Science.gov (United States)

    Mihajlovic, Maja; Lo, Tak Shing; Shnidman, Yitzhak

    2005-10-01

    We present a lattice formulation of a dynamic self-consistent field (DSCF) theory that is capable of resolving interfacial structure, dynamics, and rheology in inhomogeneous, compressible melts and blends of unentangled homopolymer chains. The joint probability distribution of all the Kuhn segments in the fluid, interacting with adjacent segments and walls, is approximated by a product of one-body probabilities for free segments interacting solely with an external potential field that is determined self-consistently. The effect of flow on ideal chain conformations is modeled with finitely extensible, nonlinearly elastic dumbbells in the Peterlin approximation, and related to stepping probabilities in a random walk. Free segment and stepping probabilities generate statistical weights for chain conformations in a self-consistent field, and determine local volume fractions of chain segments. Flux balance across unit lattice cells yields mean field transport equations for the evolution of free segment probabilities and of momentum densities on the Kuhn length scale. Diffusive and viscous contributions to the fluxes arise from segmental hops modeled as a Markov process, with transition rates reflecting changes in segmental interaction, kinetic energy, and entropic contributions to the free energy under flow. We apply the DSCF equations to study both transient and steady-state interfacial structure, flow, and rheology in a sheared planar channel containing either a one-component melt or a phase-separated, two-component blend.
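
    The dumbbell model invoked above has a standard closed form: in the Peterlin (FENE-P) approximation the finitely extensible spring force is preaveraged over the conformation distribution (generic notation, with H the spring constant and r_max the maximum extension, not this paper's symbols):

        \mathbf{F}(\mathbf{r}) = \frac{H\,\mathbf{r}}{1 - \langle r^2 \rangle / r_{\max}^2} ,

    so the effective spring stiffens as flow stretches the chains, which is what couples the rheology to the self-consistent conformation statistics.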

  8. One-particle-irreducible consistency relations for cosmological perturbations

    Science.gov (United States)

    Goldberger, Walter D.; Hui, Lam; Nicolis, Alberto

    2013-05-01

    We derive consistency relations for correlators of scalar cosmological perturbations that hold in the “squeezed limit” in which one or more of the external momenta become soft. Our results are formulated as relations between suitably defined one-particle-irreducible N-point and (N-1)-point functions that follow from residual spatial conformal diffeomorphisms of the unitary gauge Lagrangian. As such, some of these relations are exact to all orders in perturbation theory and do not rely on approximate de Sitter invariance or other dynamical assumptions (e.g., properties of the operator product expansion or the behavior of modes at the horizon crossing). The consistency relations apply model-independently to cosmological scenarios in which the time evolution is driven by a single scalar field. Besides reproducing the known results for single-field inflation in the slow-roll limit, we verify that our consistency relations hold more generally, for instance, in ghost condensate models in flat space. We comment on possible extensions of our results to multifield models.
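
    For comparison, the best-known relation of this type is the single-field squeezed-limit statement for the three-point function, which the paper's 1PI relations reproduce in the slow-roll limit (standard form, not the paper's 1PI notation):

        \lim_{q \to 0} \langle \zeta_{\mathbf{q}} \, \zeta_{\mathbf{k}} \, \zeta_{-\mathbf{k}-\mathbf{q}} \rangle' = -(n_s - 1)\, P_\zeta(q)\, P_\zeta(k) ,

    i.e., a soft mode acts on the hard modes as a rescaling of spatial coordinates, which is precisely the residual spatial diffeomorphism exploited above.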

  9. Consistency of scalar potentials from quantum de Sitter space

    Science.gov (United States)

    Espinosa, José R.; Fortin, Jean-François; Trépanier, Maxime

    2016-06-01

    Consistency of the unconventional view of de Sitter space as a quantum theory of gravity with a finite number of degrees of freedom requires that Coleman-De Luccia tunneling rates to vacua with negative cosmological constant should be interpreted as recurrences to low-entropy states. This demand translates into two constraints, or consistency conditions, on the scalar potential that are generically as follows: (1) the distance in field space between the de Sitter vacuum and any other vacuum with negative cosmological constant must be of the order of the reduced Planck mass or larger and (2) the fourth root of the vacuum energy density of the de Sitter vacuum must be smaller than the fourth root of the typical scale of the scalar potential. These consistency conditions shed a different light on both outstanding hierarchy problems of the standard model of particle physics: the scale of electroweak symmetry breaking and the scale of the cosmological constant. Beyond the unconventional interpretation of quantum de Sitter space, we complete the analytic understanding of the thin-wall approximation of Coleman-De Luccia tunneling, extend its numerical analysis to generic potentials and discuss the role of gravity in stabilizing the standard model potential.
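
    Written compactly, the two generic consistency conditions quoted above read (a schematic transcription of the abstract; M_P is the reduced Planck mass and M the typical scale of the scalar potential):

        (1)\ \ |\Delta\phi| = |\phi_{\mathrm{AdS}} - \phi_{\mathrm{dS}}| \gtrsim M_P ,
        \qquad
        (2)\ \ V_{\mathrm{dS}}^{1/4} \lesssim M .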

  10. Using Bayesian Networks for Candidate Generation in Consistency-based Diagnosis

    Science.gov (United States)

    Narasimhan, Sriram; Mengshoel, Ole

    2008-01-01

    Consistency-based diagnosis relies heavily on the assumption that discrepancies between model predictions and sensor observations can be detected accurately. When sources of uncertainty like sensor noise and model abstraction exist, robust schemes have to be designed to make a binary decision on whether predictions are consistent with observations. This risks the occurrence of false alarms and missed alarms when an erroneous decision is made. Moreover, when multiple sensors (with differing sensing properties) are available, the degree of match between predictions and observations can be used to guide the search for fault candidates. In this paper we propose a novel approach to handling this problem using Bayesian networks. In the consistency-based diagnosis formulation, automatically generated Bayesian networks are used to encode a probabilistic measure of fit between predictions and observations. A Bayesian network inference algorithm is then used to compute the most probable fault candidates.
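
    A minimal Python sketch of the idea, using a deliberately tiny naive-Bayes stand-in for the automatically generated networks described above (the fault set, priors, and likelihoods are invented for illustration): each fault hypothesis is scored by how probable it makes the observed pattern of prediction/observation matches.

        # All structure and numbers below are invented for illustration.
        FAULTS = ["nominal", "valve_stuck", "sensor_bias"]
        PRIOR = {"nominal": 0.90, "valve_stuck": 0.06, "sensor_bias": 0.04}

        # P(sensor prediction matches observation | fault), per sensor.
        LIKELIHOOD = {
            "nominal":     {"flow": 0.98, "pressure": 0.98},
            "valve_stuck": {"flow": 0.10, "pressure": 0.60},
            "sensor_bias": {"flow": 0.80, "pressure": 0.15},
        }

        def rank_candidates(observed_match):
            """observed_match maps sensor name -> True if the model prediction
            matched the observation. Returns faults ranked by posterior."""
            scores = {}
            for fault in FAULTS:
                p = PRIOR[fault]
                for sensor, matched in observed_match.items():
                    p_match = LIKELIHOOD[fault][sensor]
                    p *= p_match if matched else (1.0 - p_match)
                scores[fault] = p
            total = sum(scores.values())
            return sorted(((f, s / total) for f, s in scores.items()),
                          key=lambda fs: -fs[1])

        # Flow disagrees with the model, pressure agrees: valve_stuck ranks first.
        print(rank_candidates({"flow": False, "pressure": True}))

    A real consistency-based diagnoser would generate the network structure from the system model rather than hard-coding it, but the candidate-ranking step is the same posterior computation.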

  11. Sport fans: evaluating the consistency between implicit and explicit attitudes toward favorite and rival teams.

    Science.gov (United States)

    Wenger, Jay L; Brown, Roderick O

    2014-04-01

    Sport fans often foster very positive attitudes for their favorite teams and less favorable attitudes for opponents. The current research was designed to evaluate the consistency that might exist between implicit and explicit measures of those attitudes. College students (24 women, 16 men) performed a version of the Implicit Association Test (IAT) related to their favorite and rival teams. Participants also reported their attitudes for these teams explicitly, via self-report instruments. When responding to the IAT, participants' responses were faster when they paired positive words with concepts related to favorite teams and negative words with rival teams, indicating implicit favorability for favorite teams and implicit negativity for rival teams. This pattern of implicit favorability and negativity was consistent with what participants reported explicitly via self-report. The importance of evaluating implicit attitudes and the corresponding consistency with explicit attitudes are discussed.
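
    The record does not state its scoring algorithm; studies of this kind conventionally quantify the latency difference with a D score. A simplified Python sketch (the full Greenwald scoring algorithm also handles error trials and exclusions), with invented latencies:

        from statistics import mean, stdev

        def iat_d_score(compatible_ms, incompatible_ms):
            """Simplified IAT D score: mean latency difference over pooled SD.
            compatible_ms: latencies when favorite team + positive share a key;
            incompatible_ms: latencies when favorite team + negative share a key.
            Positive D indicates implicit favorability for the favorite team."""
            pooled = compatible_ms + incompatible_ms
            return (mean(incompatible_ms) - mean(compatible_ms)) / stdev(pooled)

        # Invented latencies (ms): faster in the compatible pairing, as reported.
        print(iat_d_score([650, 700, 680, 720], [820, 900, 860, 880]))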

  12. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-04-01

    When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach). This methodology considers a period to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed at each observation by indicating the outermost data points for which the model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency, which, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each station, a BReach analysis is performed and subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model evaluation design about consistent time periods to analyze).
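
    A minimal Python sketch of the reach computation in one direction (the rating-curve form, the fixed relative tolerance, and all names are illustrative assumptions; the published BReach method works with observational uncertainty bounds and investigates both directions):

        import numpy as np

        def power_law_fit(stages, discharges):
            """Least-squares power-law rating curve Q = a * h^b (a common choice)."""
            b, log_a = np.polyfit(np.log(stages), np.log(discharges), 1)
            return lambda h: np.exp(log_a) * h ** b

        def max_right_reach(stages, discharges, tol=0.1):
            """For each start index i, the outermost index j >= i such that one
            rating curve fitted on observations i..j describes all of them within
            a relative tolerance. A drop in the reach series across the record
            flags a possible change in data consistency."""
            n = len(stages)
            reach = []
            for i in range(n):
                best = i
                for j in range(i + 1, n):
                    model = power_law_fit(stages[i:j + 1], discharges[i:j + 1])
                    if all(abs(model(h) - q) <= tol * q
                           for h, q in zip(stages[i:j + 1], discharges[i:j + 1])):
                        best = j
                    else:
                        break
                reach.append(best)
            return reach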

  13. Indexing Learning Objects: Vocabularies and Empirical Investigation of Consistency

    Science.gov (United States)

    Kabel, Suzanne; De Hoog, Robert; Wielinga, Bob; Anjewierden, Anjo

    2004-01-01

    In addition to the LOM standard and instructional design specifications, as well as domain-specific indexing vocabularies, a structured indexing vocabulary for the more elementary learning objects is advisable in order to support the retrieval tasks of developers. Furthermore, because semantic indexing is seen as a difficult task, three issues…

  14. How Expert Designers Design

    NARCIS (Netherlands)

    C. Carr; Dr. Peter Sloep; P. Kirschner; J. van Merrienboer

    2003-01-01

    This paper discusses two studies - the one in a business context, the other in a university context - carried out with expert educational designers. The studies aimed to determine the priorities experts claim to employ when designing competence-based learning environments. Designers in both contexts

  15. Open, Prospective, Multi-Center, Two-Part Study of Patient Preference with Monthly Ibandronate Therapy in Women with Postmenopausal Osteoporosis Switched From Daily or Weekly Alendronate or Risedronate (BONCURE): Results of the Turkish Sub-Study

    Directory of Open Access Journals (Sweden)

    Nurten Eskiyurt

    2012-04-01

    Aim: BONCURE (Bonviva for Current Bisphosphonate Users Regional European Trial) aimed to evaluate patient preference for monthly ibandronate in women with postmenopausal osteoporosis who had previously received daily or weekly alendronate or risedronate. Materials and Methods: This prospective, open-label study consisted of two sequential stages, Part A (screening) and Part B (treatment). Patients enrolled into Part A completed the Candidate Identification Questionnaire (CIQ). In Part B, after completing the Osteoporosis Patient Satisfaction Questionnaire (OPSAT-Q), patients received monthly oral ibandronate 150 mg for 6 months. Following treatment, patients completed the OPSAT-Q and a Preference Questionnaire. Results: A total of 223 patients (mean age, 63.7±9.51 years) were enrolled in Part A from Turkey. Among them, 103 (46.2%) answered "YES" to at least one CIQ question. The mean composite OPSAT-Q domain scores increased for convenience (mean change, 15.3±17.7 points), quality of life (10.4±20.4 points), overall satisfaction (11.9±22.7 points), and side effects (3.3±18.8 points). At month 6, 177 subjects (92.7%) preferred the once-monthly dosing schedule and 99.0% were compliant (≥80%) with study treatment. Thirty (15.6%) subjects experienced mild to moderate adverse events, mostly gastrointestinal. Conclusion: Postmenopausal women with osteoporosis prefer, and are more satisfied and compliant with, monthly dosing of ibandronate than daily or weekly bisphosphonate treatment. (Turkish Journal of Osteoporosis 2012;18:1-7)

  16. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Institute of Scientific and Technical Information of China (English)

    Laura R. STEIN; Alison M. BELL

    2012-01-01

    There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45-52, 2012].

  17. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves, based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to the different methodologies employed for the various damage models in different countries, damage assessments cannot be directly compared with each other, also obstructing supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth, as well as maximum damage values for a variety of assets and land-use classes (e.g., residential, commercial, agriculture). Based on an extensive literature survey, concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra-national flood damage assessments.
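
    Operationally, such a dataset supports a two-step damage computation: look up the fractional damage at the flooded depth, then scale by the country- and class-specific maximum damage value. A minimal Python sketch with invented curve points and values (not numbers from the dataset):

        import numpy as np

        # Illustrative concave depth-damage curve for one land-use class:
        # water depth (m) -> damage fraction. Values are invented.
        DEPTHS   = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 6.0])
        FRACTION = np.array([0.00, 0.25, 0.40, 0.60, 0.75, 1.00])

        def flood_damage(depth_m, max_damage_per_m2, area_m2):
            """Direct damage = interpolated damage fraction x max damage x area."""
            frac = np.interp(depth_m, DEPTHS, FRACTION)  # clamps outside the range
            return frac * max_damage_per_m2 * area_m2

        # 1.5 m of water over 200 m2 of residential floor space at 600 currency/m2.
        print(flood_damage(1.5, 600.0, 200.0))  # -> 0.5 * 600 * 200 = 60000.0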

  18. Consistency and sealing of advanced bipolar tissue sealers

    Directory of Open Access Journals (Sweden)

    Chekan EG

    2015-04-01

    Edward G Chekan, Mark A Davison, David W Singleton, John Z Mennone, Piet Hinoul; Ethicon, Inc., Cincinnati, OH, USA. Objectives: The aim of this study was to evaluate two commonly used advanced bipolar devices (ENSEAL® G2 Tissue Sealers and LigaSure™ Blunt Tip) for compression uniformity, vessel sealing strength, and consistency in bench-top analyses. Methods: Compression analysis was performed with a foam pad/sensor apparatus inserted between the closed jaws of the instruments. Average pressures (psi) were recorded across the entire inside surface of the jaws, and over the distal one-third of the jaws. To test vessel sealing strength, ex vivo pig carotid arteries were sealed and transected, and the left and right (sealed) halves of the vessels were subjected to burst pressure testing. The maximum bursting pressures of the two halves of each vessel were averaged to obtain single data points for analysis. The absence or presence of tissue sticking to the device jaws was noted for each transected vessel. Results: Statistically higher average compression values were found for the ENSEAL® instruments (curved jaw and straight jaw) compared to LigaSure™ (P<0.05). Moreover, the ENSEAL® devices retained full compression at the distal end of the jaws. Significantly higher and more consistent median burst pressures were noted for the ENSEAL® devices relative to LigaSure™ through 52 firings of each device (P<0.05). LigaSure™ showed a significant reduction in median burst pressure for the final three firings (cycles 50-52) versus the first three firings (cycles 1-3) (P=0.027). Tissue sticking was noted for 1.39% and 13.3% of vessels transected with ENSEAL® and LigaSure™, respectively. Conclusion: In bench-top testing, ENSEAL® G2 sealers produced more uniform compression, stronger and more consistent vessel sealing, and reduced tissue sticking relative to LigaSure™. Keywords: ENSEAL, sealing, burst pressure, laparoscopic, compression, LigaSure

  19. Consistency of EEG source localization and connectivity estimates.

    Science.gov (United States)

    Mahjoory, Keyvan; Nikulin, Vadim V; Botrel, Loïc; Linkenkaer-Hansen, Klaus; Fato, Marco M; Haufe, Stefan

    2017-05-15

    As the EEG inverse problem does not have a unique solution, the sources reconstructed from EEG and their connectivity properties depend on forward and inverse modeling parameters such as the choice of an anatomical template and electrical model, prior assumptions on the sources, and further implementational details. In order to use source connectivity analysis as a reliable research tool, there is a need for stability across a wider range of standard estimation routines. Using resting state EEG recordings of N=65 participants acquired within two studies, we present the first comprehensive assessment of the consistency of EEG source localization and functional/effective connectivity metrics across two anatomical templates (ICBM152 and Colin27), three electrical models (BEM, FEM and spherical harmonics expansions), three inverse methods (WMNE, eLORETA and LCMV), and three software implementations (Brainstorm, Fieldtrip and our own toolbox). Source localizations were found to be more stable across reconstruction pipelines than subsequent estimations of functional connectivity, while effective connectivity estimates were the least consistent. All results were relatively unaffected by the choice of the electrical head model, while the choice of the inverse method and source imaging package induced considerable variability. In particular, a relatively strong difference was found between LCMV beamformer solutions on one hand and eLORETA/WMNE distributed inverse solutions on the other hand. We also observed a gradual decrease of consistency when results are compared between studies, within individual participants, and between individual participants. In order to provide reliable findings in the face of the observed variability, additional simulations involving interacting brain sources are required. Meanwhile, we encourage verification of the obtained results using more than one source imaging procedure.
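
    One simple way to quantify cross-pipeline consistency of the kind assessed here (an illustration only; the study's actual metrics are richer) is to correlate the unique connectivity entries produced by two pipelines for the same recording. A minimal Python sketch with toy matrices:

        import numpy as np

        def pipeline_consistency(conn_a, conn_b):
            """Correlate the off-diagonal entries of two symmetric connectivity
            matrices produced by different source-analysis pipelines."""
            a, b = np.asarray(conn_a), np.asarray(conn_b)
            iu = np.triu_indices_from(a, k=1)   # unique source pairs only
            return np.corrcoef(a[iu], b[iu])[0, 1]

        # Toy 3-source example: two pipelines that largely agree.
        A = np.array([[0.0, 0.8, 0.1], [0.8, 0.0, 0.4], [0.1, 0.4, 0.0]])
        B = np.array([[0.0, 0.7, 0.2], [0.7, 0.0, 0.5], [0.2, 0.5, 0.0]])
        print(pipeline_consistency(A, B))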

  20. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Directory of Open Access Journals (Sweden)

    Laura R. STEIN, Alison M. BELL

    2012-02-01

    There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45-52, 2012].