Modeling and Testing Legacy Data Consistency Requirements
Nytun, J. P.; Jensen, Christian Søndergaard
2003-01-01
An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. ... This paper addresses the need for new techniques that enable modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its ... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers ...
75 FR 1356 - RC2 Corporation, Provisional Acceptance of a Settlement Agreement and Order
2010-01-11
... manufactured by OW rather than by RC2's other Chinese contract manufacturer. RC2 cross-referenced each of the...) any claims under the Equal Access to Justice Act. 25. The Commission may publicize the terms of...
Testing the visual consistency of web sites
Geest, van der Thea; Loorbach, Nicole
2005-01-01
Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to agree ...
Data of evolutionary structure change: 1E1RC-2HLDT [Confc[Archive
Full Text Available — structural alignment record for 1E1RC-2HLDT (PDB entries 1E1R chain C and 2HLD chain T); raw sequence-alignment data not reproduced.
Data of evolutionary structure change: 1E1RC-2HLDA [Confc[Archive
Full Text Available — structural alignment record for 1E1RC-2HLDA (PDB entries 1E1R chain C and 2HLD chain A); raw sequence-alignment data not reproduced.
RC2-a new expert system to interpret signals
Grumbach, A.; Ducasse, M.
1983-12-01
A new approach to word recognition, the blackboard or Hearsay system, was put forward by Erman. The article devotes considerable attention to an example showing how a sentence is constructed starting from phonemes and moving up through words, groups of words, and phrases. Classic expert systems consist of a knowledge base and a base of temporary information from which inferences are drawn, together with an interpreter that activates these two bases. The Hearsay expert system differs in several respects: the knowledge base consists of several independent modules (KSs, or knowledge sources) that communicate by means of the blackboard, a base of structured facts memorizing the hypotheses supplied by the modules. The hypotheses are divided into phonetic, lexical, syntactic, and semantic knowledge. The phonemes provided are grouped into words by the phonemes-to-words KS, then into groups of words by the words-to-word-groups KS, and finally into a phrase. The problem of word recognition is not completely solved, but the principal objective of constructing a kernel has been attained. Recognition of a 10-word phrase takes about 45 minutes. Considerable work remains to be done on the knowledge base. 4 references.
Migraine patients consistently show abnormal vestibular bedside tests
Eliana Teixeira Maranhão
2015-01-01
Full Text Available Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and co-morbidity around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. Objective: To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses as measured by 14 bedside tests are abnormal in migraineurs without vertigo, as compared with controls. Method: Cross-sectional study including sixty individuals – thirty migraineurs (25 women, 19-60 years old) and 30 age- and gender-matched healthy controls. Results: Migraineurs showed a tendency to perform worse in almost all tests, albeit only the tandem Romberg test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Conclusion: Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.
Testing consistency of general relativity with kinematic and dynamical probes
Duan, Xiao-Wei; Zhang, Tong-Jie
2016-01-01
In this work, we test consistency relations between a kinematic probe, the observational Hubble data, and a dynamical probe, the growth rates of cosmic large-scale structure, which should hold if general relativity is the correct theory of gravity on cosmological scales. Moreover, we summarize the development history of the parametrizations used in such tests and improve on them. Taking advantage of the Hubble parameter obtained from both parametric and non-parametric methods, we propose three consistency equations and test two of them by means of two-dimensional parametrizations, including one based on trigonometric functions that we propose. As a result, we find that the consistency relations are satisfied at the $1\sigma$ CL and that trigonometric functions are efficient tools for parametrization. Furthermore, in order to confirm the validity of our test, we introduce a model of modified gravity, the DGP model, and compare the testing results for $\Lambda$CDM, "DGP in GR", and the DGP model with mock data. It...
Product consistency testing of West Valley Compositional Variation Glasses
Olson, K.M.; Marschman, S.C.; Piepel, G.F.; Whiting, G.K.
1994-11-01
Nuclear waste glass produced by the West Valley Demonstration Project (WVDP) must meet the requirements of the Waste Acceptance Preliminary Specification (WAPS) as developed by the US Department of Energy (DOE). To assist WVDP in complying with WAPS, the Materials Characterization Center (MCC) at Pacific Northwest Laboratory (PNL) used the Product Consistency Test (PCT) to evaluate 44 West Valley glasses that had previously been tested in FY 1987 and FY 1988. This report summarizes the results of the PCTs. The glasses tested, which were fabricated as sets of Compositional Variation Glasses for studies performed by the West Valley Support Task (WVST) at PNL during FY 1987 and FY 1988, were doped with Th and U and were variations of West Valley reference glasses. In addition, Approved Reference Material-1 (ARM-1) was used as a test standard (ARM-1 is supplied by the MCC). The PCT originated at Westinghouse Savannah River Company (WSRC) with C. M. Jantzen and N. R. Bibler (Jantzen and Bibler 1989). The test is a seven-day modified MCC-3 test that uses crushed glass in the size range -100 +200 mesh with deionized water in a Teflon container. There is no agitation during the PCT, and no attempt to exclude CO2 from the test environment. Based on B and Li release, the glasses performed about the same as in previous modified MCC-3 testing performed in FY 1987 and FY 1988 (Reimus et al. 1988). The modified MCC-3 tests performed by Reimus et al. were similar to the PCT except for the containers used and the exclusion of CO2 from the tests.
The Product Consistency Test: How and Why It Was Developed
Jantzen, C.; Bibler, N.
2008-12-15
The Product Consistency Test (PCT), American Society for Testing and Materials (ASTM) Standard C1285, is currently used worldwide for testing glass and glass-ceramic waste forms for high level waste (HLW), low level waste (LLW), and hazardous wastes. Development of the PCT was initiated in 1986 because HLW glass waste forms required extensive characterization before actual production began and required continued characterization during production (≥25 years). Non-radioactive startup was in 1994 and radioactive startup was in 1996. The PCT underwent extensive development from 1986-1994 and became an ASTM consensus standard in 1994. During the extensive laboratory testing and inter- and intra-laboratory round robins using non-radioactive and radioactive glasses, the PCT was shown to be very reproducible, to yield reliable results rapidly, to distinguish between glasses of different durability and homogeneity, and to be easily performed in shielded cell facilities with radioactive samples. In 1997, the scope was broadened to include hazardous and mixed (radioactive and hazardous) waste glasses. In 2002, the scope was broadened to include glass-ceramic waste forms, which are currently being recommended for second-generation nuclear wastes yet to be generated in the nuclear renaissance. Since the PCT has proven useful for glass-ceramics with up to 75% ceramic component and has been used to evaluate Pu ceramic waste forms, the use of this test for other ceramic/mineral waste forms such as geopolymers, hydroceramics, and fluidized bed steam reformer mineralized product is under investigation.
Testing the consistency between cosmological measurements of distance and age
Remya Nair
2015-05-01
Full Text Available We present a model-independent method to test the consistency between cosmological measurements of distance and age, assuming the distance duality relation. We use type Ia supernovae, baryon acoustic oscillations, and observational Hubble data to reconstruct the luminosity distance DL(z), the angle-averaged distance DV(z), and the Hubble rate H(z), using the Gaussian process regression technique. We obtain estimates of the distance duality relation in the redshift range 0.1
Derivation of the human embryonic stem cell line RCe006-A (RC-2)
P.A. De Sousa; B. Tye; Bruce, K.; P. Dand; RUSSELL, G.; Gardner, J.; J.M. Downie; M. Bateman; A. Courtney
2016-01-01
The human embryonic stem cell line RCe006-A (RC-2) was derived from a frozen and thawed blastocyst voluntarily donated as surplus to fertility requirements following ethics committee approved informed consent under licence from the UK Human Fertilisation and Embryology Authority. The cell line exhibits expression of expected pluripotency markers and in vitro differentiation potential to three germinal lineage representative cell populations. It has a male trisomy 12 karyotype (47XY, +12). Mic...
Peyroux, Elodie; Franck, Nicolas
2014-01-01
In people with psychiatric disorders, particularly those suffering from schizophrenia and related illnesses, pronounced difficulties in social interactions are a key manifestation. These difficulties can be partly explained by impairments in social cognition, defined as the ability to understand oneself and others in the social world, which includes abilities such as emotion recognition, theory of mind (ToM), attributional style, and social perception and knowledge. The impact of several kinds of interventions on social cognition has been studied recently. The best outcomes in the area of social cognition in schizophrenia are those obtained by way of cognitive remediation programs. New strategies and programs in this line are currently being developed, such as RC2S (cognitive remediation of social cognition) in Lyon, France. Considering that the social cognitive deficits experienced by patients with schizophrenia are very diverse, and that the main objective of social cognitive remediation programs is to improve patients' functioning in their daily social life, RC2S was developed as an individualized and flexible program that allows patients to practice social interaction in a realistic environment through the use of virtual reality techniques. In the RC2S program, the patient's goal is to assist a character named Tom in various social situations. The underlying idea for the patient is to acquire cognitive strategies for analyzing social context and emotional information in order to understand other characters' mental states and to help Tom manage his social interactions. In this paper, we begin by presenting some data regarding the social cognitive impairments found in schizophrenia and related disorders, and we describe how these deficits are targeted by social cognitive remediation. Then we present the RC2S program and discuss the advantages of computer-based simulation to improve social cognition and social functioning in people with psychiatric disorders.
Elodie ePEYROUX
2014-06-01
Full Text Available In people with psychiatric disorders, particularly those suffering from schizophrenia and related illnesses, pronounced difficulties in social interactions are a key manifestation. These difficulties can be partly explained by impairments in social cognition, defined as the ability to understand oneself and others in the social world, which includes abilities such as emotion recognition, theory of mind, attributional style, and social perception and knowledge. The impact of several kinds of interventions on social cognition has been studied recently. The best outcomes in the area of social cognition in schizophrenia are those obtained by way of cognitive remediation programs. New strategies and programs in this line are currently being developed, such as RC2S (Cognitive Remediation of Social Cognition) in Lyon, France. Considering that the social cognitive deficits experienced by patients with schizophrenia are very diverse, and that the main objective of social cognitive remediation programs is to improve patients’ functioning in their daily social life, RC2S was developed as an individualized and flexible program that allows patients to practice social interaction in a realistic environment through the use of virtual-reality techniques. In the RC2S program, the patient’s goal is to assist a character named Tom in various social situations. The underlying idea for the patient is to acquire cognitive strategies for analyzing social context and emotional information in order to understand other characters’ mental states and to help Tom manage his social interactions. In this paper, we begin by presenting some data regarding the social cognitive impairments found in schizophrenia and related disorders, and we describe how these deficits are targeted by social cognitive remediation. Then we present the RC2S program and discuss the advantages of computer-based simulation to improve social cognition and social functioning in people with psychiatric disorders.
Derivation of the human embryonic stem cell line RCe006-A (RC-2)
P.A. De Sousa
2016-03-01
Full Text Available The human embryonic stem cell line RCe006-A (RC-2) was derived from a frozen and thawed blastocyst voluntarily donated as surplus to fertility requirements following ethics committee approved informed consent under licence from the UK Human Fertilisation and Embryology Authority. The cell line exhibits expression of expected pluripotency markers and in vitro differentiation potential to three germinal lineage representative cell populations. It has a male trisomy 12 karyotype (47XY, +12). Microsatellite DNA marker identity and HLA and blood group typing data are available.
Improving social cognition in people with schizophrenia with RC2S: two single-case studies
Elodie ePEYROUX
2016-04-01
Full Text Available Difficulties in social interactions are a central characteristic of people with schizophrenia, and can be partly explained by impairments of social cognitive processes. New strategies of cognitive remediation have been recently developed to target these deficits. The RC2S therapy is an individualized and partly computerized program through which patients practice social interactions and develop social cognitive abilities with simulation techniques in a realistic environment. Here we present the results of two case studies involving two patients with schizophrenia presenting with specific profiles of impaired social cognition. Each patient completed three baseline sessions, 14 treatment sessions, and three follow-up sessions at the end of the therapy – and for one patient, another three sessions nine months later. We used a multiple baseline design to assess specific components of social cognition according to the patients’ profiles. Functioning and symptomatology were also assessed at the end of the treatment and six months later. Results highlight significant improvements in the targeted social cognitive processes and positive changes in functioning in the long term. The RC2S program thus seems to be a useful new program for social cognitive remediation in schizophrenia.
Improving Social Cognition in People with Schizophrenia with RC2S: Two Single-Case Studies.
Peyroux, Elodie; Franck, Nicolas
2016-01-01
Difficulties in social interactions are a central characteristic of people with schizophrenia, and can be partly explained by impairments of social cognitive processes. New strategies of cognitive remediation have been recently developed to target these deficits. The RC2S therapy is an individualized and partly computerized program through which patients practice social interactions and develop social cognitive abilities with simulation techniques in a realistic environment. Here, we present the results of two case studies involving two patients with schizophrenia presenting with specific profiles of impaired social cognition. Each patient completed three baseline sessions, 14 treatment sessions, and three follow-up sessions at the end of the therapy - and for one patient, another three sessions nine months later. We used a multiple baseline design to assess specific components of social cognition according to the patients' profiles. Functioning and symptomatology were also assessed at the end of the treatment and six months later. Results highlight significant improvements in the targeted social cognitive processes and positive changes in functioning in the long term. The RC2S program thus seems to be a useful new program for social cognitive remediation in schizophrenia.
The Logical Consistency of Simultaneous Agnostic Hypothesis Tests
Luís G. Esteves
2016-07-01
Full Text Available Simultaneous hypothesis tests can fail to provide results that meet logical requirements. For example, if A and B are two statements such that A implies B, there exist tests that, based on the same data, reject B but not A. Such outcomes are generally inconvenient to statisticians (who want to communicate the results to practitioners in a simple fashion) and non-statisticians (confused by conflicting pieces of information). Based on this inconvenience, one might want to use tests that satisfy logical requirements. However, Izbicki and Esteves show that the only tests that are in accordance with three logical requirements (monotonicity, invertibility and consonance) are trivial tests based on point estimation, which generally lack statistical optimality. As a possible solution to this dilemma, this paper adapts the above logical requirements to agnostic tests, in which one can accept, reject or remain agnostic with respect to a given hypothesis. Each of the logical requirements is characterized in terms of a Bayesian decision theoretic perspective. Contrary to the results obtained for regular hypothesis tests, there exist agnostic tests that satisfy all logical requirements and also perform well statistically. In particular, agnostic tests that fulfill all logical requirements are characterized as region estimator-based tests. Examples of such tests are provided.
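The region-estimator characterization above lends itself to a compact sketch. The following Python snippet is an illustrative sketch, not the authors' implementation: the t-based confidence interval and the interval hypotheses are assumptions chosen for the example. It accepts when the interval lies inside the hypothesis, rejects when the two are disjoint, and stays agnostic otherwise; nested hypotheses then satisfy monotonicity automatically, since rejecting the wider hypothesis forces rejecting the narrower one.

```python
import numpy as np
from scipy import stats

def agnostic_test(sample, hypothesis, alpha=0.05):
    """Region-estimator-based agnostic test for a mean (illustrative sketch).

    hypothesis: (lo, hi) interval asserted for the true mean.
    Returns 'accept' if the confidence interval lies inside the hypothesis,
    'reject' if the two intervals are disjoint, and 'agnostic' otherwise.
    """
    n = len(sample)
    m, se = np.mean(sample), stats.sem(sample)
    ci_lo, ci_hi = stats.t.interval(1 - alpha, n - 1, loc=m, scale=se)
    lo, hi = hypothesis
    if lo <= ci_lo and ci_hi <= hi:
        return "accept"
    if ci_hi < lo or ci_lo > hi:
        return "reject"
    return "agnostic"

rng = np.random.default_rng(0)
x = rng.normal(3.0, 1.0, size=50)
# A: mean in [-0.1, 0.1] implies B: mean in [-1, 1];
# monotonicity: whenever B is rejected, A must be rejected too.
print(agnostic_test(x, (-0.1, 0.1)))  # hypothesis A
print(agnostic_test(x, (-1.0, 1.0)))  # hypothesis B
```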
Still just 1 g: Consistent results from five test batteries
Johnson, W.; te Nijenhuis, J.; Bouchard, T.J.Jr.
2008-01-01
In a recent paper, Johnson, Bouchard, Krueger, McGue, and Gottesman (2004) addressed a long-standing debate in psychology by demonstrating that the g factors derived from three test batteries administered to a single group of individuals were completely correlated. This finding provided evidence for
Internal Consistency and Bias Considerations of the Goodenough-Harris Draw-A-Person Test.
Strommen, Erik F.; Smith, Jeffrey K.
1987-01-01
The internal consistency of the Goodenough-Harris Draw-A-Person Test was examined using 150 children, aged 5-8. The 72-item full scales showed good internal consistency at all ages, with no sex differences. Administration of a 42-item short form resulted in sex effects and differential internal consistency. (Author/GDC)
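Internal consistency of a multi-item scale such as the Draw-A-Person full scale is conventionally summarized by Cronbach's alpha. A minimal sketch of the computation, on synthetic data rather than the study's scores (the item count and noise level are invented):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# synthetic scale: 6 items, each a noisy reading of one latent trait
rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))
scores = trait + 0.5 * rng.normal(size=(200, 6))
print(f"alpha = {cronbach_alpha(scores):.2f}")
```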
A diagnostic test for apraxia in stroke patients: internal consistency and diagnostic value.
Heugten, C.M. van; Dekker, J.; Deelman, B.G.; Stehmann-Saris, F.C.; Kinebanian, A.
1999-01-01
The internal consistency and the diagnostic value of a test for apraxia in patients having had a stroke are presented. Results indicate that the items of the test form a strong and consistent scale: Cronbach's alpha as well as the results of a Mokken scale analysis present good reliability and good
A Diagnostic Test for Apraxia in Stroke Patients : Internal consistency and diagnostic value
van Heugten, C.M.; Dekker, J.; Deelman, B.G.; Stehmann-Saris, J.C; Kinebanian, A
1999-01-01
The internal consistency and the diagnostic value of a test for apraxia in patients having had a stroke are presented. Results indicate that the items of the test form a strong and consistent scale: Cronbach's alpha as well as the results of a Mokken scale analysis present good reliability and good
Personal Hypothesis Testing: The Role of Consistency and Self-Schema.
Strohmer, Douglas C.; And Others
1988-01-01
Studied how individuals test hypotheses about themselves. Examined extent to which Snyder's bias toward confirmation persists when negative or nonconsistent personal hypothesis is tested. Found negativity or positivity did not affect hypothesis testing directly, though hypothesis consistency did. Found cognitive schematic variable (vulnerability…
Testing a Variety of Encryption Technologies
Henson, T J
2001-04-09
Review and test speeds of various encryption technologies using Entrust Software. Multiple encryption algorithms are included in the product. The algorithms tested were IDEA, CAST, DES, and RC2. The test consisted of taking a 7.7 MB Word document file that included complex graphics and timing encryption, decryption, and signing. Encryption is discussed in the GIAC Kickstart section: Information Security: The Big Picture--Part VI.
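The timing methodology in this abstract can be reproduced with modern tooling. The sketch below is an assumption-laden analogue, not the report's procedure: it uses the third-party `cryptography` package and AES-128-CBC, since IDEA, CAST, DES, and RC2 are legacy ciphers absent from mainstream current APIs, and Entrust's toolkit is not assumed available. It times bulk encryption of a ~7.7 MB random buffer, mirroring the report's file size.

```python
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def time_encrypt(data: bytes, runs: int = 5) -> float:
    """Best-of-`runs` wall-clock time to CBC-encrypt `data` with AES-128."""
    key, iv = os.urandom(16), os.urandom(16)
    best = float("inf")
    for _ in range(runs):
        encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
        t0 = time.perf_counter()
        encryptor.update(data)
        encryptor.finalize()
        best = min(best, time.perf_counter() - t0)
    return best

# ~7.7 MB payload, mirroring the Word file in the report; the length is kept
# a multiple of the 16-byte AES block so no padding step is needed.
payload = os.urandom(7_700_000)
print(f"AES-128-CBC, best of 5 runs: {time_encrypt(payload):.4f} s")
```

Taking the best of several runs reduces timer and scheduler noise, which dominates at millisecond scales.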
A Bayesian Decision-Theoretic Approach to Logically-Consistent Hypothesis Testing
Gustavo Miranda da Silva
2015-09-01
Full Text Available This work addresses an important issue regarding the performance of simultaneous test procedures: the construction of multiple tests that at the same time are optimal from a statistical perspective and that also yield logically-consistent results that are easy to communicate to practitioners of statistical methods. For instance, if hypothesis A implies hypothesis B, is it possible to create optimal testing procedures that reject A whenever they reject B? Unfortunately, several standard testing procedures fail in having such logical consistency. Although this has been deeply investigated under a frequentist perspective, the literature lacks analyses under a Bayesian paradigm. In this work, we contribute to the discussion by investigating three rational relationships under a Bayesian decision-theoretic standpoint: coherence, invertibility and union consonance. We characterize and illustrate through simple examples optimal Bayes tests that fulfill each of these requisites separately. We also explore how far one can go by putting these requirements together. We show that although fairly intuitive tests satisfy both coherence and invertibility, no Bayesian testing scheme meets the desiderata as a whole, strengthening the understanding that logical consistency cannot be combined with statistical optimality in general. Finally, we associate Bayesian hypothesis testing with Bayes point estimation procedures. We prove the performance of logically-consistent hypothesis testing by means of a Bayes point estimator to be optimal only under very restrictive conditions.
Shade, J.W.; Piepel, G.F.
1991-06-01
It is desirable to have a means of monitoring possible changes in waste glass durability during production so that the product remains within acceptable limits. A leach test called the Product Consistency Test (PCT) was developed by Savannah River Laboratory (SRL) as such a production test for the Defense Waste Processing Facility (DWPF). This report examines some of the experimental factors that may be used in the PCT that could influence test precision and its ability to function as intended. An experiment was performed to investigate the effects (on pH and elemental releases of Al, Fe, K, Na, Si, B, Li, and Mn) of modifications to the test conditions of the Product Consistency Test (PCT). The experiment was replicated three times; each replicate involved leach testing two glasses with each of 24 different sets of PCT conditions. 6 refs., 1 fig., 12 tabs.
The consistency test does not - and cannot - deliver what is advertised : A comment on Francis
Morey, Richard D.
2013-01-01
The statistical consistency test of Ioannidis and Trikalinos (2007) has been used recently by Francis (2012a, c, d, e, 2013, in press) to argue that specific sets of experiments show evidence of publication bias. I argue that the test is unnecessary because publication bias exists almost everywhere as p
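The Ioannidis-Trikalinos consistency (excess-significance) test at issue compares the observed number of significant studies O with the expected number E, the sum of the studies' estimated powers. The sketch below is a hedged illustration: the one-sample z-test power approximation, the chi-square form of the statistic, and all inputs are assumptions invented for the example, not data from any of the cited papers.

```python
import numpy as np
from scipy import stats

def excess_significance(ns, d, n_sig, alpha=0.05):
    """Excess-significance check in the style of Ioannidis-Trikalinos (sketch).

    ns: per-study sample sizes (one-sample z-test approximation)
    d: assumed true standardized effect size
    n_sig: observed number of significant studies
    Returns (E, A, p): expected significant count, chi-square statistic, p-value.
    """
    z = stats.norm.ppf(1 - alpha / 2)
    powers = np.array([stats.norm.sf(z - d * np.sqrt(n))
                       + stats.norm.cdf(-z - d * np.sqrt(n)) for n in ns])
    E, n = powers.sum(), len(ns)
    A = (n_sig - E) ** 2 / E + (n_sig - E) ** 2 / (n - E)
    p = stats.chi2.sf(A, df=1)
    return E, A, p

# five small hypothetical studies of a modest effect, all reported significant
E, A, p = excess_significance([20, 25, 30, 22, 28], d=0.3, n_sig=5)
print(f"expected significant: {E:.2f}, A = {A:.2f}, p = {p:.4f}")
```

With underpowered studies, five significant results out of five is far above expectation, so the test flags the set, which is the pattern Francis's analyses rely on.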
Thermodynamic Consistency Test for Binary Gas+Water Equilibrium Data at Low and High Pressures
Claudio A. Faúndez; Felipe A. Quiero; José O. Valderrama
2013-01-01
Phase equilibrium in binary gas+water mixtures over wide ranges of temperatures and pressures is modeled and tested for thermodynamic consistency. For modeling, the Peng-Robinson equation of state was used and the Wong-Sandler mixing rules were incorporated into the equation of state parameters. In the Wong-Sandler mixing rules the van Laar model for the excess Gibbs energy was applied. In addition, a reasonable and flexible method is applied to test the thermodynamic consistency of pressure-temperature-concentration (P-T-x) data of these binary mixtures. Modeling is found acceptable in all cases, meaning that deviations in correlating the pressure and the gas phase concentration are low. For all cases the thermodynamic consistency method gives a clear conclusion about consistency or inconsistency of a set of experimental P-T-x data.
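The modeling step rests on the Peng-Robinson equation of state. As a minimal sketch, the pure-component PR pressure can be written down directly from the standard published constants; the CO2 critical properties and the evaluation state below are illustrative assumptions, not data from the paper, and the mixture form with Wong-Sandler mixing rules adds considerably more machinery.

```python
import math

R = 8.314  # J/(mol K)

def pr_pressure(T, v, Tc, Pc, omega):
    """P(T, v) from the Peng-Robinson equation of state (pure component, SI)."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - math.sqrt(T / Tc)))**2
    # repulsive term minus attractive term
    return R * T / (v - b) - a * alpha / (v**2 + 2 * b * v - b**2)

# CO2: Tc = 304.13 K, Pc = 7.377 MPa, omega ~ 0.224 (typical handbook values)
P = pr_pressure(300.0, 0.01, 304.13, 7.377e6, 0.224)
print(f"P = {P / 1e5:.2f} bar")  # slightly below the ideal-gas RT/v of ~2.49 bar
```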
Method used to test the imaging consistency of binocular camera's left-right optical system
Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui
2016-09-01
For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table, and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of the contour lines near the pseudo-contour line. Third, a grey-level constraint based on the corresponding coordinates of the left and right images is established, and imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging-consistency testing of binocular cameras. When the 3σ spread of the imaging grey-level difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, the design requirements are considered to have been achieved. This method is effective and paves the way for imaging-consistency testing of binocular cameras.
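The core metric described, the standard deviation σ of the grey-level difference D(x, y) between left and right images, is straightforward to compute. A sketch with synthetic images stands in below; the image size and noise level are invented, and only the 5%-of-full-scale pass criterion follows the abstract.

```python
import numpy as np

def imaging_consistency(left, right):
    """Standard deviation of the grayscale difference D(x, y) = left - right."""
    D = left.astype(float) - right.astype(float)
    return D.std()

rng = np.random.default_rng(2)
# hypothetical flat-field captures from an integrating sphere (8-bit sensor)
left = rng.integers(100, 156, size=(480, 640)).astype(np.uint8)
right = np.clip(left + rng.normal(0.0, 1.0, left.shape), 0, 255).astype(np.uint8)

sigma = imaging_consistency(left, right)
print(f"sigma = {sigma:.2f} grey levels, 3*sigma = "
      f"{100 * 3 * sigma / 255:.1f}% of the 8-bit range")
```

The pass/fail decision then reduces to checking that 3σ stays under 5% of the sensor's full grey-level range.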
Sheng Qiang; Zhen-Fang Du; Min Huang
2014-01-01
Objective: To investigate the inhibitory effects of adenovirus-mediated NDRG2 on the proliferation of human renal cell carcinoma cell line OS-RC-2 in vitro. Methods: NDRG2 was harvested by RT-PCR, confirmed by DNA sequencing, and then cloned into the eukaryotic expression vector pIRES2-EGFP, which encodes green fluorescent protein (GFP), to construct pIRES2-EGFP-NDRG2 plasmid. OS-RC-2 cells with NDRG2 negative expression were transfected with pIRES2-EGFP-NDRG2 plasmid. The growth of transfected OS-RC-2 cells was observed under light and fluorescence microscopes. After colony-forming cell assays, cell proliferation detection and MTT assays, the growth curves of cells in each group were plotted to investigate the inhibitory effects of adenovirus-mediated NDRG2 on the proliferation of OS-RC-2 cells. Cell cycle was determined by flow cytometry. Confocal laser scanning microscopy showed that NDRG2 protein was specifically located on subcellular organelle. Results: A eukaryotic expression vector pIRES2-EGFP-NDRG2 was successfully constructed. After NDRG2 transfection, the growth of OS-RC-2 cells was inhibited. Flow cytometry showed that cells were arrested in S phase but the peak of cell apoptosis was not present, and confocal laser scanning microscopy showed that NDRG2 protein was located in mitochondrion. Conclusions: NDRG2 can significantly inhibit the proliferation of OS-RC-2 cells in vitro and its protein is specifically expressed in the mitochondrion.
Glave, A Page; Didier, Jennifer J; Weatherwax, Jacqueline; Browning, Sarah J; Fiaud, Vanessa
2016-01-01
There are a variety of options for testing postural stability; however, many physical tests lack validity information. Two tests of postural stability - the Star Excursion Balance Test (SEBT) and the Biodex Balance System Limits of Stability Test (LOS) - were examined to determine whether they measure similar components of balance. Healthy adults (n=31) completed the LOS (levels 6 and 12) and the SEBT (both legs). SEBT directions were offset by 180° to approximate LOS directions. Correlations and partial correlations controlling for height were analyzed. Correlations were significant for SEBT 45° and LOS back-left (6: r=-0.41; 12: r=-0.42; p ...) ... balance. Research is needed to determine and define what specific components of balance are being assessed. Care must be taken when choosing balance tests to best match the test to the purpose of testing (fall risk, athletic performance, etc.).
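The partial correlations controlling for height used in the study can be sketched by regressing the covariate out of both variables and correlating the residuals. The data below are synthetic stand-ins with invented effect sizes, not the study's measurements; `sebt` and `los` are hypothetical scores.

```python
import numpy as np
from scipy import stats

def partial_corr(x, y, z):
    """Pearson correlation of x and y after regressing covariate z out of both."""
    Z = np.column_stack([np.ones_like(z), z])       # design matrix with intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return stats.pearsonr(rx, ry)

rng = np.random.default_rng(3)
height = rng.normal(170, 10, 31)              # n = 31, as in the study
sebt = 0.8 * height + rng.normal(0, 5, 31)    # reach distance scales with height
los = -0.3 * height + rng.normal(0, 5, 31)    # hypothetical LOS score

r, p = partial_corr(sebt, los, height)
print(f"partial r = {r:.2f} (p = {p:.3f})")
```

With both variables driven only by height, the partial correlation hovers near zero, which is why controlling for height matters when comparing reach-based and platform-based balance tests.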
Nonparametric test of consistency between cosmological models and multiband CMB measurements
Aghamousa, Amir
2015-01-01
We present a novel approach to test the consistency of cosmological models with multiband CMB data using a nonparametric approach. In our analysis we calibrate the REACT (Risk Estimation and Adaptation after Coordinate Transformation) confidence levels associated with distances in function space (confidence distances) based on Monte Carlo simulations in order to test the consistency of an assumed cosmological model with observation. To show the applicability of our algorithm, we confront Planck 2013 temperature data with the concordance model of cosmology, considering two different Planck spectra combinations. In order to have an accurate quantitative statistical measure to compare the data and the theoretical expectations, we calibrate REACT confidence distances and perform a bias control using many realizations of the data. Our results in this work using Planck 2013 temperature data put the best-fit $\Lambda$CDM model at $95\% (\sim 2\sigma)$ confidence distance from the center of the nonparametri...
Pasi Nieminen
2010-08-01
Full Text Available This study investigates students' ability to interpret multiple representations consistently (i.e., representational consistency) in the context of the force concept. For this purpose we developed the Representational Variant of the Force Concept Inventory (R-FCI), which makes use of nine items from the 1995 version of the Force Concept Inventory (FCI). These original FCI items were redesigned using various representations (such as motion map, vectorial and graphical), yielding 27 multiple-choice items concerning four central concepts underpinning the force concept: Newton's first, second, and third laws, and gravitation. We provide some evidence for the validity and reliability of the R-FCI; this analysis is limited to the student population of one Finnish high school. The students took the R-FCI at the beginning and at the end of their first high school physics course. We found that students' (n=168) representational consistency (whether scientifically correct or not) varied considerably depending on the concept. On average, representational consistency and scientifically correct understanding increased during the instruction, although in the post-test only a few students performed consistently both in terms of representations and scientifically correct understanding. We also compared students' (n=87) results of the R-FCI and the FCI, and found that they correlated quite well.
Tests and applications of self-consistent cranking in the interacting boson model
Kuyucak, S; Kuyucak, Serdar; Sugita, Michiaki
1999-01-01
The self-consistent cranking method is tested by comparing the cranking calculations in the interacting boson model with the exact results obtained from the SU(3) and O(6) dynamical symmetries and from numerical diagonalization. The method is used to study the spin dependence of shape variables in the $sd$ and $sdg$ boson models. When realistic sets of parameters are used, both models lead to similar results: axial shape is retained with increasing cranking frequency while fluctuations in the shape variable $\\gamma$ are slightly reduced.
Banana Split: Testing the Dark Energy Consistency with Geometry and Growth
Ruiz, Eduardo J
2014-01-01
We perform parametric tests of the consistency of the standard $w$CDM model in the framework of General Relativity by carefully separating information between the geometry and growth of structure. We replace each late-universe parameter that describes the behavior of dark energy with two parameters: one describing geometrical information in cosmological probes, and the other controlling the growth of structure. We use data from all principal cosmological probes: of these, Type Ia supernovae, baryon acoustic oscillations, and the peak locations in the cosmic microwave background angular power spectrum constrain the geometry, while the redshift space distortions, weak gravitational lensing and the abundance of galaxy clusters constrain both geometry and growth. Both geometry and growth separately favor the $\\Lambda$CDM cosmology with the matter density relative to critical $\\Omega_M\\simeq 0.3$. When the equation of state is allowed to vary separately for probes of growth and geometry, we find again a good agree...
Consistency tests of the stability of fundamental couplings and unification scenarios
Ferreira, M C; Martins, C J A P; Monteiro, A M R V L; Solà, J
2014-01-01
We test the consistency of several independent astrophysical measurements of fundamental dimensionless constants. In particular, we compare direct measurements of the fine-structure constant $\\alpha$ and the proton-to-electron mass ratio $\\mu=m_p/m_e$ (mostly in the optical/ultraviolet) with combined measurements of $\\alpha$, $\\mu$ and the proton gyromagnetic ratio $g_p$ (mostly in the radio band). We point out some apparent inconsistencies, which suggest that hidden systematics may be affecting some of the measurements. These findings demonstrate the importance of future more precise measurements with ALMA, ESPRESSO and ELT-HIRES. We also highlight some of the implications of the currently available measurements for fundamental physics, specifically for unification scenarios.
Semi-holography for heavy ion collisions: self-consistency and first numerical tests
Mukhopadhyay, Ayan; Preis, Florian; Rebhan, Anton; Stricker, Stefan A.
2016-05-01
We present an extended version of a recently proposed semi-holographic model for heavy-ion collisions, which includes self-consistent couplings between the Yang-Mills fields of the Color Glass Condensate framework and an infrared AdS/CFT sector, so as to guarantee the existence of a conserved energy-momentum tensor for the combined system that is local in space and time, which we also construct explicitly. Moreover, we include a coupling of the topological charge density in the glasma to that of the holographic infrared CFT. The semi-holographic approach makes it possible to combine CGC initial conditions and weak-coupling glasma field equations with a simultaneous evolution of a strongly coupled infrared sector describing the soft gluons radiated by hard partons. As a first numerical test of the semi-holographic model, we study the dynamics of fluctuating homogeneous color-spin-locked Yang-Mills fields coupled to a homogeneous and isotropic energy-momentum tensor of the holographic IR-CFT, and we find rapid convergence of the iterative numerical procedure suggested earlier.
Peters, Baron; Bolhuis, Peter G; Mullen, Ryan G; Shea, Joan-Emma
2013-02-01
We propose a method for identifying accurate reaction coordinates among a set of trial coordinates. The method applies to special cases where motion along the reaction coordinate follows a one-dimensional Smoluchowski equation. In these cases the reaction coordinate can predict its own short-time dynamical evolution, i.e., the dynamics projected from multiple dimensions onto the reaction coordinate depend only on the reaction coordinate itself. To test whether this property holds, we project an ensemble of short trajectory swarms onto trial coordinates and compare projections of individual swarms to projections of the ensemble of swarms. The comparison, quantified by the Kullback-Leibler divergence, is numerically performed for each isosurface of each trial coordinate. The ensemble of short dynamical trajectories is generated only once by sampling along an initial order parameter. The initial order parameter should separate the reactants and products with a free energy barrier, and distributions on isosurfaces of the initial parameter should be unimodal. The method is illustrated for three model free energy landscapes with anisotropic diffusion. Where exact coordinates can be obtained from Kramers-Langer-Berezhkovskii-Szabo theory, results from the new method agree with the exact results. We also examine characteristics of systems where the proposed method fails. We show how dynamical self-consistency is related (through the Chapman-Kolmogorov equation) to the earlier isocommittor criterion, which is based on longer paths.
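The swarm-versus-ensemble comparison described above can be sketched with a histogram-based Kullback-Leibler divergence between one swarm's projected displacements and the pooled ensemble. This is a minimal illustration of the statistic, not the authors' implementation; the Gaussian test data are invented.

```python
import numpy as np

def kl_divergence(p_samples, q_samples, bins=20, eps=1e-9):
    """Histogram estimate of D(P||Q) for two 1-D samples on a shared grid."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    p = p / p.sum() + eps   # regularize empty bins
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

# Hypothetical projections: one swarm vs. the pooled ensemble
rng = np.random.default_rng(1)
ensemble = rng.normal(0.0, 1.0, size=5000)
consistent_swarm = rng.normal(0.0, 1.0, size=500)  # same projected dynamics
shifted_swarm = rng.normal(3.0, 1.0, size=500)     # inconsistent projection
d_ok = kl_divergence(consistent_swarm, ensemble)
d_bad = kl_divergence(shifted_swarm, ensemble)
```

In the paper's terms, a low divergence on every isosurface of a trial coordinate is evidence that the projected dynamics depend only on that coordinate.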
Rice, Stephen; Geels, Kasha; Trafimow, David; Hackett, Holly
2011-01-01
Test scores are used to assess one's general knowledge of a specific area. Although strategies to improve test performance have been previously identified, the consistency with which one uses these strategies has not been analyzed in such a way that allows assessment of how much consistency affects overall performance. Participants completed one…
Bateman, Ian J. [Centre for Social and Economic Research on the Global Environment (CSERGE), School of Environmental Sciences, University of East Anglia, Norwich NR4 7TJ (United Kingdom); Brouwer, Roy [Institute for Environmental Studies (IVM), Vrije Universiteit, De Boelelaan 1087, 1081 HV Amsterdam (Netherlands)
2006-08-15
A contingent valuation study is conducted to estimate willingness to pay (WTP) for reducing skin cancer risks. A split sample design contrasts dichotomous choice (DC) with open-ended (OE) methods for eliciting WTP. A novel scope test varies the remit of risk reductions from just the individual respondent to their entire household allowing us to examine both the statistical significance and scale of scope sensitivity. While OE responses fail such tests, DC responses pass both forms of testing. We conclude that conformity of the size of scope effects with prior expectations should form a focus for future validity testing. (author)
Subjective Confidence in Perceptual Judgments: A Test of the Self-Consistency Model
Koriat, Asher
2011-01-01
Two questions about subjective confidence in perceptual judgments are examined: the bases for these judgments and the reasons for their accuracy. Confidence in perceptual judgments has been claimed to rest on qualitatively different processes than confidence in memory tasks. However, predictions from a self-consistency model (SCM), which had been…
Balawender, K.; Jaworski, A.; Kuszewski, H.; Lejda, K.; Ustrzycki, A.
2016-09-01
Measurement of pollutant emissions in automobile combustion engine exhaust gases is of primary importance in view of their harmful impact on the natural environment. This paper presents results of tests aimed at determining exhaust gas pollutant emissions from a passenger car engine, obtained under repeatable conditions on a chassis dynamometer. The test set-up was installed in a controlled climate chamber capable of maintaining temperatures from -20°C to +30°C. The analysis covered emissions of CO, CO2, NOx, CH4, THC, and NMHC. The purpose of the study was to assess the repeatability of results obtained in a number of tests performed according to the NEDC test plan. The study is an introductory stage of a wider research project concerning the effect of climate conditions and fuel type on pollutant emissions from automotive vehicles.
Consistency of Measured Accuracy in Grammar Knowledge Tests and Writing: TOEFL PBT
Ahangari, Saeideh; Barghi, Ali Hamed
2012-01-01
... level of accuracy in communication. Meanwhile, examples abound where many examinees do relatively well on grammar knowledge tests despite failing to retain accuracy in real-time communicative activities like writing compositions...
Skin prick/puncture testing in North America: a call for standards and consistency
Fatteh, Shahnaz; Rekkerth, Donna J.; Hadley, James A.
2014-01-01
Background Skin prick/puncture testing (SPT) is widely accepted as a safe, dependable, convenient, and cost-effective procedure to detect allergen-specific IgE sensitivity. It is, however, prone to influence by a variety of factors that may significantly alter test outcomes, affect the accuracy of diagnosis, and the effectiveness of subsequent immunotherapy regimens. Proficiency in SPT administration is a key variable that can be routinely measured and documented to improve the predictive val...
Lower Bounds for Existential Pebble Games and k-Consistency Tests
Berkholz, Christoph
2012-01-01
The existential k-pebble game characterizes the expressive power of the existential-positive k-variable fragment of first-order logic on finite structures. The winner of the existential k-pebble game on two given finite structures can easily be determined in time O(n^{2k}). We show that there is no O(n^{(k-3)/12}) time algorithm that decides which player can win the existential k-pebble game on two given structures. This lower bound is unconditional and does not rely on any unproven complexity theoretic assumptions. Establishing strong k-consistency is a well-known heuristic for solving the constraint satisfaction problem (CSP). By the game characterization of Kolaitis and Vardi our result implies that there is no O(n^{(k-3)/12}) time algorithm that decides if strong k-consistency can be established for a given CSP-instance.
Estimation of Internal Consistency Reliability When Test Parts Vary in Effective Length.
Feldt, Leonard S.; Charter, Richard A.
2003-01-01
Evaluating a test's reliability often requires dividing it into 3 or more unequal parts, which causes violation of the tau equivalence assumption of Cronbach's alpha. This article presents a criterion for abandoning alpha and an approach for computing a more appropriate estimate of reliability, the Gilmer-Feldt coefficient. (Author)
Validity test and its consistency in the construction of patient loyalty model
Yanuar, Ferra
2016-04-01
The main objective of this study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data on the construction of a patient loyalty model. In the hypothesized model, service quality, patient satisfaction and patient loyalty were determined simultaneously, and each factor was measured by several indicator variables. The respondents in this study were patients who had received healthcare at Puskesmas in Padang, West Sumatera. All 394 respondents with complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant measures of their corresponding latent variable. Service quality was measured most strongly by tangibles, patient satisfaction by satisfaction with service, and patient loyalty by good service quality. In the structural equations, patient loyalty was affected positively and directly by patient satisfaction, while service quality affected patient loyalty indirectly, with patient satisfaction as a mediator between the two latent variables. Both structural equations were also valid. The study also showed that the validity values obtained were consistent, based on a simulation study using a bootstrap approach.
Sakhile Mhlongo
Full Text Available BACKGROUND: Besides access to medical male circumcision, HIV testing, access to condoms and consistent condom use are additional strategies men can use to prevent HIV acquisition. We examine male behavior toward testing and condom use. OBJECTIVE: To determine factors associated with never testing for HIV and consistent condom use among men who never test in Soweto. METHODS: A cross-sectional survey in Soweto was conducted in 1539 men aged 18-32 years in 2007. Data were collected on socio-demographic and behavioral characteristics to determine factors associated with not testing and consistent condom use. RESULTS: Over two thirds (71%) of men had not had an HIV test and the majority (55%, n = 602) were young (18-23). Of those not testing, condom use was poor (44%, n = 304). Men who were 18-23 years (aOR: 2.261, CI: 1.534-3.331), with primary (aOR: 2.096, CI: 1.058-4.153) or high school (aOR: 1.622, CI: 1.078-2.439) education, had sex in the last 6 months (aOR: 1.703, CI: 1.055-2.751), and had ≥1 sexual partner (aOR: 1.749, CI: 1.196-2.557) were more likely not to test. Of those reporting condom use (n = 1036, 67%), consistent condom use was 43% (n = 451). HIV testing did not correlate with condom use. CONCLUSION: Low rates of both condom use and HIV testing among men in a high HIV prevalence setting are worrisome and indicate an urgent need to develop innovative behavioral strategies to address this shortfall. Condom use is poor in this population whether tested or not tested for HIV, indicating no association between condom use and HIV testing.
Operational quality control of daily precipitation using spatio-climatological consistency testing
Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.
2010-09-01
Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field with historical information on the interpolation error for different precipitation intensity intervals. Expert judgement shows that the system detects potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection; 50-80% of all flagged values have been classified as real errors by the data editor. This is much better than the roughly 15-20% achieved using standard spatial regression tests. The automatic redistribution of accumulated several-day sums is very helpful in the QC process. Manual inspection in operations can be reduced, and the QC of precipitation substantially objectified.
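The core spatio-climatological idea can be sketched as follows: flag a daily station value when its deviation from the spatially interpolated estimate exceeds a climatological quantile of historical interpolation errors. All names, data and thresholds here are illustrative assumptions (one intensity class, no redistribution step), not MeteoSwiss's operational code.

```python
import numpy as np

def flag_suspect(observed, interpolated, historical_errors, q=0.99):
    """Flag values whose interpolation residual exceeds the q-quantile of
    historical absolute interpolation errors (single intensity class assumed)."""
    threshold = np.quantile(np.abs(historical_errors), q)
    return np.abs(np.asarray(observed) - np.asarray(interpolated)) > threshold

# Illustrative data (mm/day): typical interpolation errors ~2 mm, one rare 10 mm
hist = np.concatenate([np.full(99, 2.0), [10.0]])
# One plausible observation and one gross outlier vs. their interpolated fields
flags = flag_suspect([31.0, 95.0], [30.0, 28.0], hist)
```

In the operational scheme the threshold would additionally depend on the precipitation intensity interval, which keeps false alarms low on heavy-rain days.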
Kalach, Nicolas; Gosset, Pierre; Dehecq, Eric; Decoster, Anne; Georgel, Anne-France; Spyckerelle, Claire; Papadopoulos, Stephanos; Dupont, Christophe; Raymond, Josette
2017-07-01
This French study assessed a quick, noninvasive, immuno-chromatographic, Helicobacter pylori (H. pylori) stool antigen test for detecting infections in children. We enrolled 158 children, with a median age of 8.5 years (range eight months to 17 years), with digestive symptoms suggesting upper gastrointestinal tract disease. Upper digestive endoscopy was performed with gastric biopsy specimens for histology, a rapid urease test, culture test and quantitative real-time polymerase chain reaction. The H. pylori stool antigen test was performed twice for each child and the results were compared to the reference method. The reference methods showed that 23 (14.6%) of the 158 children tested were H. pylori positive. The H. pylori stool antigen test showed 91.3% sensitivity, with a 95% confidence interval (95% CI) of 86.9-95.6 and 97% specificity (95% CI 94.3-99.6), 30.84 positive likelihood ratio and 0.09 negative likelihood ratio. The test accuracy was 96.2% (95% CI 93.2-99.1). The two blinded independent observers produced identical H. pylori stool antigen test results and the Kappa coefficient for the H. pylori stool antigen test was one. The H. pylori stool antigen test was found to be a consistent, reliable, quick and specific test for detecting the H. pylori infection in children. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
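The reported sensitivity, specificity, likelihood ratios and accuracy can be reproduced from a standard 2×2 diagnostic table. The counts below (TP=21, FN=2, FP=4, TN=131) are inferred from the abstract's percentages (23 infected of 158 children), not stated explicitly in it.

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard accuracy metrics from a 2x2 diagnostic table."""
    sens = tp / (tp + fn)              # true positive rate
    spec = tn / (tn + fp)              # true negative rate
    return {
        "sensitivity": sens,
        "specificity": spec,
        "lr_pos": sens / (1 - spec),   # positive likelihood ratio
        "lr_neg": (1 - sens) / spec,   # negative likelihood ratio
        "accuracy": (tp + tn) / (tp + fn + fp + tn),
    }

# Counts inferred (hypothetically) from the abstract's reported percentages
m = diagnostic_metrics(tp=21, fn=2, fp=4, tn=131)
```

With these counts the sketch recovers the abstract's figures: sensitivity ≈ 91.3%, specificity ≈ 97%, accuracy ≈ 96.2%, LR+ ≈ 30.8 and LR− ≈ 0.09.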
American Society for Testing and Materials. Philadelphia
2002-01-01
1.1 These product consistency test methods A and B evaluate the chemical durability of homogeneous glasses, phase separated glasses, devitrified glasses, glass ceramics, and/or multiphase glass ceramic waste forms hereafter collectively referred to as “glass waste forms” by measuring the concentrations of the chemical species released to a test solution. 1.1.1 Test Method A is a seven-day chemical durability test performed at 90 ± 2°C in a leachant of ASTM-Type I water. The test method is static and conducted in stainless steel vessels. Test Method A can specifically be used to evaluate whether the chemical durability and elemental release characteristics of nuclear, hazardous, and mixed glass waste forms have been consistently controlled during production. This test method is applicable to radioactive and simulated glass waste forms as defined above. 1.1.2 Test Method B is a durability test that allows testing at various test durations, test temperatures, mesh size, mass of sample, leachant volume, a...
Test-Retest Reliability and Internal Consistency of the Activity Card Sort-Australia (18-64).
Gustafsson, Louise; Hung, Inez Hui Min; Liddle, Jacki
2017-01-01
The Activity Card Sort (ACS) measures activity engagement levels. The Activity Card Sort-Australian version for adults aged 18 to 64 (ACS-Aus (18-64)) was recently developed, and its psychometric properties have not yet been determined. This study was established to determine the test-retest reliability and internal consistency of the ACS-Aus (18-64) and to describe activity engagement trends for healthy adults. Fifty-four adults aged 18 to 64 participated in this descriptive study. The ACS-Aus (18-64) demonstrated excellent test-retest reliability (r = .92, p…) …maintenance activities (t = -2.22, p = .03), and recreation and relaxation activities (t = -2.38, p = .02). The ACS-Aus (18-64) may be used to explore the activity engagement patterns of community-dwelling Australian adults aged 18 to 64. Further research will determine validity for clinical populations.
Ayadim, A; Amokrane, S
2010-01-27
The accuracy of the structural data obtained from the recently proposed generalization to non-additive hard-spheres (Schmidt 2004 J. Phys.: Condens. Matter 16 L351) of Rosenfeld's functional is investigated. The radial distribution functions computed from the direct correlation functions generated by the functional, through the Ornstein-Zernike equations, are compared with those obtained from the density profile equations in the test-particle limit, without and with test-particle consistency. The differences between these routes and the role of the optimization of the parameters of the reference system when the functional is used to obtain the reference bridge functional are discussed in the case of symmetric binary mixtures of non-additive hard-spheres. The case of highly asymmetric mixtures is finally briefly discussed.
J. Freixa
2012-01-01
Full Text Available Experimental results obtained at integral test facilities (ITFs) are used in the validation process of system codes for the transient analyses of light water reactors (LWRs). The expertise and guidelines derived from this work are later applied to transient analyses of nuclear power plants (NPPs). However, the boundary conditions at the NPPs will always differ from those at the ITF, and hence, the soundness of the ITF model needs to be maximized. An unaltered ITF nodalization should prove able to simulate as many tests as possible before any conclusion is transferred to NPP analyses. The STARS group at the Paul Scherrer Institut (PSI) actively participates in several international programs where ITFs are being used (e.g., ROSA, PKL). Several tests carried out at the ROSA large-scale test facility operated by the Japan Atomic Energy Agency (JAEA) have been simulated in recent years using the United States Nuclear Regulatory Commission (US-NRC) system code TRACE. In this paper, 5 different posttest analyses are presented, along with the evolution of the employed TRACE nodalization and the process followed to track the consistency of the nodalization modifications. The ROSA TRACE nodalization provided results in reasonable agreement with all 5 experiments.
Varghese, Rini; Hui-Chan, Christina W Y; Wang, Edward; Bhatt, Tanvi
2014-10-01
The purpose of this study was to establish the internal consistency and test-retest reliability of electromyographic and accelerometric data sampled from the prime movers of the dominant arm during an antigravity, within-arm's-length stand-reaching task without trunk restraint. Ten healthy young adults participated in two experimental sessions, approximately 7-10 days apart. During each session, subjects performed 15 trials of both a flexion- and an abduction-reaching task. Surface EMG and acceleration were sampled from the anterior and middle deltoid using wireless sensors. Reliability was established using Cronbach's alpha, intraclass correlation coefficients (ICC(2,k)) and standard errors of measurement (SEM) for electromyographic reaction time, burst duration and normalized amplitude, along with peak acceleration. Results indicated high degrees of inter-trial and test-retest reliability for flexion (Cronbach's α range=0.92-0.99; ICC range=0.82-0.92) as well as abduction (Cronbach's α range=0.94-0.99; ICC range=0.81-0.94) reaching. The SEM associated with response variables for flexion and abduction ranged from 1.55-3.26% and 3.33-3.95% of means, respectively. Findings from this study revealed that electromyographic and accelerometric data collected from prime movers of the arm during the relatively functional stand-reaching task were highly reproducible. Given its high reliability and portability, the proposed test could have applications in clinical and laboratory settings to quantify upper limb function.
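The inter-trial consistency statistic used above, Cronbach's alpha, can be computed directly from a subjects-by-trials score matrix. A minimal sketch with made-up data (not the study's recordings):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (subjects x trials) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of trials
    item_vars = scores.var(axis=0, ddof=1).sum()  # sum of per-trial variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of per-subject totals
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent trials: every trial reproduces the subject's level exactly
perfect = np.outer([1.0, 2.0, 3.0, 4.0, 5.0], np.ones(3))
alpha = cronbach_alpha(perfect)
```

Alpha approaches 1 when trials vary together across subjects, which is why values of 0.92-0.99 indicate highly reproducible trial-to-trial measurements.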
Kadiyala Srikanth
2009-10-01
Full Text Available Abstract Background: U.S. cancer screening guidelines communicate important information regarding the ages for which screening tests are appropriate. Little attention has been given to whether breast, colorectal and prostate cancer screening test use is responsive to guideline information regarding the age of screening initiation. Methods: The 2006 Behavioral Risk Factor Surveillance System survey and the 2003 National Health Interview Survey were used to compute breast, colorectal and prostate cancer screening test rates by single year of age. Graphical and logistic regression analyses were used to compare screening rates for individuals close to and on either side of the guideline-recommended screening initiation ages. Results: We identified large discrete shifts in the use of screening tests precisely at the ages where guidelines recommend that screening begin. Mammography screening in the last year increased from 22% [95% CI = 20, 25] at age 39 to 36% [95% CI = 33, 39] at age 40 and 47% [95% CI = 44, 51] at age 41. Adherence to the colorectal cancer screening guidelines within the last year increased from 18% [95% CI = 15, 22] at age 49 to 19% [95% CI = 15, 23] at age 50 and 34% [95% CI = 28, 39] at age 51. Prostate-specific antigen screening in the last year increased from 28% [95% CI = 25, 31] at age 49 to 33% [95% CI = 29, 36] and 42% [95% CI = 38, 46] at ages 50 and 51. These results are robust to multivariate analyses that adjust for age, sex, income, education, marital status and health insurance status. Conclusion: The results from this study suggest that cancer screening test utilization is consistent with guideline information regarding the age of screening initiation. Screening test and adherence rates increased by approximately 100% at the breast and colorectal cancer guideline-recommended ages, compared with only a 50% increase in the screening rate for prostate cancer. Since information regarding the age of cancer screening...
Rácz, A; Bajusz, D; Héberger, K
2015-01-01
Recent implementations of QSAR modelling software provide the user with numerous models and a wealth of information. In this work, we provide some guidance on how one should interpret the results of QSAR modelling, compare and assess the resulting models, and select the best and most consistent ones. Two QSAR datasets are applied as case studies for the comparison of model performance parameters and model selection methods. We demonstrate the capabilities of sum of ranking differences (SRD) in model selection and ranking, and identify the best performance indicators and models. While the exchange of the original training and (external) test sets does not affect the ranking of performance parameters, it provides improved models in certain cases (despite the lower number of molecules in the training set). Performance parameters for external validation are substantially separated from the other merits in SRD analyses, highlighting their value in data fusion.
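Sum of ranking differences (SRD) compares each performance merit's ranking of the models to a reference ranking, often the row-average "consensus"; smaller SRD means closer agreement with the consensus. A minimal sketch, ignoring ties, with invented scores:

```python
import numpy as np

def srd(method_values, reference_values):
    """Sum of absolute rank differences between a method and a reference (ties ignored)."""
    r_method = np.argsort(np.argsort(method_values))  # ascending ranks of the method
    r_ref = np.argsort(np.argsort(reference_values))  # ascending ranks of the reference
    return int(np.abs(r_method - r_ref).sum())

# Hypothetical performance scores for 4 models under two merits and a consensus
merit_a = [0.91, 0.85, 0.78, 0.60]
merit_b = [0.60, 0.78, 0.85, 0.91]      # reversed ordering
consensus = [0.90, 0.84, 0.80, 0.62]    # same ordering as merit_a
srd_a = srd(merit_a, consensus)         # agrees with the consensus
srd_b = srd(merit_b, consensus)         # maximally disagrees
```

In the paper's setting, merits (performance parameters) whose SRD values cluster far from the rest, such as external validation metrics, carry distinct information.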
Lin, Chin; Chu, Chi-Ming; Su, Sui-Lung
2016-01-01
Conventional genome-wide association studies (GWAS) have been proven to be a successful strategy for identifying genetic variants associated with complex human traits. However, there is still a large heritability gap between GWAS and transitional family studies. The "missing heritability" has been suggested to be due to lack of studies focused on epistasis, also called gene-gene interactions, because individual trials have often had insufficient sample size. Meta-analysis is a common method for increasing statistical power. However, sufficient detailed information is difficult to obtain. A previous study employed a meta-regression-based method to detect epistasis, but it faced the challenge of inconsistent estimates. Here, we describe a Markov chain Monte Carlo-based method, called "Epistasis Test in Meta-Analysis" (ETMA), which uses genotype summary data to obtain consistent estimates of epistasis effects in meta-analysis. We defined a series of conditions to generate simulation data and tested the power and type I error rates in ETMA, individual data analysis and conventional meta-regression-based method. ETMA not only successfully facilitated consistency of evidence but also yielded acceptable type I error and higher power than conventional meta-regression. We applied ETMA to three real meta-analysis data sets. We found significant gene-gene interactions in the renin-angiotensin system and the polycyclic aromatic hydrocarbon metabolism pathway, with strong supporting evidence. In addition, glutathione S-transferase (GST) mu 1 and theta 1 were confirmed to exert independent effects on cancer. We concluded that the application of ETMA to real meta-analysis data was successful. Finally, we developed an R package, etma, for the detection of epistasis in meta-analysis [etma is available via the Comprehensive R Archive Network (CRAN) at https://cran.r-project.org/web/packages/etma/index.html].
Najib, Liana; Abdullah, Lazim
2013-04-01
In the new global economy, road accidents have become a continuing issue in Malaysia, with no precise solution for determining the weights and ranks of the causes contributing to them. Road accident fatality statistics worldwide generally continue to increase from day to day. Thus, the aim of this paper is to propose fuzzy multi-criteria decision making (MCDM) to evaluate cause options with respect to the road accident problem. The fuzzy Analytic Hierarchy Process (FAHP) is applied to derive the relative priority weights of the causes associated with road accidents, testing the consistency of the pair-wise comparison matrices of criteria and alternatives by the proposed Lambda-Max method, and normalizing the weight vectors via the Technique for Order Performance by Similarity to Ideal Solution (TOPSIS) method to rank the weighted priority factors. Triangular fuzzy numbers (TFNs) were applied to handle vagueness and imprecision of the data in fuzzy MCDM. Empirical results are determined using linguistic variable data gathered via interviews with the decision makers. The results rank the causes contributing to road accidents in Malaysia.
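The FAHP step above builds on Saaty's eigenvector weighting and consistency ratio. A minimal crisp (non-fuzzy) sketch in Python, with a hypothetical 3×3 pairwise-comparison matrix standing in for real accident-cause judgments:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three accident causes
# (values are illustrative, not from the paper).
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
])

def ahp_weights_and_cr(A, random_index=0.58):
    """Priority weights from the principal eigenvector, plus Saaty's
    consistency ratio (random_index = 0.58 for a 3x3 matrix)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenvalue
    lam_max = vals[k].real
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                          # normalised priority weights
    n = A.shape[0]
    ci = (lam_max - n) / (n - 1)             # consistency index
    cr = ci / random_index                   # CR < 0.1 is conventionally acceptable
    return w, cr

w, cr = ahp_weights_and_cr(A)
```

A matrix failing the CR < 0.1 threshold would signal that the decision maker's pairwise judgments need revisiting before the weights are used for ranking.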
Karhula, Kati; Härmä, Mikko; Sallinen, Mikael; Lindholm, Harri; Hirvonen, Ari; Elovainio, Marko; Kivimäki, Mika; Vahtera, Jussi; Puttonen, Sampsa
2017-07-10
This study examined the consistency of salivary cortisol and alpha-amylase (sAA) total daily secretion between laboratory and field circumstances. The 95 participants were shift-working female health care professionals with high (n = 53) or low (n = 42) psychosocial stress (job strain) measured by the Job Content Questionnaire (JCQ). The Trier Social Stress Test, including a 5-minute free speech and a mental arithmetic task, was conducted with four saliva samples, and field measurements were made with three daily saliva samples of cortisol and sAA during a morning shift, a night shift, and a day off, controlling for circadian rhythm and inter-shift recovery. The associations of salivary cortisol and sAA area under the curve with respect to ground (AUCg) and area under the curve with respect to increase (AUCi) between laboratory and field were tested using OLS (ordinary least squares) regression. The sAA AUCg output in the laboratory was correlated with output during all field measurement days, and similarly among the high and low job strain groups (p<0.001). sAA AUCi and salivary cortisol AUCg and AUCi were not correlated between laboratory and field measurements, neither in the whole sample nor in the low or high job strain groups. In conclusion, a laboratory measure of sAA AUCg output is promising in predicting stress-related output during burdensome work shifts and leisure time, whereas sAA AUCi and salivary cortisol seem not to have this potential.
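AUCg and AUCi as used above are commonly computed with the trapezoid formulas of Pruessner et al.; a small Python sketch with invented sample times and cortisol values:

```python
def auc_g(times, values):
    """Area under the curve with respect to ground (trapezoid rule)."""
    total = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        total += dt * (values[i] + values[i - 1]) / 2.0
    return total

def auc_i(times, values):
    """Area under the curve with respect to increase: AUCg minus the
    rectangle defined by the first (baseline) sample."""
    return auc_g(times, values) - values[0] * (times[-1] - times[0])

# Hypothetical cortisol samples (nmol/L) at 0, 0.5 and 12 h after waking.
t = [0.0, 0.5, 12.0]
c = [10.0, 16.0, 4.0]
g, i = auc_g(t, c), auc_i(t, c)
```

AUCg captures total output, while AUCi isolates change relative to baseline, which is why the two can behave differently between laboratory and field, as the study found.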
Nass, C; Lee, K M
2001-09-01
Would people exhibit similarity-attraction and consistency-attraction toward unambiguously computer-generated speech even when personality is clearly not relevant? In Experiment 1, participants (extrovert or introvert) heard a synthesized voice (extrovert or introvert) on a book-buying Web site. Participants accurately recognized personality cues in text to speech and showed similarity-attraction in their evaluation of the computer voice, the book reviews, and the reviewer. Experiment 2, in a Web auction context, added personality of the text to the previous design. The results replicated Experiment 1 and demonstrated consistency (voice and text personality)-attraction. To maximize liking and trust, designers should set parameters, for example, words per minute or frequency range, that create a personality that is consistent with the user and the content being presented.
Johnston, Shirley H.; And Others
A computer simulation was undertaken to determine the effects of using Huynh's single-administration estimates of the decision consistency indices for agreement and coefficient kappa, under conditions that violated the beta-binomial assumption. Included in the investigation were two unimodal score distributions that fit the model and two bimodal…
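For orientation, the two-administration decision-consistency indices (raw agreement and kappa) that single-administration methods such as Huynh's approximate can be computed directly; the pass/fail decisions below are invented for illustration:

```python
def decision_consistency(decisions_a, decisions_b):
    """Raw agreement p0 and Cohen's kappa for pass/fail (1/0) decisions
    made on two parallel test administrations."""
    n = len(decisions_a)
    p0 = sum(a == b for a, b in zip(decisions_a, decisions_b)) / n
    pa = sum(decisions_a) / n            # proportion passing form A
    pb = sum(decisions_b) / n            # proportion passing form B
    pc = pa * pb + (1 - pa) * (1 - pb)   # agreement expected by chance
    kappa = (p0 - pc) / (1 - pc)
    return p0, kappa

# Hypothetical pass/fail decisions for five examinees on two forms.
p0, kappa = decision_consistency([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
```

Huynh's contribution was to estimate these indices from a single administration by assuming a beta-binomial score distribution; the simulation above examines what happens when that assumption is violated.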
Popescu, Viorel D; Valpine, Perry; Sweitzer, Rick A
2014-04-01
Wildlife data gathered by different monitoring techniques are often combined to estimate animal density. However, methods to check whether different types of data provide consistent information (i.e., can information from one data type be used to predict responses in the other?) before combining them are lacking. We used generalized linear models and generalized linear mixed-effects models to relate camera trap probabilities for marked animals to independent space use from telemetry relocations using 2 years of data for fishers (Pekania pennanti) as a case study. We evaluated (1) camera trap efficacy by estimating how camera detection probabilities are related to nearby telemetry relocations and (2) whether home range utilization density estimated from telemetry data adequately predicts camera detection probabilities, which would indicate consistency of the two data types. The number of telemetry relocations within 250 and 500 m from camera traps predicted detection probability well. For the same number of relocations, females were more likely to be detected during the first year. During the second year, all fishers were more likely to be detected during the fall/winter season. Models predicting camera detection probability and photo counts solely from telemetry utilization density had the best or nearly best Akaike Information Criterion (AIC), suggesting that telemetry and camera traps provide consistent information on space use. Given the same utilization density, males were more likely to be photo-captured due to larger home ranges and higher movement rates. Although methods that combine data types (spatially explicit capture-recapture) make simple assumptions about home range shapes, it is reasonable to conclude that in our case, camera trap data do reflect space use in a manner consistent with telemetry data. However, differences between the 2 years of data suggest that camera efficacy is not fully consistent across ecological conditions and make the case
Staunstrup, Jørgen
1998-01-01
This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
Wang Ting Ting
2016-01-01
Full Text Available A convenient method is proposed here to determine the relative accuracy of a measuring ionization chamber (MIC). A non-dimensional quantity is presented here to characterize the relative accuracy of the testing sample to the reference sample. There are two key points of the program. One is comparison by pairs. The other is eliminating the uncertainty in the program by exchanging the position and control unit in each testing group.
Bordin, Lorenzo; Creminelli, Paolo; Mirbabayi, Mehrdad; Noreña, Jorge
2017-03-01
We argue that isotropic scalar fluctuations in solid inflation are adiabatic in the super-horizon limit. During the solid phase this adiabatic mode has peculiar features: constant energy-density slices and comoving slices do not coincide, and their curvatures, parameterized respectively by ζ and ℛ, both evolve in time. The existence of this adiabatic mode implies that Maldacena's squeezed-limit consistency relation holds after angular average over the long mode. The correlation functions of a long-wavelength spherical scalar mode with several short scalar or tensor modes are fixed by the scaling behavior of the correlators of short modes, independently of the solid inflation action or dynamics of reheating.
Using a Theory-Consistent CVAR Scenario to Test an Exchange Rate Model Based on Imperfect Knowledge
Katarina Juselius
2017-07-01
Full Text Available A theory-consistent CVAR scenario describes a set of testable regularities one should expect to see in the data if the basic assumptions of the theoretical model are empirically valid. Using this method, the paper demonstrates that all basic assumptions about the shock structure and steady-state behavior of an imperfect-knowledge-based model for exchange rate determination can be formulated as testable hypotheses on common stochastic trends and cointegration. This model obtains remarkable support for almost every testable hypothesis and is able to adequately account for the long persistent swings in the real exchange rate.
Solomon, B.O.; Erickson, L.E.; Hess, J.L.
1981-10-01
The microorganism Candida utilis was grown on both filtered and unfiltered substrate obtained from enzymatic hydrolysis of starch in corn dust. For growth on filtered substrate, the average integrated biomass energetic yield value based on biomass-substrate data was eta = 0.55, and for growth on unfiltered substrate an average yield value of eta = 0.59 was obtained. Material and energy balances showed that the presence of unfiltered corn residue in the media had no significant effect on the yields. Statistical methods were developed and used to obtain best estimates of the growth parameters. Values of the biomass energetic yield corrected for maintenance (eta maximum = 0.619) and the maintenance coefficient (Me = 0.043) were obtained for growth on filtered substrate. Values of eta maximum = 0.741 and Me = 0.142 were obtained for growth on unfiltered substrate. The consistency of data and parameter estimates was relatively good for filtered substrate; however, the parameter estimates for unfiltered substrate were not consistent. Growth experiments without filtration of the products of starch hydrolysis resulted in protein-enriched products with about 39.73% protein. (Refs. 26).
Sahai, Vic; Demeyere, Petra; Poirier, Sheila; Piro, Felice
1998-01-01
The recall of information about Hepatitis B demonstrated by 180 seventh graders was tested with three test types: (1) short-answer; (2) true/false; and (3) multiple-choice. Short answer testing was the most reliable. Suggestions are made for the use of short-answer tests in evaluating student knowledge. (SLD)
Zhao, Liang; Mak, Thomas C W
2005-11-01
New Agn ⊂ C2-R-C2 ⊃ Agn (R = p-, m-, o-C6H4; n = 4, 5) supramolecular synthons have been explored in the coordination network assembly of silver(I) complexes of the isomeric phenylenediethynides. An unprecedented μ5-η1-coordination mode for the ethynide moiety and a mixed μ4,μ5-coordination mode for the o-phenylenediethynide group are observed, providing a rationale for the abundant occurrence of C2@Agn (n
Shaw, Rachael C
2017-01-01
Developing cognitive tasks to reliably quantify individual differences in cognitive ability is critical to advance our understanding of the fitness consequences of cognition in the wild. Several factors may influence individual performance in a cognitive task, with some being unrelated to the cognitive ability that is the target of the test. It is therefore essential to assess how extraneous factors may affect task performance, particularly for those tasks that are frequently used to quantify individual differences in cognitive ability. The current study therefore measured the performance of wild North Island robins in two tasks commonly used to measure individual differences in avian cognition: a novel motor task and a detour reaching task. The robins' performance in the motor task was affected by prior experience; individuals that had previously participated in a similar task that required a different motor action pattern outperformed naïve subjects. By contrast, detour reaching performance was influenced by an individual's body condition, suggesting that energetic state may affect inhibitory control in robins. Designing tasks that limit the influence of past experience and developing means of standardising motivation across animals tested in the wild remain key challenges to improving current measurements of cognitive ability in birds.
Lin, Jonathan; Moore, David; Hockey, Brad; Di Lernia, Rachel; Gorelik, Alexandra; Liew, Danny; Nicoll, Amanda
2014-04-01
Volatile anaesthetic drug-induced liver injury can range from asymptomatic alanine transaminase elevations to fatal hepatic necrosis. There is very limited research regarding hepatotoxicity of modern volatile anaesthetic agents. The aim of this study was to determine how common liver injury consistent with volatile anaesthetic hepatitis is, following exposure to isoflurane, desflurane and sevoflurane; and to propose risk factors for its development. Following ethics approval, we conducted a retrospective audit of adult trauma patients with abnormal liver biochemistry following volatile anaesthesia during January 1 to December 31, 2009. The data collected included patient demographics, volatile anaesthetic administration, concurrent medication, perioperative liver biochemistry results and comorbidities. The Council for International Organisations of Medical Sciences/Roussel Uclaf Causality Assessment Method scoring system was used to group cases according to the likelihood of volatile anaesthetic being the causative agent of drug-induced hepatotoxicity. Forty-seven (3%) of 1556 patients had abnormal post-operative liver biochemistry potentially attributable to volatile anaesthetic. Of the 47, 12 patients (26%) had peak alanine transaminase levels greater than 200 U/L. No significant predictors of volatile anaesthetic drug-induced liver injury following isoflurane, desflurane or sevoflurane anaesthesia could be identified. Volatile anaesthetic drug-induced liver injury in adult trauma patients may be significantly more common than previously noted. This study suggests that about a quarter of patients with volatile anaesthetic drug-induced liver injury develop significant liver injury. Further prospective studies are required to define risk factors and clinical outcomes. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
SLOBODAN P. SERBANOVIC
2000-12-01
Full Text Available The Kojima-Moon-Ochi (KMO) thermodynamic consistency test of vapour-liquid equilibrium (VLE) measurements for 32 isothermal data sets of binary systems of various complexity was applied using two fitting equations: the Redlich-Kister equation and the Sum of Symmetrical Functions. It was shown that the enhanced reliability of the fitting of the experimental data can change the conclusions drawn on their thermodynamic consistency in those cases of VLE data sets that are estimated to be near the border of consistency.
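The classical idea behind such consistency tests is the Redlich-Kister area test: for isothermal data obeying the Gibbs-Duhem relation, the integral of ln(γ1/γ2) over x1 from 0 to 1 vanishes. A Python sketch using synthetic activity coefficients from a one-parameter Margules model (exactly consistent by construction), not the paper's 32 data sets:

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoid rule."""
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

def area_test(x1, ln_g1, ln_g2, tol=0.02):
    """Redlich-Kister area test: normalised area deviation
    D = |∫ ln(γ1/γ2) dx1| / ∫ |ln(γ1/γ2)| dx1, small D ⇒ consistent."""
    f = ln_g1 - ln_g2
    D = abs(trapezoid(f, x1)) / trapezoid(np.abs(f), x1)
    return D, D < tol

# Synthetic activity coefficients from a one-parameter Margules model,
# which satisfies Gibbs-Duhem exactly (parameter A is illustrative).
x1 = np.linspace(0.0, 1.0, 101)
A = 0.8
ln_g1 = A * (1.0 - x1) ** 2
ln_g2 = A * x1 ** 2
D, consistent = area_test(x1, ln_g1, ln_g2)
```

Real experimental data carry scatter, which is why the fitting equation chosen to smooth ln(γ1/γ2) can tip a data set across the consistency border, as the abstract notes.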
Ananthanarayan, B.; Das, Diganta; Sentitemsu Imsong, I.
2012-10-01
Ampcalculator (AMPC) is a Mathematica©-based program that was made publicly available some time ago by Unterdorfer and Ecker. It enables the user to compute several processes at one loop (up to O(p⁴)) in SU(3) chiral perturbation theory. They include computing matrix elements and form factors for strong and non-leptonic weak processes with at most six external states. It was used to compute some novel processes and was tested against well-known results by the original authors. Here we present the results of several thorough checks of the package. Exhaustive checks performed by the original authors are not publicly available, and hence the present effort. Some new results are obtained from the software, especially in the kaon odd-intrinsic-parity non-leptonic decay sector involving the coupling G27. Another illustrative set of amplitudes we provide at tree level is in the context of τ-decays with several mesons, including quark mass effects, of use to the BELLE experiment. All eight meson-meson scattering amplitudes have been checked. The Kaon-Compton amplitude has been checked and a minor error in the published results has been pointed out. This exercise is a tutorial-based one, wherein several input and output notebooks are also being made available as ancillary files on the arXiv. Some of the additional notebooks we provide contain explicit expressions that we have used for comparison with established results. The purpose is to encourage users to apply the software to suit their specific needs. An automatic amplitude generator of this type can provide error-free outputs that could be used as inputs for further simplification, and in varied scenarios such as applications of chiral perturbation theory at finite temperature, density and volume. This can also be used by students as a learning aid in low-energy hadron dynamics.
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Mcclane, D. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2016-03-01
In this report, Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for a series of simulated high level waste (HLW) glasses fabricated by Pacific Northwest National Laboratory (PNNL) as part of an ongoing nepheline crystallization study. The results of these analyses will be used to improve the ability to predict crystallization of nepheline as a function of composition and heat treatment for glasses formulated at high alumina concentrations.
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States); Mcclane, D. L. [Savannah River Site (SRS), Aiken, SC (United States)
2016-02-17
In this report, SRNL provides chemical analyses and Product Consistency Test (PCT) results for a series of simulated HLW glasses fabricated by Pacific Northwest National Laboratory (PNNL) as part of an ongoing nepheline crystallization study. The results of these analyses will be used to improve the ability to predict crystallization of nepheline as a function of composition and heat treatment for glasses formulated at high alumina concentrations.
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Riley, W. T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Best, D. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2015-09-03
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the January, March, and April 2015 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.
Kimbrel, Nathan A; Flynn, Elisa J; Carpenter, Grace Stephanie J; Cammarata, Claire M; Leto, Frank; Ostiguy, William J; Kamholz, Barbara W; Zimering, Rose T; Gulliver, Suzy B
2015-08-30
This study examined the psychometric properties of a Likert-based version of the Sources of Occupational Stress-14 (SOOS-14) scale. Internal consistency for the SOOS-14 ranged from 0.78 to 0.84, whereas three-month test-retest reliability was 0.51. In addition, SOOS-14 scores were prospectively associated with symptoms of PTSD and depression at a three-month follow-up assessment. Published by Elsevier Ireland Ltd.
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Best, D. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2015-07-07
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the August and October 2014 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.
Farooqi, Rahmat Ullah; Hrma, Pavel
2016-06-01
We have investigated the effect of Al/B ratio on the Product Consistency Test (PCT) response. In an aluminoborosilicate soda-lime glass based on a modified International Simple Glass, ISG-3, the Al/B ratio varied from 0 to 0.55 (in mole fractions). In agreement with various models of the PCT response as a function of glass composition, we observed a monotonic increase of B and Na releases with decreasing Al/B mole ratio, but only when the ratio was higher than 0.05. Below this value (Al/B < 0.05), we observed a sharp decrease that we attribute to B in tetrahedral coordination.
Ayadim, A; Amokrane, S [Physique des Liquides et Milieux Complexes, Faculte des Sciences et Technologie, Universite Paris-Est (Creteil), 61 Avenue du General de Gaulle, 94010 Creteil Cedex (France)
2010-01-27
The accuracy of the structural data obtained from the recently proposed generalization to non-additive hard-spheres (Schmidt 2004 J. Phys.: Condens. Matter 16 L351) of Rosenfeld's functional is investigated. The radial distribution functions computed from the direct correlation functions generated by the functional, through the Ornstein-Zernike equations, are compared with those obtained from the density profile equations in the test-particle limit, without and with test-particle consistency. The differences between these routes and the role of the optimization of the parameters of the reference system when the functional is used to obtain the reference bridge functional are discussed in the case of symmetric binary mixtures of non-additive hard-spheres. The case of highly asymmetric mixtures is finally briefly discussed.
Robin, C.; Pillet, N.; Peña Arteaga, D.; Berger, J.-F.
2016-02-01
Background: Although self-consistent multiconfiguration methods have been used for decades to address the description of atomic and molecular many-body systems, only a few trials have been made in the context of nuclear structure. Purpose: This work aims at the development of such an approach to describe in a unified way various types of correlations in nuclei in a self-consistent manner where the mean-field is improved as correlations are introduced. The goal is to reconcile the usually set-apart shell-model and self-consistent mean-field methods. Method: This approach is referred to as "variational multiparticle-multihole configuration mixing method." It is based on a double variational principle which yields a set of two coupled equations that determine at the same time the expansion coefficients of the many-body wave function and the single-particle states. The solution of this problem is obtained by building a doubly iterative numerical algorithm. Results: The formalism is derived and discussed in a general context, starting from a three-body Hamiltonian. Links to existing many-body techniques such as the formalism of Green's functions are established. First applications are done using the two-body D1S Gogny effective force. The numerical procedure is tested on the 12C nucleus to study the convergence features of the algorithm in different contexts. Ground-state properties as well as single-particle quantities are analyzed, and the description of the first 2+ state is examined. Conclusions: The self-consistent multiparticle-multihole configuration mixing method is fully applied for the first time to the description of a test nucleus. This study makes it possible to validate our numerical algorithm and leads to encouraging results. To test the method further, we will realize in the second article of this series a systematic description of more nuclei and observables obtained by applying the newly developed numerical procedure with the same Gogny force. As
Cole, Jason C; Ito, Diane; Chen, Yaozhu J; Cheng, Rebecca; Bolognese, Jennifer; Li-McLeod, Josephine
2014-09-04
There is a lack of validated instruments to measure the level of burden of Alzheimer's disease (AD) on caregivers. The Impact of Alzheimer's Disease on Caregiver Questionnaire (IADCQ) is a 12-item instrument with a seven-day recall period that measures AD caregivers' burden across emotional, physical, social, financial, sleep, and time aspects. The primary objectives of this study were to evaluate the psychometric properties of the IADCQ administered on the Web and to determine the most appropriate scoring algorithm. A national sample of 200 unpaid AD caregivers participated in this study by completing the Web-based version of the IADCQ and the Short Form-12 Health Survey Version 2 (SF-12v2™). The SF-12v2 was used to measure convergent validity of IADCQ scores and to provide an understanding of the overall health-related quality of life of sampled AD caregivers. The IADCQ survey was also completed four weeks later by a randomly selected subgroup of 50 participants to assess test-retest reliability. Confirmatory factor analysis (CFA) was implemented to test the dimensionality of the IADCQ items. Classical item-level and scale-level psychometric analyses were conducted to estimate psychometric characteristics of the instrument. Test-retest reliability was performed to evaluate the instrument's stability and consistency over time. Virtually none (2%) of the respondents had either floor or ceiling effects, indicating the IADCQ covers an ideal range of burden. A single-factor model obtained appropriate goodness of fit and provided evidence that a simple sum score of the 12 IADCQ items can be used to measure AD caregivers' burden. Scale-level reliability was supported with a coefficient alpha of 0.93 and an intra-class correlation coefficient (for test-retest reliability) of 0.68 (95% CI: 0.50-0.80). Low-to-moderate negative correlations were observed between the IADCQ and scales of the SF-12v2. The study findings suggest the IADCQ has appropriate psychometric characteristics as a
Sharmila Vaz
Full Text Available The Social Skills Rating System (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States (US) samples are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187) from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study findings support the use of multiple informants (e.g. teacher and parent reports, not just student reports), as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective minimum clinically important difference (MCID).
Sheahan, Peter J; Nelson-Wong, Erika J; Fischer, Steven L
2015-01-01
The Oswestry Disability Index (ODI) is a self-report-based outcome measure used to quantify the extent of disability related to low back pain (LBP), a substantial contributor to workplace absenteeism. The ODI tool has been adapted for use by patients in several non-English speaking nations. It is unclear, however, if these adapted versions of the ODI are as credible as the original ODI developed for English-speaking nations. The objective of this study was to conduct a review of the literature to identify culturally adapted versions of the ODI and to report on the adaptation process, construct validity, test-retest reliability and internal consistency of these ODIs. Following a pragmatic review process, data were extracted from each study with regard to these four outcomes. While most studies applied adaptation processes in accordance with best-practice guidelines, there were some deviations. However, all studies reported high-quality psychometric properties: group mean construct validity was 0.734 ± 0.094 (indicated via a correlation coefficient), test-retest reliability was 0.937 ± 0.032 (indicated via an intraclass correlation coefficient) and internal consistency was 0.876 ± 0.047 (indicated via Cronbach's alpha). Researchers can be confident when using any of these culturally adapted ODIs, or when comparing and contrasting results between cultures where these versions were employed. Implications for Rehabilitation Low back pain is the second leading cause of disability in the world, behind only cancer. The Oswestry Disability Index (ODI) has been developed as a self-report outcome measure of low back pain for administration to patients. An understanding of the various cross-cultural adaptations of the ODI is important for more concerted multi-national research efforts. This review examines 16 cross-cultural adaptations of the ODI and should inform the work of health care and rehabilitation professionals.
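Cronbach's alpha, the internal-consistency statistic cited above, is straightforward to compute from an item-score matrix; a Python sketch with an invented respondents-by-items matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical scores: 4 respondents x 3 items (illustrative only).
scores = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 3],
]
a = cronbach_alpha(scores)
```

Values near the 0.876 group mean reported above indicate that the items covary strongly relative to their individual noise, the usual criterion for treating a summed score as reliable.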
Miller, William; Liu, Jian; Miller, William H.
2008-03-15
The linearized approximation to the semiclassical initial value representation (LSC-IVR) is used to calculate time correlation functions relevant to the incoherent dynamic structure factor for inelastic neutron scattering from liquid para-hydrogen at 14 K. Various time correlation functions were used which, if evaluated exactly, would give identical results, but they do not because the LSC-IVR is approximate. Some of the correlation functions involve only linear operators, and others involve nonlinear operators. The consistency of the results obtained with the various time correlation functions thus provides a useful test of the accuracy of the LSC-IVR approximation and its ability to treat correlation functions involving both linear and nonlinear operators in realistic anharmonic systems. The good agreement of the results obtained from different correlation functions, their excellent behavior in the spectral moment tests based on the exact moment constraints, and their semi-quantitative agreement with the inelastic neutron scattering experimental data all suggest that the LSC-IVR is indeed a good short-time approximation for quantum mechanical correlation functions.
Miehls, Scott M.; Johnson, Nicholas S.; Hrodey, Pete J.
2017-01-01
Control of the invasive Sea Lamprey Petromyzon marinus is critical for management of commercial and recreational fisheries in the Laurentian Great Lakes. Use of physical barriers to block Sea Lampreys from spawning habitat is a major component of the control program. However, the resulting interruption of natural streamflow and blockage of nontarget species present substantial challenges. Development of an effective nonphysical barrier would aid the control of Sea Lampreys by eliminating their access to spawning locations while maintaining natural streamflow. We tested the effect of a nonphysical barrier consisting of strobe lights, low-frequency sound, and a bubble screen on the movement of Sea Lampreys in an experimental raceway designed as a two-choice maze with a single main channel fed by two identical inflow channels (one control and one blocked). Sea Lampreys were more likely to move upstream during trials when the strobe light and low-frequency sound were active compared with control trials and trials using the bubble screen alone. For those Sea Lampreys that did move upstream to the confluence of inflow channels, no combination of stimuli or any individual stimulus significantly influenced the likelihood that Sea Lampreys would enter the blocked inflow channel, enter the control channel, or return downstream.
Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States)
2015-12-01
In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for 14 simulated high level waste glasses fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions. The measured chemical composition data are reported and compared with the targeted values for each component for each glass. All of the measured sums of oxides for the study glasses fell within the interval of 96.9 to 100.8 wt %, indicating recovery of all components. Comparisons of the targeted and measured chemical compositions showed that the measured values for the glasses met the targeted concentrations within 10% for those components present at more than 5 wt %. The PCT results were normalized to both the targeted and measured compositions of the study glasses. Several of the glasses exhibited increases in normalized concentrations (NC_i) after the canister centerline cooled (CCC) heat treatment. Five of the glasses, after the CCC heat treatment, had NC_B values that exceeded that of the Environmental Assessment (EA) benchmark glass. These results can be combined with additional characterization, including X-ray diffraction, to determine the cause of the higher release rates.
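The PCT normalized concentration reported above divides the leachate concentration of an element by that element's mass fraction in the glass. A minimal sketch of this standard normalization (the numbers below are illustrative only, not values from the report):

```python
def normalized_concentration(c_mg_per_L, elem_mass_fraction):
    """Normalized PCT release NC_i in g/L: leachate concentration of
    element i (mg/L, converted to g/L) divided by the mass fraction of
    that element in the glass (g element per g glass)."""
    return (c_mg_per_L / 1000.0) / elem_mass_fraction

# Illustrative: a glass containing 5 wt % boron whose PCT leachate
# holds 20 mg/L boron gives a normalized boron release of 0.4 g/L.
nc_b = normalized_concentration(20.0, 0.05)
```

A higher NC_i after the CCC heat treatment, as seen for several glasses here, indicates increased elemental release relative to the glass inventory.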
Katz, Harley; Lelli, Federico; McGaugh, Stacy S.; Di Cintio, Arianna; Brook, Chris B.; Schombert, James M.
2017-04-01
Cosmological N-body simulations predict dark matter (DM) haloes with steep central cusps (e.g. NFW). This contradicts observations of gas kinematics in low-mass galaxies that imply the existence of shallow DM cores. Baryonic processes such as adiabatic contraction and gas outflows can, in principle, alter the initial DM density profile, yet their relative contributions to the halo transformation remain uncertain. Recent high-resolution, cosmological hydrodynamic simulations by Di Cintio et al. (DC14) predict that inner density profiles depend systematically on the ratio of stellar-to-DM mass (M*/Mhalo). Using a Markov Chain Monte Carlo approach, we test the NFW and the M*/Mhalo-dependent DC14 halo models against a sample of 147 galaxy rotation curves from the new Spitzer Photometry and Accurate Rotation Curves data set. These galaxies all have extended H I rotation curves from radio interferometry as well as accurate stellar-mass-density profiles from near-infrared photometry. The DC14 halo profile provides markedly better fits to the data compared to the NFW profile. Unlike NFW, the DC14 halo parameters found in our rotation-curve fits naturally fall within two standard deviations of the mass-concentration relation predicted by Λ cold dark matter (ΛCDM) and the stellar mass-halo mass relation inferred from abundance matching with few outliers. Halo profiles modified by baryonic processes are therefore more consistent with expectations from ΛCDM cosmology and provide better fits to galaxy rotation curves across a wide range of galaxy properties than do halo models that neglect baryonic physics. Our results offer a solution to the decade-long cusp-core discrepancy.
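The NFW profile tested above implies a specific circular-velocity curve. As a sketch (the standard NFW rotation-curve formula, with halo parameters chosen purely for illustration), the velocity contribution of an NFW halo can be written as:

```python
import numpy as np

def v_circ_nfw(r, r200, c, v200):
    """Circular velocity of an NFW halo at radius r (same units as v200),
    given the virial radius r200, concentration c and virial velocity v200."""
    x = r / r200
    mu = lambda y: np.log(1.0 + y) - y / (1.0 + y)  # enclosed-mass shape factor
    return v200 * np.sqrt(mu(c * x) / (x * mu(c)))

# By construction the curve reaches v200 at the virial radius (x = 1).
v = v_circ_nfw(200.0, 200.0, 10.0, 150.0)  # kpc, kpc, -, km/s (illustrative)
```

Rotation-curve fits of the kind described in the abstract compare such model curves (NFW or the mass-dependent DC14 variant) with the observed H I velocities at each radius.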
Elizângela Moreira Careta Galindo
2007-02-01
This study aimed to translate, adapt and validate the Eating Behaviours and Body Image Test for use with children in a city in upstate São Paulo, Brazil. Study subjects were 261 female students aged 9 to 12 years. The internal consistency of the instrument was evaluated by means of factorial analysis with varimax rotation, carried out with the Statistical Package for the Social Sciences, version 10.0, which revealed two factors. Internal consistency was adequate for the total instrument (Cronbach's alpha = 0.89), and the values for the two factors were also satisfactory (alpha = 0.90 and alpha = 0.80, respectively), demonstrating that the Eating Behaviours and Body Image Test is useful for early evaluation, screening for attitudes that may indicate eating behavior disorders. The psychometric characteristics of the original instrument were maintained.
Network Consistent Data Association.
Chakraborty, Anirban; Das, Abir; Roy-Chowdhury, Amit K
2016-09-01
Existing data association techniques mostly focus on matching pairs of data-point sets and then repeating this process along space-time to achieve long-term correspondences. However, in many problems such as person re-identification, a set of data-points may be observed at multiple spatio-temporal locations and/or by multiple agents in a network, and simply combining the local pairwise association results between sets of data-points often leads to inconsistencies over the global space-time horizons. In this paper, we propose a novel Network Consistent Data Association (NCDA) framework formulated as an optimization problem that not only maintains consistency in association results across the network, but also improves the pairwise data association accuracies. The proposed NCDA can be solved as a binary integer program leading to a globally optimal solution and is capable of handling the challenging data-association scenario where the number of data-points varies across different sets of instances in the network. We also present an online implementation of the NCDA method that can dynamically associate new observations to already observed data-points in an iterative fashion, while maintaining network consistency. We have tested both the batch and the online NCDA in two application areas, person re-identification and spatio-temporal cell tracking, and observed consistent and highly accurate data association results in all cases.
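The inconsistency that NCDA targets can be illustrated with a toy loop test: pairwise association matrices between three cameras are network-consistent only if chaining camera 1 → 2 → 3 agrees with the direct 1 → 3 associations. A minimal sketch with hypothetical matrices (the actual NCDA framework solves a binary integer program that enforces such constraints network-wide, which this check does not do):

```python
import numpy as np

# Toy pairwise association matrices (entry [i, j] = 1 means "same person").
A12 = np.array([[1, 0], [0, 1]])              # camera 1 -> camera 2
A23 = np.array([[0, 1], [1, 0]])              # camera 2 -> camera 3
A13_bad = np.array([[1, 0], [0, 1]])          # camera 1 -> camera 3 (contradictory)

def loop_consistent(A12, A23, A13):
    """A triplet of pairwise associations is loop-consistent when following
    camera 1 -> 2 -> 3 yields exactly the direct 1 -> 3 associations."""
    chained = (A12 @ A23 > 0).astype(int)
    return np.array_equal(chained, A13)

print(loop_consistent(A12, A23, A13_bad))  # False: the loop contradicts itself
```

Locally, each pairwise matching above may look plausible; only the network-level check exposes the contradiction, which is the core observation motivating NCDA.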
A Magnetic Consistency Relation
Jain, Rajeev Kumar
2012-01-01
If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the Cosmic Microwave Background anisotropies and Large Scale Structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.
Consistent wind Facilitates Vection
Masaki Ogawa
2011-10-01
We examined whether a consistent haptic cue suggesting forward self-motion facilitated vection. We used a bladeless fan (Dyson AM01) providing a wind of constant strength and direction (wind speed 6.37 m/s) to the subjects' faces, with the visual stimuli visible through the fan. We used an optic flow of expansion or contraction created by positioning 16,000 dots at random inside a simulated cube (length 20 m) and moving the observer's viewpoint to simulate forward or backward self-motion of 16 m/s. We tested three conditions for fan operation: normal operation, normal operation with the fan reversed (i.e., no wind), and no operation (no wind and no sound). Vection was facilitated by the wind (shorter latency, longer duration and larger magnitude values) with the expansion stimuli. The fan noise did not facilitate vection. The wind neither facilitated nor inhibited vection with the contraction stimuli, perhaps because a headwind is not consistent with backward self-motion. We speculate that consistency between modalities is a key factor in facilitating vection.
The internal consistency of the Test of Variables of Attention (TOVA) was examined in a cohort of 6- to 12-year-old children (N = 63) strictly diagnosed with ADHD. The internal consistency of errors of omission (OMM), errors of commission (COM), response time (RT), and response time variability (RTV...
Robin, C; Arteaga, D Peña; Berger, J -F
2015-01-01
Although self-consistent multi-configuration methods have been used for decades to describe atomic and molecular many-body systems, only a few attempts have been made in the context of nuclear structure. This work aims at developing such an approach to describe, in a unified way, various types of correlations in nuclei, in a self-consistent manner where the mean field is improved as correlations are introduced. The goal is to reconcile the usually separate Shell-Model and Self-Consistent Mean-Field methods. This approach is referred to as the "variational multiparticle-multihole configuration mixing method". It is based on a double variational principle which yields a set of two coupled equations that determine at the same time the expansion coefficients of the many-body wave function and the single-particle states. The formalism is derived and discussed in a general context, starting from a three-body Hamiltonian. Links to existing many-body techniques such as the formalism of Green's functions...
Dai, Junyi; Kerestes, Rebecca; Upton, Daniel J.; Busemeyer, Jerome R.; Stout, Julie C.
2015-01-01
The Iowa Gambling Task (IGT) and the Soochow Gambling Task (SGT) are two experience-based risky decision-making tasks for examining decision-making deficits in clinical populations. Several cognitive models, including the expectancy-valence learning (EVL) model and the prospect valence learning (PVL) model, have been developed to disentangle the motivational, cognitive, and response processes underlying the explicit choices in these tasks. The purpose of the current study was to develop an improved model that can fit empirical data better than the EVL and PVL models and, in addition, produce more consistent parameter estimates across the IGT and SGT. Twenty-six opiate users (mean age 34.23; SD 8.79) and 27 control participants (mean age 35; SD 10.44) completed both tasks. Eighteen cognitive models varying in evaluation, updating, and choice rules were fit to individual data and their performances were compared to that of a statistical baseline model to find a best fitting model. The results showed that the model combining the prospect utility function treating gains and losses separately, the decay-reinforcement updating rule, and the trial-independent choice rule performed the best in both tasks. Furthermore, the winning model produced more consistent individual parameter estimates across the two tasks than any of the other models. PMID:25814963
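The ingredients of the winning model can be sketched as follows. This is a hedged illustration of the PVL model family (prospect utility, decay-reinforcement updating, trial-independent softmax choice) with made-up parameter values; note that the best-fitting model in the study applies the utility function to gains and losses separately, which this sketch simplifies to a single net payoff:

```python
import numpy as np

def pvl_decay_update(E, choice, payoff, alpha=0.5, lam=2.0, decay=0.8):
    """One learning step: prospect utility of the payoff (loss-averse,
    diminishing sensitivity), then decay-reinforcement update of all
    deck expectancies. Parameter values here are illustrative only."""
    u = payoff**alpha if payoff >= 0 else -lam * (-payoff)**alpha
    E = decay * E                  # all expectancies decay each trial
    E[choice] = E[choice] + u      # the chosen deck is reinforced by u
    return E

def choice_probs(E, theta=1.0):
    """Trial-independent softmax choice rule: sensitivity theta is constant
    across trials (rather than growing or shrinking with trial number)."""
    z = np.exp(theta * (E - E.max()))  # shift by max for numerical stability
    return z / z.sum()

# One trial on a four-deck task: deck 0 pays a net +100.
E = pvl_decay_update(np.zeros(4), choice=0, payoff=100.0)
p = choice_probs(E)  # deck 0 is now the most likely choice
```

Model comparison of the kind described above fits such update and choice rules to each participant's trial-by-trial choices and compares fit against a statistical baseline model.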
Crocetti, Emanuele; Caldarella, Adele; Ferretti, Stefano; Ardanaz, Eva; Arveux, Patrick; Bara, Simona; Barrios, Enrique; Bento, Maria J; Bordoni, Andrea; Buzzoni, Carlotta; Candela, Giuseppina; Colombani, Françoise; Delafosse, Patricia; Federico, Massimo; Francart, Julie; Giacomin, Adriano; Grosclaude, Pascale; Guizard, Anne V; Izarzugaza, Isabel; Konzelmann, Isabelle; La Rosa, Francesco; Lapotre, Benedicte; Leone, Nathalie; Ligier, Karine; Mangone, Lucia; Marcos-Gragera, R; Martinez, Ruth; Michelena, Maria J; Michiara, Maria; Miranda, Ana; Molinié, Florence; Mugarza-Gomez, Conception; Paci, Eugenio; Piffer, Silvano; Puig-Vives, Montserrat; Sacchettini, Claudio; Sánchez, Maria J; Traina, Adele; Tretarre, Brigitte; Tumino, Rosario; Van Vaerenbergh, Elke; Velten, Michel; Woronoff, Anne S
2013-08-01
Biological markers are crucial factors in order to differentiate female breast cancers and to determine the right therapy. This study aims at evaluating whether testing for biomarkers for female breast cancer has similar frequency and characteristics across and within countries. Population-based cancer registries of the Association for cancer registration and epidemiology in Romance language countries (GRELL) were asked to complete a questionnaire on biomarkers testing. The data collected referred to invasive female breast cancer cases diagnosed between 2004 and 2009. The investigation focused on 1) the overexpression and amplification of the human epidermal growth factor receptor 2 oncogene (HER2); 2) the expression of oestrogen (ER) and progesterone (PgR) receptors; and 3) the proliferation index (PI). Weighted percentages, the heterogeneity among and within countries, and the correlation between responses and calendar years were evaluated. The study was based on 19,644 breast cancers. Overall, 85.9% of the cases were tested for HER2, 91.8% for both ER and PgR, and 74.1% for proliferative markers. For HER2 and ER-PgR, the frequency of testing increased from 2004 to 2009. Testing varied among countries (HER2 from 82.0% to 95.9%, ER-PgR from 89.3% to 98.9%, PI from 10% to 92%) and also within the same country (e.g. HER2 in Italy from 51% to 99%) as well as within single cancer registries. The most relevant differences were in the scores for positive/negative/not clearly defined HER2 (e.g. HER2 was defined positive if IHC 3+ in 21/33 registries), and in the cut-off of positive cells for ER/PgR (from >0% to >30%) and PI positivity (from >0% to >20%). Biological markers are widely tested in the Romance language countries; however, the parameters defining their positivity may vary, raising concerns about homogeneity in breast cancer classification and treatment. Copyright © 2013 Elsevier Ltd. All rights reserved.
Fehrenbacher, Anne E; Chowdhury, Debasish; Ghose, Toorjo; Swendeman, Dallas
2016-10-01
Consistent condom use (CCU) is the primary HIV/STI prevention option available to sex workers globally but may be undermined by economic insecurity, life-course vulnerabilities, behavioral factors, disempowerment, or lack of effective interventions. This study examines predictors of CCU in a random household survey of brothel-based female sex workers (n = 200) in two neighborhoods served by Durbar (the Sonagachi Project) in Kolkata, India. Multivariate logistic regression analyses indicated that CCU was significantly associated with perceived HIV risk, community mobilization participation, working more days in sex work, and higher proportion of occasional clients to regular clients. Exploratory analyses stratifying by economic insecurity indicators (i.e., debt, savings, income, housing security) indicate that perceived HIV risk and community mobilization were only associated with CCU for economically secure FSW. Interventions with FSW must prioritize economic security and access to social protections as economic insecurity may undermine the efficacy of more direct condom use intervention strategies.
Linstädter, Anja; Schellberg, Jürgen; Brüser, Katharina; Moreno García, Cristian A.; Oomen, Roelof J.; du Preez, Chris C.; Ruppert, Jan C.; Ewert, Frank
2014-01-01
Despite our growing knowledge on plants’ functional responses to grazing, there is no consensus if an optimum level of functional aggregation exists for detecting grazing effects in drylands. With a comparative approach we searched for plant functional types (PFTs) with a consistent response to grazing across two areas differing in climatic aridity, situated in South Africa’s grassland and savanna biomes. We aggregated herbaceous species into PFTs, using hierarchical combinations of traits (from single- to three-trait PFTs). Traits relate to life history, growth form and leaf width. We first confirmed that soil and grazing gradients were largely independent from each other, and then searched in each biome for PFTs with a sensitive response to grazing, avoiding confounding with soil conditions. We found no response consistency, but biome-specific optimum aggregation levels. Three-trait PFTs (e.g. broad-leaved perennial grasses) and two-trait PFTs (e.g. perennial grasses) performed best as indicators of grazing effects in the semi-arid grassland and in the arid savanna biome, respectively. Some PFTs increased with grazing pressure in the grassland, but decreased in the savanna. We applied biome-specific grazing indicators to evaluate if differences in grazing management related to land tenure (communal versus freehold) had effects on vegetation. Tenure effects were small, which we mainly attributed to large variability in grazing pressure across farms. We conclude that the striking lack of generalizable PFT responses to grazing is due to a convergence of aridity and grazing effects, and unlikely to be overcome by more refined classification approaches. Hence, PFTs with an opposite response to grazing in the two biomes rather have a unimodal response along a gradient of additive forces of aridity and grazing. The study advocates for hierarchical trait combinations to identify localized indicator sets for grazing effects. Its methodological approach may also be
Chip Multithreaded Consistency Model
Zu-Song Li; Dan-Dan Huan; Wei-Wu Hu; Zhi-Min Tang
2008-01-01
Multithreading is the developing trend in high-performance processors, and the memory consistency model is essential to the correctness, performance and complexity of a multithreaded processor. This paper proposes a chip multithreaded consistency model adapted to multithreaded processors. The restriction imposed on memory event ordering by chip multithreaded consistency is presented and formalized. Using the notion of the critical cycle introduced by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the correctness criterion of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving higher performance than sequential consistency while ensuring software compatibility: the execution result on a multithreaded processor is the same as the execution result on a uniprocessor. An implementation strategy for the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed; Godson-2 supports the model through an exception scheme based on the sequential memory access queue of each thread.
du Plooy, Christopher; Thomas, Kevin G F; Henry, Michelle; Human, Robyn; Jacobs, W Jake
2014-06-01
We describe a method to administer a controlled, effective stressor to humans in the laboratory. The method combines the Trier Social Stress Test (TSST) and the Cold Pressor Test into a single, believable procedure called the Fear-Factor Stress Test (FFST). In the procedure, participants imagine auditioning for the reality television show Fear Factor. They stand before a video recorder and a panel of judges while (a) delivering a motivational speech, (b) performing a verbal arithmetic task, and (c) placing one hand into a bucket of ice water for up to 2 min. We measured subjective anxiety, heart rate, and salivary cortisol in three groups of young adults (n = 30 each, equal numbers of men and women): FFST, TSST, and Control (a placebo version of the FFST). Although the FFST and TSST groups were not distinguishable at the cortisol measure taken 5 min post-manipulation, at 35 min post-manipulation average cortisol levels in the TSST group had returned to baseline, whereas those in the FFST group continued to rise. The proportion of individual cortisol responders (≥ 2 nmol/l increase over baseline) in the TSST and FFST groups did not differ at the 5-min measure, but at the 35-min measure the FFST group contained significantly more responders. The findings indicate that the FFST induces a more robust and sustained cortisol response (which we assume is a marker of an HPA-axis response) than the TSST, and that it does so without increasing participant discomfort or incurring appreciably greater resource and time costs.
温忠麟; 叶宝娟
2011-01-01
In the research of psychology and other social sciences, test reliability is often used to reflect measurement stability and consistency. Coefficient α is the most popular indicator of test reliability. In recent years, however, coefficient α has been challenged repeatedly. Is coefficient α still recommended for evaluating test reliability? If not, what should replace it? With the classical concept of reliability, defined as the ratio of true variance to observed variance on the test under consideration, we introduced the relationship between test reliability and coefficient α, and the limitations of coefficient α. Concepts closely related to coefficient α were then considered. We clearly defined homogeneity reliability and internal consistency reliability: homogeneity reflects the presence of a general factor, whereas internal consistency relates to the presence of common factors (including a general factor and local factors). For unidimensional tests, homogeneity and internal consistency are the same concept. Investigating the relationship between test reliability, coefficient α, homogeneity reliability, and internal consistency reliability, we showed under quite general conditions that coefficient α and homogeneity reliability are not larger than internal consistency reliability, and that the latter is not larger than test reliability, so internal consistency reliability is the closer approximation to test reliability. We summarize a workflow for test reliability analysis that indicates when coefficient α still has reference value, and when it is no longer applicable and internal consistency reliability (often termed composite reliability in the literature) should be used instead. Computing programs for homogeneity reliability and internal consistency reliability are provided, which applied researchers can use directly.
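For reference, coefficient α and a composite (internal consistency) reliability estimate from a one-factor model can be computed as below. A minimal sketch; the function names and toy inputs are illustrative, not the programs referred to in the abstract:

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_persons, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

def composite_reliability(loadings, error_vars):
    """Composite (congeneric) reliability from the standardized factor
    loadings and error variances of a one-factor measurement model."""
    s = np.asarray(loadings, dtype=float).sum() ** 2
    return s / (s + np.asarray(error_vars, dtype=float).sum())

# Toy example: three items with equal standardized loadings of 0.7.
cr = composite_reliability([0.7, 0.7, 0.7], [0.51, 0.51, 0.51])
```

With equal loadings the items are essentially tau-equivalent, the case in which α matches composite reliability; when loadings differ, α understates it, which is the argument for preferring the composite estimate.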
Katz, Harley; McGaugh, Stacy S; Di Cintio, Arianna; Brook, Chris B; Schombert, James M
2016-01-01
Cosmological N-body simulations predict dark matter (DM) haloes with steep central cusps (e.g. NFW), which contradicts observations of gas kinematics in low mass galaxies that imply the existence of shallow DM cores. Baryonic processes such as adiabatic contraction and gas outflows can, in principle, alter the initial DM density profile, yet their relative contributions to the halo transformation remain uncertain. Recent high resolution, cosmological hydrodynamic simulations (Di Cintio et al. 2014, DC14) predict that inner density profiles depend systematically on the ratio of stellar to DM mass (M*/Mhalo). Using a Markov Chain Monte Carlo approach, we test the NFW and the M*/Mhalo-dependent DC14 halo models against a sample of 147 galaxy rotation curves from the new SPARC data set. These galaxies all have extended HI rotation curves from radio interferometry as well as accurate stellar mass density profiles from near-infrared photometry. The DC14 halo profile provides markedly better ...
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of MDA is to produce software systems from abstract models with human interaction restricted to a minimum. These abstract models are based on the UML language, but the semantics of UML models is defined in natural language, so verification of the consistency of these diagrams is needed in order to identify requirement errors at an early stage of the development process. This verification is difficult owing to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Our method can therefore be used to check the practicability (feasibility) of software architecture models.
No consistent bimetric gravity?
Deser, S; Waldron, A
2013-01-01
We discuss the prospects for a consistent, nonlinear, partially massless (PM), gauge symmetry of bimetric gravity (BMG). Just as for single metric massive gravity, ultimate consistency of both BMG and the putative PM BMG theory relies crucially on this gauge symmetry. We argue, however, that it does not exist.
Akkermans AM; Hendriksen CFM; Marsman FR; de Jong WH; van de Donk HJM
1993-01-01
A single-dilution assay can be a valid procedure to demonstrate that a product exceeds the minimal requirement given for potency provided that consistency in production and testing has been proven. Information is presented justifying the use of a single dilution assay based upon quantitative respon
Hiscock, S.
1986-07-01
The importance of consistency in coal quality has become of increasing significance recently, with the current trend towards using coal from a range of sources. A significant development has been the swing in responsibilities for coal quality. The increasing demand for consistency in quality has led to a re-examination of where in the trade and transport chain the quality should be assessed and where further upgrading of inspection and preparation facilities are required. Changes are in progress throughout the whole coal transport chain which will improve consistency of delivered coal quality. These include installation of beneficiation plant at coal mines, export terminals, and on the premises of end users. It is suggested that one of the keys to success for the coal industry will be the ability to provide coal of a consistent quality.
Kent, A
1996-01-01
In the consistent histories formulation of quantum theory, the probabilistic predictions and retrodictions made from observed data depend on the choice of a consistent set. We show that this freedom allows the formalism to retrodict several contradictory propositions which correspond to orthogonal commuting projections and which all have probability one. We also show that the formalism makes contradictory probability one predictions when applied to generalised time-symmetric quantum mechanics.
周宇豪; 许金芳; 贺佳
2011-01-01
Objective: To evaluate different methods of assessing the consistency of diagnostic tests. Methods: Several assessment methods were applied to a real example with good consistency and to three simulated situations derived from it. Results: All methods except the paired t test showed good consistency for the real example. Simple correlation analysis was insensitive in the situation of obvious systematic bias. In the situation with comparatively large random error, where only low consistency should be accepted, the paired t test suggested better consistency. In the situation with a narrow measurement range, where both systematic bias and random error were small, intra-class correlation analysis failed to demonstrate accurate results. Conclusion: Both the paired t test and simple correlation analysis have obvious defects in assessing consistency. Although intra-class correlation analysis has some limitations, it takes account of systematic and random bias. The Bland-Altman method and ATE/LER zones are recommended for evaluating consistency.
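The Bland-Altman analysis recommended in this abstract is a short computation: the bias is the mean paired difference and the 95% limits of agreement are bias ± 1.96 SD. A minimal sketch on invented paired readings (the function name and data are illustrative, not from the study):

```python
import numpy as np

def bland_altman(x, y):
    """Bias and 95% limits of agreement between two paired measurement series."""
    diff = np.asarray(x, float) - np.asarray(y, float)
    bias = diff.mean()              # systematic bias (mean difference)
    sd = diff.std(ddof=1)           # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired readings from two assays (invented numbers)
a = [10.1, 9.8, 10.4, 10.0, 9.9, 10.2]
b = [10.0, 9.9, 10.1, 10.2, 9.7, 10.3]
bias, lo, hi = bland_altman(a, b)   # agreement holds if most diffs fall in [lo, hi]
```

In practice the differences are also plotted against the pairwise means to reveal any trend of disagreement with magnitude, which a paired t test cannot show.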
Design of consistency testing of ECU diagnostic function based on CAN bus
程安宇; 赵恬; 申帅
2013-01-01
In order to ensure the correctness and stability of ECU diagnostic functions based on the CAN bus, a test method for the diagnostic function is proposed on the basis of an analysis of the CAN bus diagnostic system. According to the CAN bus diagnostic protocol ISO 15765, a test scheme and test cases are designed from a black-box testing perspective. Finally, a test platform is built on the CANoe bus analysis and testing software, and the consistency test of the ECU diagnostic function is carried out. The test results verify that the method can effectively test the consistency of the ECU diagnostic function with the ISO 15765 diagnostic protocol, showing that the diagnostic test method is feasible.
Consistency of students' arguments about fluids
Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma
2017-01-01
Problem solving for physics concepts through consistent argumentation is an important skill in science and can improve students' thinking skills. The study aims to assess the consistency of students' argumentation on fluid material. The population of this study comprises college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta, and Lampung University. Cluster random sampling yielded a sample of 145 students. The study used a descriptive survey method; data were obtained through a reasoned multiple-choice test and interviews, with the fluid problems modified from [9] and [1]. On average, 4.85% of the argumentation was consistent and correct, 29.93% was consistent but wrong, and 65.23% was inconsistent. These results reflect a lack of understanding of the fluid material, since fully consistent argumentation ideally accompanies an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies in order to obtain a positive change in the consistency of argumentation.
Thomsen, Christa; Nielsen, Anne Ellerup
2006-01-01
This chapter first outlines theory and literature on CSR and stakeholder relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...
Consistency in Distributed Systems
Kemme, Bettina; Ramalingam, Ganesan; Schiper, André; Shapiro, Marc; Vaswani, Kapil
2013-01-01
International audience; In distributed systems, there exists a fundamental trade-off between data consistency, availability, and the ability to tolerate failures. This trade-off has significant implications on the design of the entire distributed computing infrastructure such as storage systems, compilers and runtimes, application development frameworks and programming languages. Unfortunately, it also has significant, and poorly understood, implications for the designers and developers of en...
Geometrically Consistent Mesh Modification
Bonito, A.
2010-01-01
A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.
Infanticide and moral consistency.
McMahan, Jeff
2013-05-01
The aim of this essay is to show that there are no easy options for those who are disturbed by the suggestion that infanticide may on occasion be morally permissible. The belief that infanticide is always wrong is doubtfully compatible with a range of widely shared moral beliefs that underlie various commonly accepted practices. Any set of beliefs about the morality of abortion, infanticide and the killing of animals that is internally consistent and even minimally credible will therefore unavoidably contain some beliefs that are counterintuitive.
Serfon, Cedric; The ATLAS collaboration
2016-01-01
One of the biggest challenges with large-scale data management systems is ensuring consistency between the global file catalog and what is physically stored on all storage elements. To tackle this issue, the Rucio software used by the ATLAS Distributed Data Management system has been extended to automatically handle lost or unregistered files (aka dark data). The system automatically detects these inconsistencies and takes actions, such as recovery or deletion of unneeded files, in a central manner. In this talk, we present this system, explain its internals, and give some results.
When is holography consistent?
McInnes, Brett, E-mail: matmcinn@nus.edu.sg [National University of Singapore (Singapore); Ong, Yen Chin, E-mail: yenchin.ong@nordita.org [Nordita, KTH Royal Institute of Technology and Stockholm University, Roslagstullsbacken 23, SE-106 91 Stockholm (Sweden)
2015-09-15
Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognized; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, is satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold and, second, in the presence of angular momentum. Focusing on the application of holography to the quark–gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur. This suggests that the consistency condition is a “law of physics” expressing a particular aspect of holography.
丁树良; 毛萌萌; 汪文义; 罗芬; CUI Ying
2012-01-01
Attributes and their hierarchy can represent a cognitive model. Building a correct cognitive model is one of the key steps in cognitive diagnosis, as it is directly related to the validity and usefulness of test results; it is very important to detect whether the test specification coincides with the cognitive model before administering the test. In this paper, an explicit index is given to measure the extent to which the cognitive model is represented by the items of a diagnostic test. We call this index the theoretical construct validity (TCV). In terms of the TCV, the test reported by Tatsuoka and her colleagues (1988) is reanalyzed, and its TCV is only 9/24: 24 knowledge states are obtained from the theoretical cognitive model, but the test specification can distinguish only 9 knowledge states. Cui and her colleagues established a person-fit index, the hierarchy consistency index (HCI), to detect the fit of an examinee's observed response pattern (ORP) to an expected response pattern (ERP). The HCI is not well defined when an examinee has mastered only one attribute, only one item in the test measures that attribute, and the examinee answers it correctly: the number of comparisons, and hence the denominator, is zero. In addition, the HCI accounts for slipping only. Combining slipping and guessing in the response to...
Consistent quantum measurements
Griffiths, Robert B.
2015-11-01
In response to recent criticisms by Okon and Sudarsky, various aspects of the consistent histories (CH) resolution of the quantum measurement problem(s) are discussed using a simple Stern-Gerlach device, and compared with the alternative approaches to the measurement problem provided by spontaneous localization (GRW), Bohmian mechanics, many worlds, and standard (textbook) quantum mechanics. Among these CH is unique in solving the second measurement problem: inferring from the measurement outcome a property of the measured system at a time before the measurement took place, as is done routinely by experimental physicists. The main respect in which CH differs from other quantum interpretations is in allowing multiple stochastic descriptions of a given measurement situation, from which one (or more) can be selected on the basis of its utility. This requires abandoning a principle (termed unicity), central to classical physics, that at any instant of time there is only a single correct description of the world.
胡玉伟; 马萍; 杨明; 王子才
2013-01-01
To improve the quality of consistency testing, a comprehensive consistency test method based on improved grey relational analysis theory is proposed. An improved grey relational grade model is constructed in terms of the shapes of, and distances between, data series. Considering that the work process of an actual system usually consists of multiple stages, and that the work status in each stage has a different impact on the final results, a comprehensive relational grade is defined and the weights of the indicators are determined through the analytic hierarchy process. The dynamic consistency of the whole simulation data set can then be assessed by the weighted comprehensive relational grade. The proposed method makes full use of finite data while accounting for the actual working conditions of the system; moreover, it places no demands on the volume or distribution of the data and is convenient to implement on a computer. The feasibility and effectiveness of the proposed method are verified by a consistency test of the rail-gun discharge current, an essential performance indicator.
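Deng's classical grey relational grade, on which the improved model above builds, can be sketched in a few lines. This shows only the textbook coefficient with the conventional distinguishing coefficient rho = 0.5; the paper's improvements (shape/distance combination, AHP-weighted stages) are not reproduced, and the series values are invented:

```python
import numpy as np

def grey_relational_grade(ref, sim, rho=0.5):
    """Deng's grey relational grade of a comparison series `sim` w.r.t. `ref`.

    rho is the conventional distinguishing coefficient; both series are
    assumed to be pre-normalised to comparable scales.
    """
    delta = np.abs(np.asarray(ref, float) - np.asarray(sim, float))
    dmin, dmax = delta.min(), delta.max()
    if dmax == 0.0:
        return 1.0                                    # identical series
    xi = (dmin + rho * dmax) / (delta + rho * dmax)   # relational coefficients
    return float(xi.mean())                           # equal-weight grade

# Reference (e.g. measured) series vs. a simulated series (invented values)
g = grey_relational_grade([1.0, 2.0, 3.0], [1.0, 2.0, 4.0])
```

A grade near 1 indicates that the simulated series closely tracks the reference series; the paper replaces the equal-weight mean with stage weights from the analytic hierarchy process.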
郝琳; 马长林
2014-01-01
This paper implements a key step of the AHP based on Matlab/GUI: the consistency test and correction of pairwise comparison matrices. The software interface is concise, friendly, and convenient, and facilitates decision problems in various fields that apply the AHP.
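The consistency test this abstract refers to is Saaty's consistency ratio CR = CI / RI, with CI = (lambda_max - n) / (n - 1). A minimal sketch in Python rather than Matlab; the random-index values and the CR < 0.1 acceptance rule are standard, and the example matrix is invented (it happens to be perfectly consistent, since 2 x 2 = 4):

```python
import numpy as np

# Saaty's random index (RI) for matrix orders 1..10
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def consistency_ratio(A):
    """CR = CI / RI for an AHP pairwise comparison matrix; CR < 0.1 passes."""
    A = np.asarray(A, float)
    n = A.shape[0]
    lam_max = np.linalg.eigvals(A).real.max()   # principal eigenvalue
    ci = (lam_max - n) / (n - 1)                # consistency index
    return ci / RI[n]

# Invented 3x3 judgement matrix; perfectly consistent, so lambda_max = n
A = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
cr = consistency_ratio(A)   # ~0 for this matrix
```

Correction, the other half of the paper's tool, typically means nudging the judgement entries until CR drops below the 0.1 threshold.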
When Is Holography Consistent?
McInnes, Brett
2015-01-01
Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognised; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, are satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold, and, second, in the presence of angular momentum. Focusing on the application of holography to the quark-gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur...
李涛; 胡爱群; 高尚
2015-01-01
In order to check the security of network communication protocols, an efficient pattern recognition method is used to test protocol content conformance. Following a black-box testing approach, the protocol server and the testing module are separated on the testing side, and a protocol security testing framework and test process are designed. The BB-BM (block-based Boyer-Moore) algorithm is proposed, which uses byte blocks as the matching unit and matches blocks after computing their digests. The experimental results show that the proposed system can divide the value space of a network protocol according to content type, with conformance testing carried out by the matching algorithm. Block-wise processing during pattern matching reduces the number of pattern strings and target strings, which correspondingly increases the skip distance and reduces the number of comparisons; compared with existing matching methods, testing performance improves by 20% and 80% under the best and worst testing conditions, respectively. In this framework, matching on byte blocks effectively improves detection efficiency, making the approach suitable for content conformance testing of network protocols with fixed field formats.
Maria Aparecida Conti
2012-01-01
Objective: This study aimed to produce a cross-cultural adaptation of the Internet Addiction Test (IAT) into Portuguese. Methods: The work consisted of five steps: (1) translation; (2) back-translation; (3) technical review and evaluation of semantic equivalence by professionals in the field; (4) evaluation of the instrument by a sample of students to assess its comprehensibility; and (5) analysis of internal consistency using Cronbach's alpha coefficient. Results: The instrument was translated and adapted into Portuguese, proved easy to understand, and showed an internal consistency of 0.85. Conclusion: The instrument has been translated and adapted into Portuguese and shows satisfactory internal consistency. Analyses of measurement equivalence and reproducibility are still needed.
Joaquín García-Alandete
2013-06-01
The aim of this study was to examine the factor structure and internal consistency of the Spanish version of the Purpose-In-Life Test, an instrument that measures the attainment of meaning in life based on the assumptions of logotherapy. The study involved 457 Spanish university students (320 women, 137 men) aged 18 to 55 years, M = 21.80, SD = 4.56. The analyses comprised descriptive statistics and item-total correlations for the initial scale, an exploratory factor analysis, estimation of the internal consistency of the factors and of the resulting scale, a confirmatory factor analysis of that scale, a t test comparing means between women and men, and the Kruskal-Wallis test for the effect of age. The results showed a structure of two correlated factors with acceptable internal consistency for the scale and the factors, significant differences between women and men in the total score and in one of the factors, and no significant differences as a function of age. The confirmatory factor analysis shows an adequate fit, supporting the proposed model.
Measuring process and knowledge consistency
Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders
2007-01-01
...for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation... knowledge consistency is measured with a 5-point Likert scale and a corresponding scoring system. Process consistency is measured by using a first-person drawing tool with the respondent in the centre. Respondents sketch the sequence of steps and people they contact when configuring a product. The methodology is tested in one company...
CONSISTENT AGGREGATION IN FOOD DEMAND SYSTEMS
Levedahl, J. William; Reed, Albert J.; Clark, J. Stephen
2002-01-01
Two aggregation schemes for food demand systems are tested for consistency with the Generalized Composite Commodity Theorem (GCCT). One scheme is based on the standard CES classification of food expenditures. The second scheme is based on the Food Guide Pyramid. Evidence is found that both schemes are consistent with the GCCT.
Consistency of trace norm minimization
Bach, Francis
2007-01-01
Regularization by the sum of singular values, also referred to as the trace norm, is a popular technique for estimating low rank rectangular matrices. In this paper, we extend some of the consistency results of the Lasso to provide necessary and sufficient conditions for rank consistency of trace norm minimization with the square loss. We also provide an adaptive version that is rank consistent even when the necessary condition for the non adaptive version is not fulfilled.
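Trace-norm-regularized estimators of the kind analyzed here are usually computed by soft-thresholding singular values, i.e. the proximal operator of the trace norm. The sketch below illustrates that operator only; it is not the paper's estimator or its consistency analysis, and the matrix is invented:

```python
import numpy as np

def trace_norm_prox(M, tau):
    """Proximal operator of tau * ||.||_*: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt   # rebuild with shrunk spectrum

# Rank-1 example (invented): the single singular value 5*sqrt(5) shrinks by tau
M = np.outer([1.0, 2.0], [3.0, 4.0])
X = trace_norm_prox(M, 0.5)
s_after = np.linalg.svd(X, compute_uv=False)
```

Shrinking every singular value toward zero is what drives small singular directions to exactly zero, producing the low-rank solutions whose rank consistency the paper characterizes.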
High SNR Consistent Compressive Sensing
Kallummil, Sreejith; Kalyani, Sheetal
2017-01-01
High signal to noise ratio (SNR) consistency of model selection criteria in linear regression models has attracted a lot of attention recently. However, most of the existing literature on high SNR consistency deals with model order selection. Further, the limited literature available on the high SNR consistency of subset selection procedures (SSPs) is applicable to linear regression with full rank measurement matrices only. Hence, the performance of SSPs used in underdetermined linear models ...
Apperly, Ian A.; Samson, Dana; Chiavarino, Claudia; Bickerton, Wai-Ling; Humphreys, Glyn W.
2007-01-01
To test the domain-specificity of ''theory of mind'' abilities we compared the performance of a case-series of 11 brain-lesioned patients on a recently developed test of false belief reasoning (Apperly, Samson, Chiavarino, & Humphreys, 2004) and on a matched false photograph task, which did not require belief reasoning and which addressed problems…
Coordinating user interfaces for consistency
Nielsen, Jakob
2001-01-01
In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys
Consistency of Random Survival Forests.
Ishwaran, Hemant; Kogalur, Udaya B
2010-07-01
We prove uniform consistency of Random Survival Forests (RSF), a newly introduced forest ensemble learner for analysis of right-censored survival data. Consistency is proven under general splitting rules, bootstrapping, and random selection of variables-that is, under true implementation of the methodology. Under this setting we show that the forest ensemble survival function converges uniformly to the true population survival function. To prove this result we make one key assumption regarding the feature space: we assume that all variables are factors. Doing so ensures that the feature space has finite cardinality and enables us to exploit counting process theory and the uniform consistency of the Kaplan-Meier survival function.
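The proof leans on the uniform consistency of the Kaplan-Meier estimator, which is itself a short product-limit computation. A minimal sketch on invented right-censored data (not from the paper):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate S(t) at each distinct event time.

    times: observed times; events: 1 = event occurred, 0 = right-censored.
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    s, curve = 1.0, []
    for t in np.unique(times[events == 1]):
        at_risk = int(np.sum(times >= t))              # still under observation
        d = int(np.sum((times == t) & (events == 1)))  # events at time t
        s *= 1.0 - d / at_risk                         # product-limit update
        curve.append((float(t), s))
    return curve

# Invented sample: events at t = 1, 2, 3; censored observations at 2 and 4
km = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

An RSF ensemble averages many such survival curves, one per bootstrapped tree, and the paper shows the average converges uniformly to the true survival function.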
Process Fairness and Dynamic Consistency
S.T. Trautmann (Stefan); P.P. Wakker (Peter)
2010-01-01
When process fairness deviates from outcome fairness, dynamic inconsistencies can arise as in nonexpected utility. Resolute choice (Machina) can restore dynamic consistency under nonexpected utility without using Strotz's precommitment. It can similarly justify dynamically...
Gravitation, Causality, and Quantum Consistency
Hertzberg, Mark P
2016-01-01
We examine the role of consistency with causality and quantum mechanics in determining the properties of gravitation. We begin by constructing two different classes of interacting theories of massless spin 2 particles -- gravitons. One involves coupling the graviton with the lowest number of derivatives to matter, the other involves coupling the graviton with higher derivatives to matter, making use of the linearized Riemann tensor. The first class requires an infinite tower of terms for consistency, which is known to lead uniquely to general relativity. The second class only requires a finite number of terms for consistency, which appears as a new class of theories of massless spin 2. We recap the causal consistency of general relativity and show how this fails in the second class for the special case of coupling to photons, exploiting related calculations in the literature. In an upcoming publication [1] this result is generalized to a much broader set of theories. Then, as a causal modification of general ...
Student Effort, Consistency and Online Performance
Hilde Patron
2011-07-01
This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
Time-consistent and market-consistent evaluations
Pelsser, A.; Stadje, M.A.
2014-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati
Evelyn Mendoza Torres
2012-03-01
Context: Genotyping of the single nucleotide polymorphism (SNP) C/T-13910 located upstream of the lactase gene is used to determine adult-type hypolactasia/lactase persistence in North-European Caucasian subjects. The applicability of this polymorphism has been studied by comparing it with the standard diagnostic methods in different populations. Objective: To compare the lactose hydrogen breath test with the genetic test in a sample of the Colombian Caribbean population. Methods: The lactose hydrogen breath test and genotyping of SNP C/T-13910 were applied to 128 healthy individuals (mean age 35 ± 1). A positive lactose hydrogen breath test was indicative of hypolactasia. Genotyping was done using polymerase chain reaction/restriction fragment length polymorphism. The kappa index was used to establish agreement between the two methods. Results: Seventy-six subjects (59%) were lactose maldigesters (hypolactasia) and 52 subjects (41%) were lactose digesters (lactase persistence). The frequencies of the CC, CT and TT genotypes were 80%, 20% and 0%, respectively. Genotyping had 97% sensitivity and 46% specificity. The kappa index of 0.473 indicates moderate agreement between the genotyping of SNP C/T-13910 and the lactose hydrogen breath test. Conclusion: The moderate agreement indicates that genotyping of the SNP C/T-13910 is not applicable for determining adult-type hypolactasia/lactase persistence in the population participating in this study.
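The reported sensitivity (97%), specificity (46%) and kappa (0.473) are mutually consistent with a reconstructed 2x2 agreement table. The cell counts below are inferred from the published percentages for illustration, not taken from the paper:

```python
import numpy as np

def cohen_kappa(table):
    """Cohen's kappa for a square agreement table."""
    t = np.asarray(table, float)
    n = t.sum()
    po = np.trace(t) / n                                # observed agreement
    pe = (t.sum(axis=1) * t.sum(axis=0)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Reconstructed table (rows: breath test + / -, columns: genotype CC / CT).
# 74 of 76 maldigesters typed CC (~97% sensitivity); 24 of 52 digesters
# typed CT (~46% specificity). Counts are inferred, not published.
table = [[74, 2],
         [28, 24]]
kappa = cohen_kappa(table)   # ~0.473, the reported moderate agreement
```

The low specificity is what drags kappa into the "moderate" band despite near-perfect sensitivity: chance agreement is high because most subjects type CC.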
Market-consistent actuarial valuation
Wüthrich, Mario V
2016-01-01
This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
Consistent Histories in Quantum Cosmology
Craig, David A; 10.1007/s10701-010-9422-6
2010-01-01
We illustrate the crucial role played by decoherence (consistency of quantum histories) in extracting consistent quantum probabilities for alternative histories in quantum cosmology. Specifically, within a Wheeler-DeWitt quantization of a flat Friedmann-Robertson-Walker cosmological model sourced with a free massless scalar field, we calculate the probability that the universe is singular in the sense that it assumes zero volume. Classical solutions of this model are a disjoint set of expanding and contracting singular branches. A naive assessment of the behavior of quantum states which are superpositions of expanding and contracting universes may suggest that a "quantum bounce" is possible, i.e. that the wave function of the universe may remain peaked on a non-singular classical solution throughout its history. However, a more careful consistent histories analysis shows that for arbitrary states in the physical Hilbert space the probability of this Wheeler-DeWitt quantum universe encountering the big bang/crun...
The Importance of being consistent
Wasserman, Adam; Jiang, Kaili; Kim, Min-Cheol; Sim, Eunji; Burke, Kieron
2016-01-01
We review the role of self-consistency in density functional theory. We apply a recent analysis to both Kohn-Sham and orbital-free DFT, as well as to Partition-DFT, which generalizes all aspects of standard DFT. In each case, the analysis distinguishes between errors in approximate functionals versus errors in the self-consistent density. This yields insights into the origins of many errors in DFT calculations, especially those often attributed to self-interaction or delocalization error. In many classes of problems, errors can be substantially reduced by using `better' densities. We review the history of these approaches, many of their applications, and give simple pedagogical examples.
Consistent supersymmetric decoupling in cosmology
Sousa Sánchez, Kepa
2012-01-01
The present work discusses several problems related to the stability of ground states with broken supersymmetry in supergravity, and to the existence and stability of cosmic strings in various supersymmetric models. In particular we study the necessary conditions to truncate consistently a sector o
Bendixen, Carsten
2014-01-01
A brief, introductory, perspective-setting and concept-clarifying contribution on the concept of testing in the educational domain.
Consistence of Network Filtering Rules
SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian
2004-01-01
The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow in geometric progression accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the method and demonstrate its potential for intelligent management based on rule tables.
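As a hedged illustration of the set-theoretic flavor of such rule validation (the interval encoding of address ranges and the `shadowed_rules` helper are assumptions made for this sketch, not the paper's formalism):

```python
# Illustrative sketch: model each rule as an address interval plus an action,
# and use interval containment to flag shadowed rules - a later rule fully
# covered by an earlier rule with a conflicting action can never take effect.

def covers(a, b):
    """True if interval a = (lo, hi) contains interval b."""
    return a[0] <= b[0] and b[1] <= a[1]

def shadowed_rules(rules):
    """rules: ordered list of ((lo, hi), action). Return indices of rules
    shadowed by an earlier rule with a conflicting action."""
    bad = []
    for j, (rng_j, act_j) in enumerate(rules):
        for i in range(j):
            rng_i, act_i = rules[i]
            if covers(rng_i, rng_j) and act_i != act_j:
                bad.append(j)
                break
    return bad

rules = [
    ((0, 255), "deny"),    # deny addresses 0-255
    ((10, 20), "allow"),   # shadowed: this range is already denied above
    ((300, 400), "allow"),
]
print(shadowed_rules(rules))  # -> [1]
```

A full checker would also handle partial overlaps, multiple header fields, and rule generalization, but the containment test above is the core set-theoretic primitive.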
Self-consistent triaxial models
Sanders, Jason L
2015-01-01
We present self-consistent triaxial stellar systems that have analytic distribution functions (DFs) expressed in terms of the actions. These provide triaxial density profiles with cores or cusps at the centre. They are the first self-consistent triaxial models with analytic DFs suitable for modelling giant ellipticals and dark haloes. Specifically, we study triaxial models that reproduce the Hernquist profile from Williams & Evans (2015), as well as flattened isochrones of the form proposed by Binney (2014). We explore the kinematics and orbital structure of these models in some detail. The models typically become more radially anisotropic on moving outwards, have velocity ellipsoids aligned in Cartesian coordinates in the centre and aligned in spherical polar coordinates in the outer parts. In projection, the ellipticity of the isophotes and the position angle of the major axis of our models generally change with radius. So, a natural application is to elliptical galaxies that exhibit isophote twisting....
On Modal Refinement and Consistency
Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej
2007-01-01
Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard...... notions of implementation, is shown to be computationally hard (co-NP hard). Second, we consider four forms of consistency (existence of implementations) for modal specifications. We characterize each operationally, giving algorithms for deciding, and for synthesizing implementations, together...
Tri-Sasakian consistent reduction
Cassani, Davide
2011-01-01
We establish a universal consistent Kaluza-Klein truncation of M-theory based on seven-dimensional tri-Sasakian structure. The four-dimensional truncated theory is an N=4 gauged supergravity with three vector multiplets and a non-abelian gauge group, containing the compact factor SO(3). Consistency follows from the fact that our truncation takes exactly the same form as a left-invariant reduction on a specific coset manifold, and we show that the same holds for the various universal consistent truncations recently put forward in the literature. We describe how the global symmetry group SL(2,R) x SO(6,3) is embedded in the symmetry group E7(7) of maximally supersymmetric reductions, and make the connection with the approach of Exceptional Generalized Geometry. Vacuum AdS4 solutions spontaneously break the amount of supersymmetry from N=4 to N=3,1 or 0, and the spectrum contains massive modes. We find a subtruncation to minimal N=3 gauged supergravity as well as an N=1 subtruncation to the SO(3)-invariant secto...
Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L
2013-01-01
The consistency of Moving Particle Semi-implicit (MPS) method in reproducing the gradient, divergence and Laplacian differential operators is discussed in the present paper. Its relation to the Smoothed Particle Hydrodynamics (SPH) method is rigorously established. The application of the MPS method to solve the Navier-Stokes equations using a fractional step approach is treated, unveiling inconsistency problems when solving the Poisson equation for the pressure. A new corrected MPS method incorporating boundary terms is proposed. Applications to one dimensional boundary value Dirichlet and mixed Neumann-Dirichlet problems and to two-dimensional free-surface flows are presented.
Maintaining consistency in distributed systems
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group-oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
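A minimal sketch of the mutual-exclusion style of consistency described above, using Python's `threading.Lock` as the monitor-like construct protecting a shared counter:

```python
# Without the lock, concurrent increments can interleave and lose updates;
# the critical section serializes them so the shared state stays consistent.
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:        # critical section: one thread at a time
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000: no lost updates
```

Transactions, linearizability, and virtual synchrony generalize this same idea from a single in-process variable to persistent and replicated group state.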
Decentralized Consistent Updates in SDN
Nguyen, Thanh Dang
2017-04-10
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
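The forwarding anomalies mentioned above (loops and blackholes) can be checked on a toy per-destination next-hop table; this is an assumption-laden illustration of the invariants involved, not part of ez-Segway itself:

```python
# Given each switch's next hop toward one destination, walk the forwarding
# path from every switch and detect the two anomalies: a revisited switch
# means a forwarding loop; a missing next hop means a blackhole.

def check_anomalies(next_hop, dest):
    """next_hop: dict switch -> next switch (or None). Returns (loops, blackholes)."""
    loops, blackholes = set(), set()
    for start in next_hop:
        seen = []
        node = start
        while node != dest:
            if node in seen:            # revisited a switch: forwarding loop
                loops.add(node)
                break
            seen.append(node)
            nxt = next_hop.get(node)
            if nxt is None:             # traffic dropped here: blackhole
                blackholes.add(node)
                break
            node = nxt
    return loops, blackholes

# s1 -> s2 -> s1 is a loop; s3 has no rule toward destination d.
table = {"s1": "s2", "s2": "s1", "s3": None}
loops, holes = check_anomalies(table, "d")
```

A consistent-update mechanism must order rule installations so that no intermediate table ever fails a check like this one.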
The Consistent Vehicle Routing Problem
Groer, Christopher S [ORNL; Golden, Bruce [University of Maryland; Wasil, Edward [American University
2009-01-01
In the small package shipping industry (as in other industries), companies try to differentiate themselves by providing high levels of customer service. This can be accomplished in several ways, including online tracking of packages, ensuring on-time delivery, and offering residential pickups. Some companies want their drivers to develop relationships with customers on a route and have the same drivers visit the same customers at roughly the same time on each day that the customers need service. These service requirements, together with traditional constraints on vehicle capacity and route length, define a variant of the classical capacitated vehicle routing problem, which we call the consistent VRP (ConVRP). In this paper, we formulate the problem as a mixed-integer program and develop an algorithm to solve the ConVRP that is based on the record-to-record travel algorithm. We compare the performance of our algorithm to the optimal mixed-integer program solutions for a set of small problems and then apply our algorithm to five simulated data sets with 1,000 customers and a real-world data set with more than 3,700 customers. We provide a technique for generating ConVRP benchmark problems from vehicle routing problem instances given in the literature and provide our solutions to these instances. The solutions produced by our algorithm on all problems do a very good job of meeting customer service objectives with routes that have a low total travel time.
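The record-to-record travel acceptance rule the algorithm builds on can be sketched on a toy objective; the 1-D function, parameters, and neighborhood move below are illustrative assumptions, not the ConVRP implementation:

```python
# Record-to-record travel: accept a candidate move unless its cost exceeds
# the best cost found so far (the "record") by more than a fixed deviation.
# This lets the search escape local optima while staying near good solutions.
import random

def record_to_record(f, x0, neighbor, deviation=0.1, iters=2000, seed=7):
    rng = random.Random(seed)
    x = record_x = x0
    record = f(x0)
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = f(y)
        if fy < record * (1 + deviation):  # within the allowed deviation
            x = y
            if fy < record:                # new record solution
                record, record_x = fy, y
    return record_x, record

# Hypothetical 1-D cost standing in for a routing objective.
best_x, best = record_to_record(
    lambda x: (x - 3.0) ** 2 + 1.0, 0.0,
    lambda x, rng: x + rng.gauss(0.0, 0.5))
```

In the ConVRP setting, `neighbor` would perturb customer-to-route assignments and `f` would combine travel time with the consistency penalties described above.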
Consistent thermodynamic properties of lipids systems
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures...... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging experimental databank of lipids systems data in order to improve...
CMB lens sample covariance and consistency relations
Motloch, Pavel; Hu, Wayne; Benoit-Lévy, Aurélien
2017-02-01
Gravitational lensing information from the two and higher point statistics of the cosmic microwave background (CMB) temperature and polarization fields are intrinsically correlated because they are lensed by the same realization of structure between last scattering and observation. Using an analytic model for lens sample covariance, we show that there is one mode, separately measurable in the lensed CMB power spectra and lensing reconstruction, that carries most of this correlation. Once these measurements become lens sample variance dominated, this mode should provide a useful consistency check between the observables that is largely free of sampling and cosmological parameter errors. Violations of consistency could indicate systematic errors in the data and lens reconstruction or new physics at last scattering, any of which could bias cosmological inferences and delensing for gravitational waves. A second mode provides a weaker consistency check for a spatially flat universe. Our analysis isolates the additional information supplied by lensing in a model-independent manner but is also useful for understanding and forecasting CMB cosmological parameter errors in the extended Λ cold dark matter parameter space of dark energy, curvature, and massive neutrinos. We introduce and test a simple but accurate forecasting technique for this purpose that neither double counts lensing information nor neglects lensing in the observables.
Viewpoint Consistency: An Eye Movement Study
Filipe Cristino
2012-05-01
Eye movements have been widely studied, using images and videos in laboratories or portable eye trackers in the real world. Although a good understanding of the saccadic system and extensive models of gaze have been developed over the years, only a few studies have focused on the consistency of eye movements across viewpoints. We have developed a new technique to compute and map the depth of collected eye movements on stimuli rendered from 3D mesh objects using a traditional corneal reflection eye tracker (SR Eyelink 1000). Having eye movements mapped into 3D space (and not onto an image space) allowed us to compare fixations across viewpoints. Fixation sequences (scanpaths) were also studied across viewpoints using the ScanMatch method (Cristino et al 2010, Behavior Research Methods 42, 692–700), extended to work with 3D eye movements. In a set of experiments where participants were asked to perform a recognition task on either a set of objects or faces, we recorded their gaze while performing the task. Participants viewed the stimuli either in 2D or using anaglyph glasses. The stimuli were shown from different viewpoints during the learning and testing phases. A high degree of gaze consistency was found across the different viewpoints, particularly between the learning and testing phases. Scanpaths were also similar across viewpoints, suggesting not only that the gazed spatial locations are alike, but also their temporal order.
Evaluating Temporal Consistency in Marine Biodiversity Hotspots.
Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
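A hedged sketch of the thresholding idea with made-up counts: compute Shannon's diversity H = -Σ p ln p per grid cell, then flag cells above a mean-based threshold as candidate hotspots (the study's actual method uses per-year thresholds and a spatial frequency distribution, which this omits):

```python
# Shannon diversity per cell from species counts, then a simple mean
# threshold to designate hotspot cells. Counts are invented for illustration.
import math

def shannon(counts):
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

cells = {
    "A": [10, 10, 10, 10],   # even community: highest diversity
    "B": [37, 1, 1, 1],      # dominated by one species: lowest
    "C": [20, 15, 5],
}
H = {cell: shannon(c) for cell, c in cells.items()}
threshold = sum(H.values()) / len(H)                  # mean-based threshold
hotspots = {cell for cell, h in H.items() if h > threshold}
```

Repeating this designation per year and measuring how often each cell clears the threshold is the temporal-consistency question the study poses.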
Usability problem reports for comparative studies: consistency and inspectability
Vermeeren, A.P.O.S.; Attema, J.; Akar, E.; De Ridder, H.; Van Doorn, A.J.; Erburg, Ç.; Berkman, A.E.; Maguire, M.
2008-01-01
This study explores issues of consistency and inspectability in usability test data analysis processes and reports. Problem reports resulting from usability tests performed by three professional usability labs in three different countries are compared. Each of the labs conducted a usability test on
Sludge characterization: the role of physical consistency
Spinosa, Ludovico; Wichmann, Knut
2003-07-01
The physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated for fulfilling regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviours (flowability) and that between solid and paste-like ones (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity ones. (author)
Retrocausation, Consistency, and the Bilking Paradox
Dobyns, York H.
2011-11-01
Retrocausation seems to admit of time paradoxes in which events prevent themselves from occurring and thereby create a physical instance of the liar's paradox, an event which occurs iff it does not occur. The specific version in which a retrocausal event is used to trigger an intervention which prevents its own future cause is called the bilking paradox (the event is bilked of its cause). The analysis of Echeverria, Klinkhammer, and Thorne (EKT) suggests time paradoxes cannot arise even in the presence of retrocausation. Any self-contradictory event sequence will be replaced in reality by a closely related but noncontradictory sequence. The EKT analysis implies that attempts to create bilking must instead produce logically consistent sequences wherein the bilked event arises from alternative causes. Bilking a retrocausal information channel of limited reliability usually results only in failures of signaling. An exception applies when the bilking is conducted in response only to some of the signal values that can be carried on the channel. Theoretical analysis based on EKT predicts that, since some of the channel outcomes are not bilked, the channel is capable of transmitting data with its normal reliability, and the paradox-avoidance effects will instead suppress the outcomes that would lead to forbidden (bilked) transmissions. A recent parapsychological experiment by Bem displays a retrocausal information channel of sufficient reliability to test this theoretical model of physical reality's response to retrocausal effects. A modified version with partial bilking would provide a direct test of the generality of the EKT mechanism.
Consistent Design of Dependable Control Systems
Blanke, M.
1996-01-01
Design of fault handling in control systems is discussed, and a method for consistent design is presented.
Students’ conceptual understanding consistency of heat and temperature
Slamet Budiarti, Indah; Suparmi; Sarwanto; Harjana
2017-01-01
The aims of the research were to explore and describe the consistency of students' understanding of the concepts of heat and temperature. The sample, taken using a purposive random sampling technique, consisted of 99 high school students from 3 senior high schools in Jayapura city. The descriptive qualitative method was employed in this study. The data were collected using tests and interviews regarding the subject matters of heat and temperature. Based on the results of data analysis, it was concluded that 3.03% of the students were consistent with correct answers, 79.80% were consistent but with wrong answers, and 17.17% were inconsistent.
Behavioural consistency and life history of Rana dalmatina tadpoles.
Urszán, Tamás János; Török, János; Hettyey, Attila; Garamszegi, László Zsolt; Herczeg, Gábor
2015-05-01
The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, we tested if behavioural consistency and POLS could be detected during the early ontogenesis of this amphibian. We targeted two ontogenetic stages and measured activity, exploration and risk-taking in a common garden experiment, assessing both individual behavioural type and intra-individual behavioural variation. We observed that activity was consistent in all tadpoles, exploration only became consistent with advancing age and risk-taking only became consistent in tadpoles that had been tested, and thus disturbed, earlier. Only previously tested tadpoles showed trends indicative of behavioural syndromes. We found an activity-age at metamorphosis POLS in the previously untested tadpoles irrespective of age. Relative growth rate correlated positively with the intra-individual variation of activity of the previously untested older tadpoles. In previously tested older tadpoles, intra-individual variation of exploration correlated negatively and intra-individual variation of risk-taking correlated positively with relative growth rate. We provide evidence for behavioural consistency and POLS in predator- and conspecific-naive tadpoles. Intra-individual behavioural variation was also correlated to life history, suggesting its relevance for the POLS theory. The strong effect of moderate disturbance related to standard behavioural testing on later behaviour draws attention to the pitfalls embedded in repeated testing.
MDCC: Multi-Data Center Consistency
Kraska, Tim; Franklin, Michael J; Madden, Samuel
2012-01-01
Replicating data across multiple data centers not only allows moving the data closer to the user and, thus, reduces latency for applications, but also increases the availability in the event of a data center failure. Therefore, it is not surprising that companies like Google, Yahoo, and Netflix already replicate user data across geographically different regions. However, replication across data centers is expensive. Inter-data center network delays are in the hundreds of milliseconds and vary significantly. Synchronous wide-area replication is therefore considered to be unfeasible with strong consistency and current solutions either settle for asynchronous replication which implies the risk of losing data in the event of failures, restrict consistency to small partitions, or give up consistency entirely. With MDCC (Multi-Data Center Consistency), we describe the first optimistic commit protocol, that does not require a master or partitioning, and is strongly consistent at a cost similar to eventually consiste...
A dual-consistency cache coherence protocol
Ros, Alberto; Jimborean, Alexandra
2015-01-01
Weak memory consistency models can maximize system performance by enabling hardware and compiler optimizations, but increase programming complexity since they do not match programmers’ intuition. The design of an efficient system with an intuitive memory model is an open challenge. This paper proposes SPEL, a dual-consistency cache coherence protocol which simultaneously guarantees the strongest memory consistency model provided by the hardware and yields improvements in both performance and ...
A new approach to hull consistency
Kolev Lubomir
2016-06-01
Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating simultaneously several equations with respect to the same number of variables.
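A minimal sketch of the scalar, one-equation-at-a-time form of hull consistency on x + y = c (real implementations handle full nonlinear systems; this toy only shows the interval contraction step):

```python
# Hull consistency on one equation: each variable's interval is narrowed by
# intersecting it with the equation solved for that variable (x = c - y,
# y = c - x). Intervals are (lo, hi) pairs; intersections assumed non-empty.

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        return None  # empty: the system has no solution in these boxes
    return (lo, hi)

def hull_contract_sum(x, y, c):
    """Contract intervals x and y subject to x + y = c."""
    x = intersect(x, (c - y[1], c - y[0]))   # x = c - y
    y = intersect(y, (c - x[1], c - x[0]))   # y = c - x, using narrowed x
    return x, y

x, y = hull_contract_sum((0.0, 10.0), (0.0, 10.0), 5.0)
print(x, y)  # both narrowed to (0.0, 5.0)
```

The "more general approach" of the poster would contract several such equations simultaneously rather than one variable at a time.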
Personality Consistency in Dogs: A Meta-Analysis
Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787
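The consistency estimates pooled in this meta-analysis are test-retest correlations; a plain Pearson r over hypothetical repeated scores (the numbers below are invented) illustrates the quantity being aggregated:

```python
# Pearson correlation between two behavioral assessments of the same
# individuals; higher r means more temporally consistent personality.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

test1 = [3.0, 5.0, 2.0, 4.0, 1.0]   # e.g. boldness scores at first test
test2 = [2.5, 4.5, 2.0, 5.0, 1.5]   # same dogs, measured later
r = pearson_r(test1, test2)
```

The meta-analytic overall estimate (r = 0.43) is a weighted pooling of many such coefficients across studies and assessment intervals.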
Consistent estimators in random censorship semiparametric models
王启华
1996-01-01
For the fixed design regression model, when the responses Y_i are randomly censored on the right, the estimators of the unknown parameter and regression function g from censored observations are defined in the two cases where the censoring distribution is known and unknown, respectively. Moreover, the sufficient conditions under which these estimators are strongly consistent and pth (p>2) mean consistent are also established.
Student Effort, Consistency, and Online Performance
Patron, Hilde; Lopez, Salvador
2011-01-01
This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…
Consistent truncations with massive modes and holography
Cassani, Davide; Faedo, Anton F
2011-01-01
We review the basic features of some recently found consistent Kaluza-Klein truncations including massive modes. We emphasize the general ideas underlying the reduction procedure, then we focus on type IIB supergravity on 5-dimensional manifolds admitting a Sasaki-Einstein structure, which leads to half-maximal gauged supergravity in five dimensions. Finally, we comment on the holographic picture of consistency.
A Framework of Memory Consistency Models
胡伟武; 施巍松; et al.
1998-01-01
Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models which describes the memory consistency model on the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a certain consistency model is also defined. The synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and correct implementations of consistency models. Regarding an implementation of a consistency model as certain memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.
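A toy version of the behavior-level view: treat an execution as a total order of operations and check it against sequential consistency. The tuple encoding and zero-initialized memory are assumptions of this sketch, not the paper's formal framework:

```python
# An execution is a sequence of (processor, program-order index, kind, addr,
# value) tuples. Sequential consistency requires (i) each processor's program
# order to be respected and (ii) every read to return the latest write.

def sequentially_consistent(execution):
    last_seq = {}    # per-processor program-order counter
    memory = {}      # last written value per address (initially 0)
    for proc, seq, kind, addr, value in execution:
        if seq <= last_seq.get(proc, -1):
            return False                    # program order violated
        last_seq[proc] = seq
        if kind == "w":
            memory[addr] = value
        elif memory.get(addr, 0) != value:  # read returned a stale value
            return False
    return True

ok = sequentially_consistent([
    ("p1", 0, "w", "x", 1),
    ("p2", 0, "r", "x", 1),
    ("p1", 1, "w", "x", 2),
    ("p2", 1, "r", "x", 2),
])
bad = sequentially_consistent([
    ("p1", 0, "w", "x", 1),
    ("p2", 0, "r", "x", 2),   # reads a value that was never written
])
```

Weaker models replace the single total order with per-synchronization orderings of only the conflicting accesses, which is the generalization the framework formalizes.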
Sticky continuous processes have consistent price systems
Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan
Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under arb...
Putting Consistent Theories Together in Institutions
应明生
1995-01-01
The problem of putting consistent theories together in institutions is discussed. A general necessary condition for the consistency of the resulting theory is derived, and some sufficient conditions are given for diagrams of theories whose shapes are tree bundles or directed graphs. Moreover, some transformations from complicated cases to simple ones are established.
The consistency approach for the quality control of vaccines.
Hendriksen, Coenraad; Arciniega, Juan L; Bruckner, Lukas; Chevalier, Michel; Coppens, Emmanuelle; Descamps, Johan; Duchêne, Michel; Dusek, David Michael; Halder, Marlies; Kreeftenberg, Hans; Maes, Alexandrine; Redhead, Keith; Ravetkar, Satish D; Spieser, Jean-Marc; Swam, Hanny
2008-01-01
Current lot release testing of conventional vaccines emphasizes quality control of the final product and is characterized by its extensive use of laboratory animals. This report, which is based on the outcome of an ECVAM (European Centre for Validation of Alternative Methods, Institute for Health and Consumer Protection, European Commission Joint Research Centre, Ispra, Italy) workshop, discusses the concept of consistency testing as an alternative approach for lot release testing. The consistency approach for the routine release of vaccines is based upon the principle that the quality of vaccines is a consequence of a quality system and of consistent production of lots with similar characteristics to those lots that have been shown to be safe and effective in humans or the target species. The report indicates why and under which circumstances this approach can be applied, the role of the different stakeholders, and the need for international harmonization. It also gives recommendations for its implementation.
Consistency analysis of accelerated degradation mechanism based on gray theory
Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang
2014-01-01
A fundamental premise of accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during accelerated testing. A new consistency analysis method based on the gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process to determine the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure for applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement testing is conducted to demonstrate and validate the proposed method.
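The gray prediction models mentioned build on the standard GM(1,1) formulation, which can be sketched as follows (this is the textbook gray-theory model with conventional parameter names; the NIED extension and the example series are not from the paper):

```python
# GM(1,1): accumulate the series, fit a and b in x0(k) + a*z1(k) = b by
# least squares over the background values z1, then predict via the
# exponential solution of the whitening equation and difference back.
import math

def gm11(x0, steps=1):
    """Fit GM(1,1) to sequence x0 and predict `steps` further values."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]               # accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    sz, szz = sum(z1), sum(z * z for z in z1)
    sx, szx = sum(x0[1:]), sum(z * x for z, x in zip(z1, x0[1:]))
    det = (n - 1) * szz - sz * sz
    a = (sz * sx - (n - 1) * szx) / det
    b = (szz * sx - sz * szx) / det
    def x1_hat(k):  # 0-based index into the accumulated series
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]

# On a near-exponential series the model recovers the trend closely.
series = [2.0 * 1.1 ** k for k in range(6)]
pred = gm11(series, steps=1)[0]
```

Consistency analysis then compares the degradation trends predicted at different stress levels; mechanisms are judged consistent when the fitted trends agree within a decision rule.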
Consistency of Social Sensing Signatures Across Major US Cities
Soliman, Aiman; Padmanabhan, Anand; Wang, Shaowen
2016-01-01
Previous studies have shown that Twitter users are biased to tweet from certain locations (locational bias) and during certain hours (temporal bias). We used three years of geolocated Twitter data to quantify these biases and test our central hypothesis that Twitter users' biases are consistent across US cities. Our results suggest that the temporal and locational biases of Twitter users are inconsistent between three US metropolitan cities. We derive conclusions about the role of the complexity of the underlying data-producing process in its consistency and argue for a potential research avenue for Geospatial Data Science to test and quantify these inconsistencies in the class of organically evolved Big Data.
On the Initial State and Consistency Relations
Berezhiani, Lasha
2014-01-01
We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.
On the initial state and consistency relations
Berezhiani, Lasha; Khoury, Justin, E-mail: lashaber@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States)
2014-09-01
We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q-vector → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.
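For reference, the single-field consistency relation whose violations are being characterized is, in a common convention (normalizations vary in the literature):

```latex
\lim_{q \to 0} \, \langle \zeta_{\mathbf q}\, \zeta_{\mathbf k_1}\, \zeta_{\mathbf k_2} \rangle'
   \;=\; -\,(n_s - 1)\, P_\zeta(q)\, P_\zeta(k_1),
```

where the prime denotes dropping the overall momentum-conserving delta function and \(P_\zeta\) is the power spectrum. The standard derivation assumes a Bunch-Davies (vacuum) initial state, which is precisely the assumption the modified initial states above relax.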
Self-Consistent Asset Pricing Models
Malevergne, Y
2006-01-01
We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alpha's and beta's of the factor model are unobservable. Self-consistency leads to renormalized beta's with zero effective alpha's, which are observable with standard OLS regressions. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value $\alpha_i$ at the origin between an asset $i$'s return and the proxy's return. Self-consistency also introduces "orthogonality" and "normality" conditions linking the beta's, alpha's (as well as the residuals) and the weights of the proxy por...
Quasiparticle self-consistent GW theory.
van Schilfgaarde, M; Kotani, Takao; Faleev, S
2006-06-09
In past decades the scientific community has been looking for a reliable first-principles method to predict the electronic structure of solids with high accuracy. Here we present an approach which we call the quasiparticle self-consistent approximation. It is based on a kind of self-consistent perturbation theory, where the self-consistency is constructed to minimize the perturbation. We apply it to selections from different classes of materials, including alkali metals, semiconductors, wide band gap insulators, transition metals, transition metal oxides, magnetic insulators, and rare earth compounds. Apart from some mild exceptions, the properties are very well described, particularly in weakly correlated cases. Self-consistency dramatically improves agreement with experiment, and is sometimes essential. Discrepancies with experiment are systematic, and can be explained in terms of approximations made.
Consistency in the World Wide Web
Thomsen, Jakob Grauenkjær
Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable...... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how...... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.
Consistency Relations for Large Field Inflation
Chiba, Takeshi
2014-01-01
Consistency relations for chaotic inflation with a monomial potential and natural inflation and hilltop inflation are given which involve the scalar spectral index $n_s$, the tensor-to-scalar ratio $r$ and the running of the spectral index $\\alpha$. The measurement of $\\alpha$ with $O(10^{-3})$ and the improvement in the measurement of $n_s$ could discriminate monomial model from natural/hilltop inflation models. A consistency region for general large field models is also presented.
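For concreteness, the standard slow-roll expressions behind such consistency relations, for a monomial potential \(V(\phi)\propto\phi^n\) with \(N\) e-folds before the end of inflation (quoted in a common convention; signs and factors vary in the literature):

```latex
n_s - 1 = -\frac{n+2}{2N}, \qquad
r = \frac{4n}{N}, \qquad
\alpha \equiv \frac{dn_s}{d\ln k} = -\frac{n+2}{2N^2}.
```

Eliminating \(N\) yields relations among observables alone, e.g. \(\alpha = -2(n_s-1)^2/(n+2)\) and \(r = 8n(1-n_s)/(n+2)\); for \(n=2\) and \(N=60\) these give \(n_s \approx 0.967\) and \(r \approx 0.13\).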
Entropy-based consistent model driven architecture
Niepostyn, Stanisław Jerzy
2016-09-01
A description of software architecture is a plan of the IT system construction, therefore any architecture gaps affect the overall success of an entire project. The definitions mostly describe software architecture as a set of views which are mutually unrelated, hence potentially inconsistent. Software architecture completeness is also often described in an ambiguous way. As a result most methods of IT systems building comprise many gaps and ambiguities, thus presenting obstacles for software building automation. In this article the consistency and completeness of software architecture are mathematically defined based on calculation of entropy of the architecture description. Following this approach, in this paper we also propose our method of automatic verification of consistency and completeness of the software architecture development method presented in our previous article as Consistent Model Driven Architecture (CMDA). The proposed FBS (Functionality-Behaviour-Structure) entropy-based metric applied in our CMDA approach enables IT architects to decide whether the modelling process is complete and consistent. With this metric, software architects could assess the readiness of undergoing modelling work for the start of IT system building. It even allows them to assess objectively whether the designed software architecture of the IT system could be implemented at all. The overall benefit of such an approach is that it facilitates the preparation of complete and consistent software architecture more effectively, and it enables assessing and monitoring the status of ongoing modelling work. We demonstrate this with a few industry examples of IT system designs.
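The entropy calculation underlying such a metric can be illustrated with plain Shannon entropy over the states of architecture-description elements. This is a toy reading only; the paper's exact FBS formulation is not reproduced here, and the state labels are invented:

```python
import math
from collections import Counter

def shannon_entropy(states):
    """Shannon entropy (in bits) of the empirical distribution of states."""
    counts = Counter(states)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy reading: each architecture element is tagged with its mapping state
# across the Function/Behaviour/Structure views. A fully consistent,
# complete description is maximally "ordered", i.e. has low entropy.
element_states = ["mapped", "mapped", "mapped", "unmapped", "conflicting"]
h = shannon_entropy(element_states)  # > 0 bits: modelling work remains
```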
A Nonparametric Approach to Estimate Classification Accuracy and Consistency
Lathrop, Quinn N.; Cheng, Ying
2014-01-01
When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…
Consistency and Derangements in Brane Tilings
Hanany, Amihay; Ramgoolam, Sanjaye; Seong, Rak-Kyeong
2015-01-01
Brane tilings describe Lagrangians (vector multiplets, chiral multiplets, and the superpotential) of four dimensional $\\mathcal{N}=1$ supersymmetric gauge theories. These theories, written in terms of a bipartite graph on a torus, correspond to worldvolume theories on $N$ D$3$-branes probing a toric Calabi-Yau threefold singularity. A pair of permutations compactly encapsulates the data necessary to specify a brane tiling. We show that geometric consistency for brane tilings, which ensures that the corresponding quantum field theories are well behaved, imposes constraints on the pair of permutations, restricting certain products constructed from the pair to have no one-cycles. Permutations without one-cycles are known as derangements. We illustrate this formulation of consistency with known brane tilings. Counting formulas for consistent brane tilings with an arbitrary number of chiral bifundamental fields are written down in terms of delta functions over symmetric groups.
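The combinatorial core of the consistency condition, permutations whose relevant products have no one-cycles (i.e. no fixed points), is the classical notion of a derangement. A brute-force sketch of checking and counting derangements (illustrative only, unrelated to any specific tiling):

```python
from itertools import permutations

def is_derangement(perm):
    """True if the permutation (perm[i] = image of i) has no one-cycles."""
    return all(image != i for i, image in enumerate(perm))

def count_derangements(n):
    """Brute-force count of derangements of n elements (OEIS A000166)."""
    return sum(1 for p in permutations(range(n)) if is_derangement(p))
```

The counting formulas mentioned in the abstract replace this brute force with delta functions over symmetric groups, but the no-fixed-point test itself is exactly `is_derangement`.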
Quantifying the consistency of scientific databases
Šubelj, Lovro; Boshkoska, Biljana Mileva; Kastrin, Andrej; Levnajić, Zoran
2015-01-01
Science is a social process with far-reaching impact on our modern society. In recent years, for the first time, we are able to scientifically study science itself. This is enabled by massive amounts of data on scientific publications that are increasingly becoming available. The data is contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in mutual consistency of different databases, which we interpret as recipes for future bibliometric studies.
Self-consistent Green's function approaches
Barbieri, Carlo
2016-01-01
We present the fundamental techniques and working equations of many-body Green's function theory for calculating ground state properties and the spectral strength. Green's function methods closely relate to other polynomial scaling approaches discussed in chapters 8 and 10. However, here we aim directly at a global view of the many-fermion structure. We derive the working equations for calculating many-body propagators, using both the Algebraic Diagrammatic Construction technique and the self-consistent formalism at finite temperature. Their implementation is discussed, as well as the inclusion of three-nucleon interactions. The self-consistency feature is essential to guarantee thermodynamic consistency. The pairing and neutron matter models introduced in previous chapters are solved and compared with the other methods in this book.
Personalized recommendation based on unbiased consistence
Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao
2015-08-01
Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
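For context, the unidirectional mass-diffusion (ProbS) baseline that the letter argues against can be sketched as follows; the bidirectional, consistence-based variant would additionally diffuse resource from candidate items back toward the collected ones. Toy data only, not the authors' code:

```python
def mass_diffusion_scores(user, collected):
    """One round of unidirectional mass diffusion (ProbS) on a bipartite
    user-item network; returns item scores for `user`.

    `collected` maps each user to the set of items they have collected.
    """
    items = set().union(*collected.values())
    deg_item = {i: sum(1 for u in collected if i in collected[u]) for i in items}
    deg_user = {u: len(s) for u, s in collected.items()}
    # step 1: each item collected by `user` holds 1 unit of resource and
    # splits it equally among all users who collected it
    user_res = {
        u2: sum(1.0 / deg_item[i] for i in collected[user] if i in collected[u2])
        for u2 in collected
    }
    # step 2: each user splits the received resource equally among their items
    scores = {i: 0.0 for i in items}
    for u2, res in user_res.items():
        for i in collected[u2]:
            scores[i] += res / deg_user[u2]
    return scores
```

Uncollected items with the highest scores are recommended; note that total resource is conserved, which makes the diffusion interpretation of "mass" literal.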
A Revisit to Probability - Possibility Consistency Principles
Mamoni Dhar
2013-03-01
Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility which were established by different authors at different points in time on the basis of some well-known consistency principles cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and finally aims to suggest a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.
Consistent matter couplings for Plebanski gravity
Tennie, Felix
2010-01-01
We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein--Cartan gravity. As a byproduct we also show the consistency of some previous suggestions for matter actions.
Consistent matter couplings for Plebanski gravity
Tennie, Felix; Wohlfarth, Mattias N. R.
2010-11-01
We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein-Cartan gravity. As a by-product we also show the consistency of some previous suggestions for matter actions.
Personality and Situation Predictors of Consistent Eating Patterns.
Uku Vainik
Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Personality and Situation Predictors of Consistent Eating Patterns.
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K
2015-01-01
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Consistency in multi-viewpoint architectural design
Dijkman, R.M.; Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.
Consistency and stability of recombinant fermentations.
Wiebe, M E; Builder, S E
1994-01-01
Production of proteins of consistent quality in heterologous, genetically-engineered expression systems is dependent upon identifying the manufacturing process parameters which have an impact on product structure, function, or purity, validating acceptable ranges for these variables, and performing the manufacturing process as specified. One of the factors which may affect product consistency is genetic instability of the primary product sequence, as well as instability of genes which code for proteins responsible for post-translational modification of the product. Approaches have been developed for mammalian expression systems to assure that product quality is not changing through mechanisms of genetic instability. Sensitive protein analytical methods, particularly peptide mapping, are used to evaluate product structure directly, and are more sensitive in detecting genetic instability than is direct genetic analysis by nucleotide sequencing of the recombinant gene or mRNA. These methods are being employed to demonstrate that the manufacturing process consistently yields a product of defined structure from cells cultured through the range of cell ages used in the manufacturing process and well beyond the maximum cell age defined for the process. The combination of well designed validation studies which demonstrate consistent product quality as a function of cell age, and rigorous quality control of every product lot by sensitive protein analytical methods provide the necessary assurance that product structure is not being altered through mechanisms of mutation and selection.
Developing consistent time series landsat data products
The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There are increasing demands for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...
Developing consistent pronunciation models for phonemic variants
Davel, M
2006-09-01
Full Text Available from a lexicon containing variants. In this paper we address both these issues by creating ‘pseudo-phonemes’ associated with sets of ‘generation restriction rules’ to model those pronunciations that are consistently realised as two or more...
On Consistency Maintenance In Service Discovery
Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan
2005-01-01
Communication and node failures degrade the ability of a service discovery protocol to ensure Users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency. We
Consistent feeding positions of great tit parents
Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, P.
2006-01-01
When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is
Computing Rooted and Unrooted Maximum Consistent Supertrees
van Iersel, Leo
2009-01-01
A chief problem in phylogenetics and database theory is the computation of a maximum consistent tree from a set of rooted or unrooted trees. A standard input are triplets, rooted binary trees on three leaves, or quartets, unrooted binary trees on four leaves. We give exact algorithms constructing rooted and unrooted maximum consistent supertrees in time O(2^n n^5 m^2 log(m)) for a set of m triplets (quartets), each one distinctly leaf-labeled by some subset of n labels. The algorithms extend to weighted triplets (quartets). We further present fast exact algorithms for constructing rooted and unrooted maximum consistent trees in polynomial space. Finally, for a set T of m rooted or unrooted trees with maximum degree D and distinctly leaf-labeled by some subset of a set L of n labels, we compute, in O(2^{mD} n^m m^5 n^6 log(m)) time, a tree distinctly leaf-labeled by a maximum-size subset X of L that all trees in T, when restricted to X, are consistent with.
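A classical building block behind such supertree algorithms is deciding whether a set of rooted triplets is consistent at all, which Aho et al.'s BUILD algorithm does in polynomial time. A minimal sketch, assuming each triplet ab|c is given as a tuple `(a, b, c)`; this is the feasibility check, not the paper's exponential-time maximization algorithm:

```python
def build(labels, triplets):
    """Aho et al.'s BUILD: return a rooted tree (nested tuples) consistent
    with all triplets ab|c over `labels`, or None if the set is inconsistent.
    """
    labels = set(labels)
    if len(labels) == 1:
        return next(iter(labels))
    # union-find over the constraint graph: edge a-b for each triplet ab|c
    parent = {x: x for x in labels}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b, c in triplets:
        if {a, b, c} <= labels:
            parent[find(a)] = find(b)
    comps = {}
    for x in labels:
        comps.setdefault(find(x), set()).add(x)
    if len(comps) == 1:
        return None  # graph did not split: triplets are inconsistent
    children = []
    for comp in comps.values():
        # recurse with the triplets entirely inside this component
        sub = build(comp, [t for t in triplets if set(t[:2]) <= comp and t[2] in comp])
        if sub is None:
            return None
        children.append(sub)
    return tuple(children)
```

A maximum consistent supertree search can then, in principle, try leaf subsets in decreasing size and keep the largest one for which `build` succeeds, though the paper's algorithms are far more refined.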
Addendum to "On the consistency of MPS"
Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L
2013-01-01
The analogies between the Moving Particle Semi-implicit method (MPS) and Incompressible Smoothed Particle Hydrodynamics method (ISPH) are established in this note, as an extension of the MPS consistency analysis conducted in "Souto-Iglesias et al., Computer Physics Communications, 184(3), 2013."
Proteolysis and consistency of Meshanger cheese
Jong, de L.
1978-01-01
Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of α_s1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of
On the existence of consistent price systems
Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan
2014-01-01
We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...
A self-consistent Maltsev pulse model
Buneman, O.
1985-04-01
A self-consistent model for an electron pulse propagating through a plasma is presented. In this model, the charge imbalance between plasma ions, plasma electrons and pulse electrons creates the travelling potential well in which the pulse electrons are trapped.
Consistent implementation of decisions in the brain.
James A R Marshall
Full Text Available Despite the complexity and variability of decision processes, motor responses are generally stereotypical and independent of decision difficulty. How is this consistency achieved? Through an engineering analogy we consider how and why a system should be designed to realise not only flexible decision-making, but also consistent decision implementation. We specifically consider neurobiologically-plausible accumulator models of decision-making, in which decisions are made when a decision threshold is reached. To trade-off between the speed and accuracy of the decision in these models, one can either adjust the thresholds themselves or, equivalently, fix the thresholds and adjust baseline activation. Here we review how this equivalence can be implemented in such models. We then argue that manipulating baseline activation is preferable as it realises consistent decision implementation by ensuring consistency of motor inputs, summarise empirical evidence in support of this hypothesis, and suggest that it could be a general principle of decision making and implementation. Our goal is therefore to review how neurobiologically-plausible models of decision-making can manipulate speed-accuracy trade-offs using different mechanisms, to consider which of these mechanisms has more desirable decision-implementation properties, and then review the relevant neuroscientific data on which mechanism brains actually use.
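The threshold/baseline equivalence described above can be seen in a toy one-accumulator random walk: only the distance between baseline and threshold matters, so raising the baseline by some amount is equivalent to lowering the threshold by the same amount. A sketch with invented parameters, not a model taken from the paper:

```python
import random

def race_to_threshold(drift, threshold, baseline=0.0, noise=1.0,
                      dt=0.01, seed=0):
    """Single noisy accumulator; returns the decision time.

    Only (threshold - baseline) matters: raising the baseline by delta
    is mathematically equivalent to lowering the threshold by delta,
    which is the speed-accuracy trade-off equivalence discussed above.
    """
    rng = random.Random(seed)
    x, t = baseline, 0.0
    while x < threshold:
        # Euler step of drift-diffusion: deterministic drift + scaled noise
        x += drift * dt + noise * rng.gauss(0, 1) * dt ** 0.5
        t += dt
    return t
```

The argument for manipulating baseline rather than threshold is that the threshold then stays fixed, so the motor system always receives the same input level at decision time.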
Consistency in multi-viewpoint architectural design
Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder's design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint. Thi
Properties and Update Semantics of Consistent Views
1985-09-01
PROPERTIES AND UPDATE SEMANTICS OF CONSISTENT VIEWS. G. Gottlob, Institute for Applied Mathematics, C.N.R., Genova, Italy. Computer Scien... Gottlob G., Paolini P., Zicari R., "Proving Properties of Programs on Database Views", Dipartimento di Elettronica, Politecnico di Milano (in
Consistency Analysis of Network Traffic Repositories
Lastdrager, Elmer; Pras, Aiko
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for var
Consistency of Network Traffic Repositories: An Overview
Lastdrager, E.; Pras, A.
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for vario
Consistency and variability in functional localisers
Duncan, Keith J.; Pattamadilok, Chotiga; Knierim, Iris; Devlin, Joseph T.
2009-01-01
A critical assumption underlying the use of functional localiser scans is that the voxels identified as the functional region-of-interest (fROI) are essentially the same as those activated by the main experimental manipulation. Intra-subject variability in the location of the fROI violates this assumption, reducing the sensitivity of the analysis and biasing the results. Here we investigated consistency and variability in fROIs in a set of 45 volunteers. They performed two functional localiser scans to identify word- and object-sensitive regions of ventral and lateral occipito-temporal cortex, respectively. In the main analyses, fROIs were defined as the category-selective voxels in each region and consistency was measured as the spatial overlap between scans. Consistency was greatest when minimally selective thresholds were used to define "active" voxels (p < 0.05 uncorrected), revealing that approximately 65% of the voxels were commonly activated by both scans. In contrast, highly selective thresholds (p < 10^−4 to 10^−6) yielded the lowest consistency values with less than 25% overlap of the voxels active in both scans. In other words, intra-subject variability was surprisingly high, with between one third and three quarters of the voxels in a given fROI not corresponding to those activated in the main task. This level of variability stands in striking contrast to the consistency seen in retinotopically-defined areas and has important implications for designing robust but efficient functional localiser scans. PMID:19289173
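The spatial-overlap measure can be illustrated with a Dice-style coefficient between two boolean activation maps; this is a common choice, though the paper's exact overlap definition may differ:

```python
def overlap_consistency(mask1, mask2):
    """Dice overlap between two boolean activation maps (1 = active voxel).

    Returns 1.0 when the two scans activate exactly the same voxels and
    0.0 when the activations are disjoint.
    """
    both = sum(1 for a, b in zip(mask1, mask2) if a and b)
    n1, n2 = sum(mask1), sum(mask2)
    return 2 * both / (n1 + n2) if n1 + n2 else 0.0
```

Raising the statistical threshold shrinks both masks, which, as the abstract reports, tends to lower the overlap even though the surviving voxels are individually more selective.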
Cross-situational consistency in recognition memory response bias.
Kantner, Justin; Lindsay, D Stephen
2014-10-01
Individuals taking an old-new recognition memory test differ widely in their bias to respond "old," ranging from strongly conservative to strongly liberal, even without any manipulation intended to affect bias. Kantner and Lindsay (2012) found stability of bias across study-test cycles, suggesting that bias is a cognitive trait. That consistency, however, could have arisen because participants perceived the two tests as being part of the same experiment in the same context. In the present study, we tested for stability across two recognition study-test procedures embedded in markedly different experiments, held weeks apart, that participants did not know were connected. Bias showed substantial cross-situational stability. Moreover, bias weakly predicted identifications on an eyewitness memory task and accuracy on a go-no-go task. Although we found little in the way of relationships between bias and five personality measures, these findings suggest that response bias is a stable and broadly influential characteristic of recognizers.
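Response bias in old-new recognition is conventionally summarized by the signal-detection criterion c, computed from hit and false-alarm rates. A standard-library sketch (the study's exact estimator is not specified here, and the counts in the test are invented):

```python
from statistics import NormalDist

def criterion(hits, misses, false_alarms, correct_rejections):
    """Signal-detection response bias c = -(z(H) + z(F)) / 2.

    Positive c = conservative (reluctant to respond "old");
    negative c = liberal; c near 0 = unbiased.
    """
    z = NormalDist().inv_cdf
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return -(z(hit_rate) + z(fa_rate)) / 2
```

Cross-situational stability of bias then amounts to the correlation of each participant's c across the two embedded study-test procedures.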
Self-consistency in Capital Markets
Benbrahim, Hamid
2013-03-01
Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.
Consistence beats causality in recommender systems
Zhu, Xuzhen; Hu, Zheng; Zhang, Ping; Zhou, Tao
2015-01-01
The explosive growth of information challenges people's ability to find items that fit their own interests. Recommender systems provide an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. Recommendation algorithms usually embody the causality from what has been collected to what should be recommended. In this article, we argue that in many cases, a user's interests are stable, and thus the previous and future preferences are highly consistent. The temporal order of collections then does not necessarily imply a causality relationship. We further propose a consistence-based algorithm that outperforms the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
A supersymmetric consistent truncation for conifold solutions
Cassani, Davide
2010-01-01
We establish a supersymmetric consistent truncation of type IIB supergravity on the T^{1,1} coset space, based on extending the Papadopoulos-Tseytlin ansatz to the full set of SU(2)xSU(2) invariant Kaluza-Klein modes. The five-dimensional model is a gauged N=4 supergravity with three vector multiplets, which incorporates various conifold solutions and is suitable for the study of their dynamics. By analysing the scalar potential we find a family of new non-supersymmetric AdS_5 extrema interpolating between a solution obtained long ago by Romans and a solution employing an Einstein metric on T^{1,1} different from the standard one. Finally, we discuss some simple consistent subtruncations preserving N=2 supersymmetry. One of them is compatible with the inclusion of smeared D7-branes.
Temporally consistent segmentation of point clouds
Owens, Jason L.; Osteen, Philip R.; Daniilidis, Kostas
2014-06-01
We consider the problem of generating temporally consistent point cloud segmentations from streaming RGB-D data, where every incoming frame extends existing labels to new points or contributes new labels while maintaining the labels for pre-existing segments. Our approach generates an over-segmentation based on voxel cloud connectivity, where a modified k-means algorithm selects supervoxel seeds and associates similar neighboring voxels to form segments. Given the data stream from a potentially mobile sensor, we solve for the camera transformation between consecutive frames using a joint optimization over point correspondences and image appearance. The aligned point cloud may then be integrated into a consistent model coordinate frame. Previously labeled points are used to mask incoming points from the new frame, while new and previous boundary points extend the existing segmentation. We evaluate the algorithm on newly-generated RGB-D datasets.
Foundations of consistent couple stress theory
Hadjesfandiari, Ali R
2015-01-01
In this paper, we examine the recently developed skew-symmetric couple stress theory and demonstrate its inner consistency, natural simplicity and fundamental connection to classical mechanics. This hopefully will help the scientific community to overcome any ambiguity and skepticism about this theory, especially the validity of the skew-symmetric character of the couple-stress tensor. We demonstrate that in a consistent continuum mechanics, the response of infinitesimal elements of matter at each point decomposes naturally into a rigid body portion, plus the relative translation and rotation of these elements at adjacent points of the continuum. This relative translation and rotation captures the deformation in terms of stretches and curvatures, respectively. As a result, the continuous displacement field and its corresponding rotation field are the primary variables, which remarkably is in complete alignment with rigid body mechanics, thus providing a unifying basis. For further clarification, we also exami...
Consistent Linearized Gravity in Brane Backgrounds
Aref'eva, I Ya; Mück, W; Viswanathan, K S; Volovich, I V
2000-01-01
A globally consistent treatment of linearized gravity in the Randall-Sundrum background with matter on the brane is formulated. Using a novel gauge, in which the transverse components of the metric are non-vanishing, the brane is kept straight. We analyze the gauge symmetries and identify the physical degrees of freedom of gravity. Our results underline the necessity for non-gravitational confinement of matter to the brane.
Self-consistent model of fermions
Yershov, V N
2002-01-01
We discuss a composite model of fermions based on three-flavoured preons. We show that the opposite character of the Coulomb and strong interactions between these preons leads to the formation of complex structures reproducing three generations of quarks and leptons with all their quantum numbers and masses. The model is self-consistent (it uses no input parameters). Nevertheless, the masses of the generated structures match the experimental values.
Consistent formulation of the spacelike axial gauge
Burnel, A.; Van der Rest-Jaspers, M.
1983-12-15
The usual formulation of the spacelike axial gauge is afflicted with the difficulty that the metric is indefinite while no ghost is involved. We solve this difficulty by introducing a ghost whose elimination is such that the metric becomes positive for physical states. The technique consists in replacing the gauge condition $n\cdot A = 0$ by the weaker one $\partial_0 (n\cdot A) \approx 0$.
Security Policy: Consistency, Adjustments and Restraining Factors
Yang; Jiemian
2004-01-01
In the 2004 U.S. presidential election, despite deeply divided domestic opinion and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. Based on the author's analysis, it is clear that security issues such as counter-terrorism and Iraq contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will be basically consistent.……
Self-consistent structure of metallic hydrogen
Straus, D. M.; Ashcroft, N. W.
1977-01-01
A calculation is presented of the total energy of metallic hydrogen for a family of face-centered tetragonal lattices carried out within the self-consistent phonon approximation. The energy of proton motion is large and proper inclusion of proton dynamics alters the structural dependence of the total energy, causing isotropic lattices to become favored. For the dynamic lattice the structural dependence of terms of third and higher order in the electron-proton interaction is greatly reduced from static lattice equivalents.
Radiometric consistency assessment of hyperspectral infrared sounders
Wang, L.; Han, Y.; Jin, X.; Chen, Y.; Tremblay, D. A.
2015-01-01
The radiometric and spectral consistency among the Atmospheric Infrared Sounder (AIRS), the Infrared Atmospheric Sounding Interferometer (IASI), and the Cross-track Infrared Sounder (CrIS) is fundamental for the creation of long-term infrared (IR) hyperspectral radiance benchmark datasets for both inter-calibration and climate-related studies. In this study, the CrIS radiance measurements on Suomi National Polar-orbiting Partnership (SNPP) satellite are directly com...
The internal consistency of perfect competition
Jakob Kapeller; Stephan Pühringer
2010-01-01
This article surveys some arguments brought forward in defense of the theory of perfect competition. While some critics propose that the theory of perfect competition, and thus also the theory of the firm, are logically flawed, (mainstream) economists defend their most popular textbook model by a series of apparently different arguments. Here it is examined whether these arguments are comparable, consistent and convincing from the point of view of philosophy of science.
Cloud Standardization: Consistent Business Processes and Information
Razvan Daniel ZOTA
2013-01-01
Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study shows how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Dynamic consistency for Stochastic Optimal Control problems
Carpentier, Pierre; Cohen, Guy; De Lara, Michel; Girardeau, Pierre
2010-01-01
For a sequence of dynamic optimization problems, we aim at discussing a notion of consistency over time. This notion can be informally introduced as follows. At the very first time step $t_0$, the decision maker formulates an optimization problem that yields optimal decision rules for all the forthcoming time steps $t_0, t_1, ..., T$; at the next time step $t_1$, he is able to formulate a new optimization problem starting at time $t_1$ that yields a new sequence of optimal decision rules. This process can be continued until the final time $T$ is reached. A family of optimization problems formulated in this way is said to be time consistent if the optimal strategies obtained when solving the original problem remain optimal for all subsequent problems. The notion of time consistency, well known in the field of economics, has recently been introduced in the context of risk measures, notably by Artzner et al. (2007), and studied in the stochastic programming framework by Shapiro (2009) and for Markov decision processes...
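For expectation-free deterministic objectives, Bellman's principle gives exactly this kind of time consistency: re-solving at $t_1$ from the state actually reached reproduces the tail of the plan made at $t_0$. A minimal brute-force sketch (the states, costs and horizon below are toy values, not from the paper):

```python
import itertools

# A tiny deterministic control problem: state s in {0,1,2}, action a in
# {0,1} moves to (s + a) % 3, with stage cost cost[s][a] (made-up numbers).
T = 4
cost = {0: {0: 1.0, 1: 0.2}, 1: {0: 0.5, 1: 0.9}, 2: {0: 0.3, 1: 0.1}}
step = lambda s, a: (s + a) % 3

def plan_cost(s, plan):
    """Total stage cost of following an open-loop plan from state s."""
    total = 0.0
    for a in plan:
        total += cost[s][a]
        s = step(s, a)
    return total

def solve(t0, s0, T):
    """Brute-force the optimal open-loop plan from time t0, state s0."""
    best = min(itertools.product([0, 1], repeat=T - t0),
               key=lambda plan: plan_cost(s0, plan))
    return list(best)

plan0 = solve(0, 0, T)        # plan formulated at t0
s1 = step(0, plan0[0])        # state actually reached at t1
plan1 = solve(1, s1, T)       # re-planning at t1
# Time consistency: the tail of the original plan remains optimal.
assert plan1 == plan0[1:]
```

With risk measures replacing the expectation/sum, this equality can fail unless the measure is itself time consistent, which is the issue the paper formalizes.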
Consistency Relations for the Conformal Mechanism
Creminelli, Paolo; Khoury, Justin; Simonović, Marko
2012-01-01
We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB mu-distortion.
Consistency relations for the conformal mechanism
Creminelli, Paolo [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, 34151, Trieste (Italy); Joyce, Austin; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Simonović, Marko, E-mail: creminel@ictp.it, E-mail: joyceau@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu, E-mail: marko.simonovic@sissa.it [SISSA, via Bonomea 265, 34136, Trieste (Italy)
2013-04-01
We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB μ-distortion.
Improving analytical tomographic reconstructions through consistency conditions
Arcadu, Filippo; Stampanoni, Marco; Marone, Federica
2016-01-01
This work introduces and characterizes a fast parameterless filter based on the Helgason-Ludwig consistency conditions, used to improve the accuracy of analytical reconstructions of tomographic undersampled datasets. The filter, acting in the Radon domain, extrapolates intermediate projections between those existing. The resulting sinogram, doubled in views, is then reconstructed by a standard analytical method. Experiments with simulated data prove that the peak-signal-to-noise ratio of the results computed by filtered backprojection is improved up to 5-6 dB, if the filter is used prior to reconstruction.
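The Helgason-Ludwig conditions referenced above constrain sinogram moments: the $n$-th moment of the projections must be a trigonometric polynomial of degree at most $n$ in the view angle. The two lowest-order conditions can be checked on analytic projections of point masses (a toy stand-in for a sinogram, not the paper's filter; the masses below are made up):

```python
import numpy as np

# For a mass w at (x, y), the parallel-beam projection at angle t is a
# delta at s = x*cos(t) + y*sin(t), so:
#   moment 0: sum of w       -> independent of the angle
#   moment 1: sum of w*s     -> a*cos(t) + b*sin(t), degree 1 in t
masses = [(1.0, 0.3, -0.2), (0.5, -0.1, 0.4)]   # (w, x, y), toy data
angles = np.linspace(0, np.pi, 60, endpoint=False)

m0 = np.array([sum(w for w, x, y in masses) for t in angles])
m1 = np.array([sum(w * (x * np.cos(t) + y * np.sin(t)) for w, x, y in masses)
               for t in angles])

# HL order 0: total mass is the same in every view.
assert np.allclose(m0, m0[0])

# HL order 1: m1 is exactly a*cos + b*sin with centroid coefficients.
a = sum(w * x for w, x, y in masses)
b = sum(w * y for w, x, y in masses)
assert np.allclose(m1, a * np.cos(angles) + b * np.sin(angles))
```

A filter of the kind described exploits these constraints in the Radon domain to synthesize the missing intermediate views of an undersampled sinogram.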
Consistency of non-minimal renormalisation schemes
Jack, I
2016-01-01
Non-minimal renormalisation schemes such as the momentum subtraction scheme (MOM) have frequently been used for physical computations. The consistency of such a scheme relies on the existence of a coupling redefinition linking it to MSbar. We discuss the implementation of this procedure in detail for a general theory and show how to construct the relevant redefinition up to three-loop order, for the case of a general theory of fermions and scalars in four dimensions and a general scalar theory in six dimensions.
Gentzen's centenary the quest for consistency
Rathjen, Michael
2015-01-01
Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.
Consistent Predictions of Future Forest Mortality
McDowell, N. G.
2014-12-01
We examined empirical and model-based estimates of current and future forest mortality of conifers in the northern hemisphere. Consistent water potential thresholds were found that resulted in mortality of our case study species, pinon pine and one-seed juniper. Extending these results with IPCC climate scenarios suggests that most existing trees in this region (SW USA) will be dead by 2050. Further, independent estimates of future mortality for the entire coniferous biome suggest widespread mortality by 2100. The validity, assumptions, and implications of these results are discussed.
Surface consistent finite frequency phase corrections
Kimman, W. P.
2016-07-01
Static time-delay corrections are frequency independent and ignore velocity variations away from the assumed vertical ray path through the subsurface. There is therefore a clear potential for improvement if the finite frequency nature of wave propagation can be properly accounted for. Such a method is presented here based on the Born approximation, the assumption of surface consistency and the misfit of instantaneous phase. The concept of instantaneous phase lends itself very well for sweep-like signals, hence these are the focus of this study. Analytical sensitivity kernels are derived that accurately predict frequency-dependent phase shifts due to P-wave anomalies in the near surface. They are quick to compute and robust near the source and receivers. An additional correction is presented that re-introduces the nonlinear relation between model perturbation and phase delay, which becomes relevant for stronger velocity anomalies. The phase shift as function of frequency is a slowly varying signal, its computation therefore does not require fine sampling even for broad-band sweeps. The kernels reveal interesting features of the sensitivity of seismic arrivals to the near surface: small anomalies can have a relative large impact resulting from the medium field term that is dominant near the source and receivers. Furthermore, even simple velocity anomalies can produce a distinct frequency-dependent phase behaviour. Unlike statics, the predicted phase corrections are smooth in space. Verification with spectral element simulations shows an excellent match for the predicted phase shifts over the entire seismic frequency band. Applying the phase shift to the reference sweep corrects for wavelet distortion, making the technique akin to surface consistent deconvolution, even though no division in the spectral domain is involved. As long as multiple scattering is mild, surface consistent finite frequency phase corrections outperform traditional statics for moderately large
Are there consistent models giving observable NSI ?
Martinez, Enrique Fernandez
2013-01-01
While the existing direct bounds on neutrino NSI are rather weak, of order 10^{-1} for propagation and 10^{-2} for production and detection, the close connection, through gauge invariance, between these interactions and new interactions affecting the better-constrained charged lepton sector makes these bounds hard to saturate in realistic models. Indeed, Standard Model extensions leading to neutrino NSI typically imply constraints at the 10^{-3} level. The question of whether consistent models can lead to observable neutrino NSI naturally arises and was discussed in a dedicated session at NUFACT 11. Here we summarize that discussion.
Consistency Checking of Web Service Contracts
Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter
2008-01-01
Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts...... are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault...
Consistency relation for cosmic magnetic fields
Jain, R. K.; Sloth, M. S.
2012-01-01
to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.
Consistent Partial Least Squares Path Modeling
Dijkstra, Theo K.; Henseler, Jörg
2015-01-01
This paper resumes the discussion in information systems research on the use of partial least squares (PLS) path modeling and shows that the inconsistency of PLS path coefficient estimates in the case of reflective measurement can have adverse consequences for hypothesis testing. To remedy this, the
Probability-consistent spectrum and code spectrum
沈建文; 石树中
2004-01-01
In the seismic safety evaluation (SSE) of key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by the Code for Seismic Design of Buildings (GB50011-2001). Sometimes there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for key projects would be lower than that for general industrial and civil buildings. In this paper, the relation between PCS and CDS is discussed using an idealized simple potential seismic source. The results show that in most areas influenced mainly by potential sources of epicentral and regional earthquakes, the PCS is generally lower than the CDS at long periods. We point out that the long-period response spectra of the code should be studied further and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, care should be taken in using the long-period response spectra given by SSE for key projects when they are lower than the CDS.
Consistent mutational paths predict eukaryotic thermostability
van Noort Vera
2013-01-01
Background: Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results: Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions: The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
Subgame consistent cooperation a comprehensive treatise
Yeung, David W K
2016-01-01
Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior can lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustained if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the duration of cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last until the end, or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation, covering the up-to-date state of the art analyses in this important topic. It sets out to provide the theory, solution tec...
Personalities in great tits, Parus major: stability and consistency
Carere, C; Drent, PJ; Privitera, L; Koolhaas, JM; Groothuis, TGG; Drent, Piet J; Koolhaas, Jaap M.; Groothuis, Ton G.G.
2005-01-01
We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations involving exploratory and sociosexual behaviour. Exploratory behaviour was assessed both in the juvenile phase and in adulthood (2-3-year interval) by means of a novel object test and an open field...
Consistent evolution in a pedestrian flow
Guan, Junbiao; Wang, Kaihua
2016-03-01
In this paper, pedestrian evacuation considering different human behaviors is studied using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. A large number of numerical simulations show that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may largely be due to a self-organization effect. Moreover, an appropriate proportion of initial defectors exhibiting herding behavior, coupled with an appropriate proportion of initial defectors thinking rationally and independently, are two necessary factors for a short evacuation time.
Consistency of warm k-inflation
Peng, Zhi-Peng; Zhang, Xiao-Min; Zhu, Jian-Yang
2016-01-01
We extend k-inflation, a type of kinetically driven inflationary model under the standard inflationary scenario, to a possible warm inflationary scenario. The dynamical equations of this warm k-inflation model are obtained. We rewrite the slow-roll parameters, which differ from those of the usual potential-driven inflationary models, and perform a linear stability analysis to give the proper slow-roll conditions in warm k-inflation. Two cases, a power-law kinetic function and an exponential kinetic function, are studied with the dissipative coefficient $\Gamma=\Gamma_0$ and with $\Gamma=\Gamma(\phi)$, respectively. A proper number of e-folds is obtained in both concrete cases of warm k-inflation. We find that a constant dissipative coefficient ($\Gamma=\Gamma_0$) is not a workable choice for these two cases, while the two cases with $\Gamma=\Gamma(\phi)$ are self-consistent warm inflationary models.
Compact difference approximation with consistent boundary condition
FU Dexun; MA Yanwen; LI Xinliang; LIU Mingyu
2003-01-01
For simulating multi-scale complex flow fields, it should be noted that all the physical quantities of interest must be simulated well. Given the limitations of computer resources, it is preferable to use high-order accurate difference schemes. Because of their high accuracy and small stencils of grid points, computational fluid dynamics (CFD) researchers have recently paid increasing attention to compact schemes. For simulating complex flow fields, the treatment of boundary conditions at and near the far-field boundary points is very important. Based on the authors' experience and published results, some aspects of boundary condition treatment for the far-field boundary are presented, with emphasis on the treatment of boundary conditions for upwind compact schemes. The consistent treatment of boundary conditions at near-boundary points is also discussed. Some numerical examples are given at the end of the paper; the results computed with the presented method are satisfactory.
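As an illustration of a compact scheme with a consistent boundary closure, here is a sketch of the classical 4th-order Padé first-derivative scheme with 3rd-order one-sided boundary formulas. These are textbook (Lele-style) closures, not necessarily the authors' scheme:

```python
import numpy as np

def compact_derivative(f, h):
    """First derivative via a tridiagonal compact (Pade) scheme."""
    n = len(f)
    A = np.zeros((n, n))
    b = np.zeros(n)
    # Interior (4th order):
    #   (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = 3/(4h) (f_{i+1} - f_{i-1})
    for i in range(1, n - 1):
        A[i, i - 1] = 0.25
        A[i, i] = 1.0
        A[i, i + 1] = 0.25
        b[i] = 3.0 * (f[i + 1] - f[i - 1]) / (4.0 * h)
    # One-sided boundary closure (3rd order):
    #   f'_0 + 2 f'_1 = (-5 f_0 + 4 f_1 + f_2) / (2h), mirrored on the right
    A[0, 0], A[0, 1] = 1.0, 2.0
    b[0] = (-5.0 * f[0] + 4.0 * f[1] + f[2]) / (2.0 * h)
    A[-1, -1], A[-1, -2] = 1.0, 2.0
    b[-1] = (5.0 * f[-1] - 4.0 * f[-2] - f[-3]) / (2.0 * h)
    return np.linalg.solve(A, b)

# Verify on f = sin(x): the derivative should match cos(x) closely.
x = np.linspace(0.0, np.pi, 41)
h = x[1] - x[0]
err = np.max(np.abs(compact_derivative(np.sin(x), h) - np.cos(x)))
assert err < 1e-3
```

Because the derivative values are coupled through the tridiagonal system, an inconsistent boundary closure pollutes the whole line of the solution, which is why the boundary treatment matters as much as the interior stencil.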
Reliability and Consistency of Surface Contamination Measurements
Rouppert, F.; Rivoallan, A.; Largeron, C.
2002-02-26
Surface contamination evaluation is a difficult problem, since it is hard to isolate the radiation emitted by the surface, especially in a highly irradiating atmosphere. In that case the only possibility is to evaluate smearable (removable) contamination, since ex-situ countings are possible. Unfortunately, according to our experience at CEA, these values are not consistent and thus not relevant. In this study we show, using in-situ Fourier transform infrared spectrometry on contaminated metal samples, that fixed contamination appears to be chemisorbed and removable contamination appears to be physisorbed. The distribution between fixed and removable contamination appears to be variable. Chemical equilibria and reversible ion exchange mechanisms are involved and are closely linked to environmental conditions such as humidity and temperature. Measurements of smearable contamination only give an indication of the state of these equilibria between fixed and removable contamination at the time, and under the environmental conditions, in which the measurements were made.
Evaluating the hydrological consistency of evaporation products
López, Oliver
2017-01-18
Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
Consistency of canonical formulation of Horava gravity
Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)
2011-09-22
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Trisomy 21 consistently activates the interferon response.
Sullivan, Kelly D; Lewis, Hannah C; Hill, Amanda A; Pandey, Ahwan; Jackson, Leisa P; Cabral, Joseph M; Smith, Keith P; Liggett, L Alexander; Gomez, Eliana B; Galbraith, Matthew D; DeGregori, James; Espinosa, Joaquín M
2016-07-29
Although it is clear that trisomy 21 causes Down syndrome, the molecular events acting downstream of the trisomy remain ill defined. Using complementary genomics analyses, we identified the interferon pathway as the major signaling cascade consistently activated by trisomy 21 in human cells. Transcriptome analysis revealed that trisomy 21 activates the interferon transcriptional response in fibroblast and lymphoblastoid cell lines, as well as circulating monocytes and T cells. Trisomy 21 cells show increased induction of interferon-stimulated genes and decreased expression of ribosomal proteins and translation factors. An shRNA screen determined that the interferon-activated kinases JAK1 and TYK2 suppress proliferation of trisomy 21 fibroblasts, and this defect is rescued by pharmacological JAK inhibition. Therefore, we propose that interferon activation, likely via increased gene dosage of the four interferon receptors encoded on chromosome 21, contributes to many of the clinical impacts of trisomy 21, and that interferon antagonists could have therapeutic benefits.
On the consistent use of Constructed Observables
Trott, Michael
2015-01-01
We define "constructed observables" as relating experimental measurements to terms in a Lagrangian while simultaneously making assumptions about possible deviations from the Standard Model (SM) in other Lagrangian terms. Ensuring that the SM effective field theory (EFT) is constrained correctly when using constructed observables requires that their defining conditions are imposed on the EFT in a manner that is consistent with the equations of motion. Failing to do so can result in a "functionally redundant" operator basis and the wrong expectation as to how experimental quantities are related in the EFT. We illustrate the issues involved considering the $\rm S$ parameter and the off-shell triple gauge coupling (TGC) vertices. We show that the relationships between $h \rightarrow V \bar{f} \, f$ decay and the off-shell TGC vertices are subject to these subtleties, and how the connections between these observables vanish in the limit of strong bounds due to LEP. The challenge of using constructed observables...
Consistently weighted measures for complex network topologies
Heitzig, Jobst; Zou, Yong; Marwan, Norbert; Kurths, Jürgen
2011-01-01
When network and graph theory are used in the study of complex systems, a typically finite set of nodes of the network under consideration is frequently either explicitly or implicitly considered representative of a much larger finite or infinite set of objects of interest. The selection procedure, e.g., formation of a subset or some kind of discretization or aggregation, typically results in individual nodes of the studied network representing quite differently sized parts of the domain of interest. This heterogeneity may induce substantial bias and artifacts in derived network statistics. To avoid this bias, we propose an axiomatic scheme based on the idea of {\\em node splitting invariance} to derive consistently weighted variants of various commonly used statistical network measures. The practical relevance and applicability of our approach is demonstrated for a number of example networks from different fields of research, and is shown to be of fundamental importance in particular in the study of climate n...
Consistent 4-form fluxes for maximal supergravity
Godazgar, Hadi; Krueger, Olaf; Nicolai, Hermann
2015-01-01
We derive new ansaetze for the 4-form field strength of D=11 supergravity corresponding to uplifts of four-dimensional maximal gauged supergravity. In particular, the ansaetze directly yield the components of the 4-form field strength in terms of the scalars and vectors of the four-dimensional maximal gauged supergravity---in this way they provide an explicit uplift of all four-dimensional consistent truncations of D=11 supergravity. The new ansaetze provide a substantially simpler method for uplifting d=4 flows compared to the previously available method using the 3-form and 6-form potential ansaetze. The ansatz for the Freund-Rubin term allows us to conjecture a `master formula' for the latter in terms of the scalar potential of d=4 gauged supergravity and its first derivative. We also resolve a long-standing puzzle concerning the antisymmetry of the flux obtained from uplift ansaetze.
Quantum cosmological consistency condition for inflation
Calcagni, Gianluca [Instituto de Estructura de la Materia, CSIC, calle Serrano 121, 28006 Madrid (Spain); Kiefer, Claus [Institut für Theoretische Physik, Universität zu Köln, Zülpicher Strasse 77, 50937 Köln (Germany); Steinwachs, Christian F., E-mail: calcagni@iem.cfmac.csic.es, E-mail: kiefer@thp.uni-koeln.de, E-mail: christian.steinwachs@physik.uni-freiburg.de [Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, Hermann-Herder-Str. 3, 79104 Freiburg (Germany)
2014-10-01
We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.
Consistent Stochastic Modelling of Meteocean Design Parameters
Sørensen, John Dalsgaard; Sterndorff, M. J.
2000-01-01
Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...... velocity, and water level is presented. The stochastic model includes statistical uncertainty and dependency between the four stochastic variables. Further, a new stochastic model for annual maximum directional significant wave heights is presented. The model includes dependency between the maximum wave...... height from neighboring directional sectors. Numerical examples are presented where the models are calibrated using the Maximum Likelihood method to data from the central part of the North Sea. The calibration of the directional distributions is made such that the stochastic model for the omnidirectional...
Internal Branding and Employee Brand Consistent Behaviours
Mazzei, Alessandra; Ravazzani, Silvia
2017-01-01
Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non......-normative and constitutive approach to internal branding by proposing an enablement-oriented communication approach. The conceptual background presents a holistic model of the inside-out process of brand building. This model adopts a theoretical approach to internal branding as a nonnormative practice that facilitates...... constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers...
Quantum cosmological consistency condition for inflation
Calcagni, Gianluca; Steinwachs, Christian F
2014-01-01
We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.
[Training of sensorial panels consisting of children].
Wittig de Penna, E; Bunger Timermann, A; Serrano Valdés, L
2000-03-01
In the development of food products for children, it is advisable to establish the characteristics of the product with groups of children that represent the target population. To ensure the success of the products, quality and hedonic satisfaction expectations must be considered. To accomplish these premises, a group of children under the Program of Complementary Feeding of the Health Ministry was selected and trained. The project was developed with a group of 33 children, ages 9 to 12 years, from the Republica of Colombia School of Santiago, whose parents agreed to and supported the participation of their children in this project. The first step was teaching the techniques and methodology of sensory evaluation and increasing the children's sensitivity. After the 8 programmed sessions, those children who met the minimal requirements for a training group were chosen. The second step was performed during 12 sessions, working with 14 children. The training was aimed at developing the vocabulary to describe quality and defects, ranking tests, discriminative tests, and the use of different scales. Tests to verify the reliability, veracity and reproducibility of judgements (p < 0.05) were carried out. The trained group was able to assess different meals of the Program. The results obtained allowed us to propose improvements to some quality criteria of the Program meals.
Thermodynamically consistent model calibration in chemical kinetics
Goutsias John
2011-05-01
Background: The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results: We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions: TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
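The thermodynamic constraints on rate constants mentioned in this abstract can be made concrete with the simplest of them, the Wegscheider cycle condition: around any closed reaction cycle, the product of forward rate constants must equal the product of reverse rate constants for detailed balance to be possible. The sketch below illustrates that necessary condition (not the TCMC optimization itself) for a three-reaction cycle:

```python
import math

def wegscheider_consistent(k_fwd, k_rev, tol=1e-9):
    """Necessary thermodynamic condition for a closed reaction cycle:
    the product of forward rate constants must equal the product of
    reverse rate constants (Wegscheider condition)."""
    return math.isclose(math.prod(k_fwd), math.prod(k_rev), rel_tol=tol)

# Triangle cycle A <-> B <-> C <-> A with hypothetical rate constants
print(wegscheider_consistent([2.0, 3.0, 0.5], [1.0, 1.5, 2.0]))  # -> True
print(wegscheider_consistent([2.0, 3.0, 0.5], [1.0, 1.0, 1.0]))  # -> False
```

A calibration method like TCMC would impose one such equality per independent cycle as a constraint while fitting the remaining parameters to data.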
Cognitive consistency and math-gender stereotypes in Singaporean children.
Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu
2014-01-01
In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.
On the internal consistency of the term structure of forecasts of housing starts
Pierdzioch, C.; Rulke, J. C.; Stadtmann, G.
2013-01-01
We use the term structure of forecasts of housing starts to test for rationality of forecasts. Our test is based on the idea that short-term and long-term forecasts should be internally consistent. We test the internal consistency of forecasts using data for Australia, Canada, Japan and the Unite...... States. Using a simple model of forecast formation, we find that forecasts are not internally consistent, leading to a rejection of forecast rationality....
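The internal-consistency idea can be illustrated with a toy version: if forecasters form expectations from a mean-reverting AR(1) process (an assumption made here for illustration; the paper's forecast-formation model is only sketched in the abstract), then the short-horizon forecast pins down the long-horizon one, and a reported long-horizon forecast far from the implied value signals internal inconsistency:

```python
def implied_long_forecast(y_now, f_short, h_short, h_long, mu):
    """Implied long-horizon forecast under AR(1) mean reversion
    y_{t+h} = mu + rho**h * (y_t - mu): infer rho from the reported
    short-horizon forecast, then project it to the long horizon."""
    rho = ((f_short - mu) / (y_now - mu)) ** (1.0 / h_short)
    return mu + rho ** h_long * (y_now - mu)

# Housing starts at 200 (thousands), long-run mean 100; a one-period-ahead
# forecast of 150 implies rho = 0.5, hence a two-period-ahead forecast of 125.
print(implied_long_forecast(200.0, 150.0, 1, 2, 100.0))  # -> 125.0
```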
Consistent lattice Boltzmann equations for phase transitions.
Siebert, D N; Philippi, P C; Mattila, K K
2014-11-01
Unlike conventional computational fluid dynamics methods, the lattice Boltzmann method (LBM) describes the dynamic behavior of fluids in a mesoscopic scale based on discrete forms of kinetic equations. In this scale, complex macroscopic phenomena like the formation and collapse of interfaces can be naturally described as related to source terms incorporated into the kinetic equations. In this context, a novel athermal lattice Boltzmann scheme for the simulation of phase transition is proposed. The continuous kinetic model obtained from the Liouville equation using the mean-field interaction force approach is shown to be consistent with diffuse interface model using the Helmholtz free energy. Density profiles, interface thickness, and surface tension are analytically derived for a plane liquid-vapor interface. A discrete form of the kinetic equation is then obtained by applying the quadrature method based on prescribed abscissas together with a third-order scheme for the discretization of the streaming or advection term in the Boltzmann equation. Spatial derivatives in the source terms are approximated with high-order schemes. The numerical validation of the method is performed by measuring the speed of sound as well as by retrieving the coexistence curve and the interface density profiles. The appearance of spurious currents near the interface is investigated. The simulations are performed with the equations of state of Van der Waals, Redlich-Kwong, Redlich-Kwong-Soave, Peng-Robinson, and Carnahan-Starling.
Exploring the Consistent behavior of Information Services
Kapidakis Sarantos
2016-01-01
Computer services are normally assumed to work well all the time. This usually happens for crucial services like bank electronic services, but not necessarily for others, in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that will help predict the consistency of their behavior and the quality of harvesting, which is harder because of transient conditions, the many services involved, and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others fail always, or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.
Consistent quadrupole-octupole collective model
Dobrowolski, A.; Mazurek, K.; Góźdź, A.
2016-11-01
Within this work we present a consistent approach to quadrupole-octupole collective vibrations coupled with rotational motion. A realistic collective Hamiltonian with a variable mass-parameter tensor and a potential obtained through the macroscopic-microscopic Strutinsky-like method with a particle-number-projected BCS (Bardeen-Cooper-Schrieffer) approach in the full vibrational and rotational, nine-dimensional collective space is diagonalized in the basis of projected harmonic oscillator eigensolutions. This orthogonal basis of zero-, one-, two-, and three-phonon oscillator-like functions in the vibrational part, coupled with the corresponding Wigner function, is in addition symmetrized with respect to the so-called symmetrization group appropriate to the collective space of the model; in the present model it is the D4 group acting in the body-fixed frame. This symmetrization procedure is applied in order to ensure the uniqueness of the Hamiltonian eigensolutions with respect to the laboratory coordinate system. The symmetrization is obtained using the projection onto irreducible representations. The model generates the quadrupole ground-state spectrum as well as the lowest negative-parity spectrum in the 156Gd nucleus. The interband and intraband B(E1) and B(E2) reduced transition probabilities are also calculated within those bands and compared with recent experimental results for this nucleus. Such a collective approach is helpful in searching for the fingerprints of possible high-rank symmetries (e.g., octahedral and tetrahedral) in nuclear collective bands.
A Consistent Phylogenetic Backbone for the Fungi
Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt
2012-01-01
The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
Volume Haptics with Topology-Consistent Isosurfaces.
Corenthy, Loïc; Otaduy, Miguel A; Pastor, Luis; Garcia, Marcos
2015-01-01
Haptic interfaces offer an intuitive way to interact with and manipulate 3D datasets, and may simplify the interpretation of visual information. This work proposes an algorithm to provide haptic feedback directly from volumetric datasets, as an aid to regular visualization. The haptic rendering algorithm lets users perceive isosurfaces in volumetric datasets, and it relies on several design features that ensure a robust and efficient rendering. A marching tetrahedra approach enables the dynamic extraction of a piecewise linear continuous isosurface. Robustness is achieved using a continuous collision detection step coupled with state-of-the-art proxy-based rendering methods over the extracted isosurface. The introduced marching tetrahedra approach guarantees that the extracted isosurface will match the topology of an equivalent isosurface computed using trilinear interpolation. The proposed haptic rendering algorithm improves the consistency between haptic and visual cues computing a second proxy on the isosurface displayed on screen. Our experiments demonstrate the improvements on the isosurface extraction stage as well as the robustness and the efficiency of the complete algorithm.
Consistency between GRUAN sondes, LBLRTM and IASI
X. Calbet
2017-06-01
Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5% in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.
Ciliate communities consistently associated with coral diseases
Sweet, M. J.; Séré, M. G.
2016-07-01
Incidences of coral disease are increasing. Most studies focusing on diseases in these organisms routinely assess variations in bacterial associates. However, other microorganism groups such as viruses, fungi and protozoa have only recently started to receive attention. This study aimed at assessing the diversity of ciliates associated with coral diseases over a wide geographical range. Here we show that a wide variety of ciliates are associated with all nine coral diseases assessed. Many of these ciliates, such as Trochilia petrani and Glauconema trihymene, feed on the bacteria that are likely colonizing the bare skeleton exposed by the advancing disease lesion, or on the necrotic tissue itself. Others, such as Pseudokeronopsis and Licnophora macfarlandi, are common predators of other protozoans and will be attracted by the increase in other ciliate species at the lesion interface. However, a few ciliate species (namely Varistrombidium kielum, Philaster lucinda, Philaster guamense, a Euplotes sp., a Trachelotractus sp. and a Condylostoma sp.) appear to harbor symbiotic algae, potentially from the corals themselves, a result which may indicate that they play some role in the disease pathology at the very least. Although from this study alone we are not able to discern what roles any of these ciliates play in disease causation, the consistent presence of such communities at disease lesion interfaces warrants further investigation.
A New Heteroskedastic Consistent Covariance Matrix Estimator using Deviance Measure
Nuzhat Aftab
2016-06-01
In this article we propose a new heteroskedasticity-consistent covariance matrix estimator, HC6, based on a deviance measure. We have studied the finite-sample behavior of the new estimator and compared it with other estimators of this kind, HC1, HC3 and HC4m, which are used in the presence of leverage observations. A simulation study is conducted to examine the effect of various levels of heteroskedasticity on the size and power of the quasi-t test with HC estimators. Results show that the test statistic based on our new suggested estimator has better asymptotic approximation and less size distortion than the other estimators for small sample sizes when a high level of heteroskedasticity is present in the data.
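The abstract does not give the HC6 formula, so here is a minimal sketch of the classical sandwich estimators it is benchmarked against (HC1 and HC3; leverage-adjusted variants such as HC4m follow the same pattern with a different exponent on 1 - h_i):

```python
import numpy as np

def hc_covariance(X, y, kind="HC3"):
    """Heteroskedasticity-consistent covariance of the OLS estimator:
    (X'X)^-1 X' diag(omega_i) X (X'X)^-1, with omega_i depending on the
    variant (HC1: df-scaled squared residuals; HC3: squared residuals
    inflated by the squared 1 - leverage)."""
    n, k = X.shape
    xtx_inv = np.linalg.inv(X.T @ X)
    resid = y - X @ (xtx_inv @ X.T @ y)
    h = np.einsum("ij,jk,ik->i", X, xtx_inv, X)  # hat-matrix diagonal (leverages)
    if kind == "HC1":
        omega = resid**2 * n / (n - k)
    elif kind == "HC3":
        omega = resid**2 / (1.0 - h) ** 2
    else:
        raise ValueError(f"unknown variant: {kind}")
    return xtx_inv @ (X.T @ (omega[:, None] * X)) @ xtx_inv

rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([np.ones(200), x])
y = 1.0 + 2.0 * x + rng.normal(size=200) * (1.0 + np.abs(x))  # heteroskedastic errors
se_hc3 = np.sqrt(np.diag(hc_covariance(X, y, "HC3")))
```

Quasi-t statistics are then the OLS coefficients divided by these standard errors; HC3 typically shows less size distortion than HC1 under leverage points, which is the comparison the article extends to HC6.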
Improving electrofishing catch consistency by standardizing power
Burkhardt, Randy W.; Gutreuter, Steve
1995-01-01
The electrical output of electrofishing equipment is commonly standardized by using either constant voltage or constant amperage. However, simplified circuit and wave theories of electricity suggest that standardization of power (wattage) available for transfer from water to fish may be critical for effective standardization of electrofishing. Electrofishing with standardized power ensures that constant power is transferable to fish regardless of water conditions. The in situ performance of standardized power output is poorly known. We used data collected by the interagency Long Term Resource Monitoring Program (LTRMP) in the upper Mississippi River system to assess the effectiveness of standardizing power output. The data consisted of 278 electrofishing collections, comprising 9,282 fishes in eight species groups, obtained during 1990 from main channel border, backwater, and tailwater aquatic areas in four reaches of the upper Mississippi River and one reach of the Illinois River. Variation in power output explained an average of 14.9% of catch variance for night electrofishing and 12.1% for day electrofishing. Three patterns in catch per unit effort were observed for different species: increasing catch with increasing power, decreasing catch with increasing power, and no power-related pattern. Therefore, in addition to reducing catch variation, controlling power output may provide some capability to select particular species. The LTRMP adopted standardized power output beginning in 1991; standardized power output is adjusted for variation in water conductivity and water temperature by reference to a simple chart. Our data suggest that by standardizing electrofishing power output, the LTRMP has eliminated substantial amounts of catch variation at virtually no additional cost.
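The conductivity-and-temperature adjustment described above can be sketched as follows. Power transfer from water to fish is most efficient when water conductivity matches the fish's effective conductivity, so the applied power goal is scaled up as the two diverge. The fish-conductivity default and the 2%-per-degree temperature slope below are illustrative assumptions, not the values from the LTRMP chart:

```python
def applied_power_goal(p_goal_watts, specific_cond_uScm, temp_c,
                       fish_cond_uScm=115.0):
    """Scale a power goal for ambient water conditions (a Kolz-style
    power-transfer matching sketch; fish_cond_uScm and the 2%/deg C
    slope are hypothetical illustration values)."""
    # specific conductance is referenced to 25 C; convert to ambient
    ambient = specific_cond_uScm * (1.0 + 0.02 * (temp_c - 25.0))
    ratio = ambient / fish_cond_uScm
    # transfer efficiency peaks at 1.0 when conductivities match
    efficiency = 4.0 * ratio / (1.0 + ratio) ** 2
    return p_goal_watts / efficiency

print(applied_power_goal(100.0, 115.0, 25.0))  # matched water: 100.0 W
print(applied_power_goal(100.0, 460.0, 25.0))  # mismatched water needs more power
```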
Bayesian detection of causal rare variants under posterior consistency.
Liang, Faming
2013-07-26
Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than the existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.
Interdisciplinary research has consistently lower funding success.
Bromham, Lindell; Dinnage, Russell; Hua, Xia
2016-06-30
Interdisciplinary research is widely considered a hothouse for innovation, and the only plausible approach to complex problems such as climate change. One barrier to interdisciplinary research is the widespread perception that interdisciplinary projects are less likely to be funded than those with a narrower focus. However, this commonly held belief has been difficult to evaluate objectively, partly because of the lack of a comparable, quantitative measure of degree of interdisciplinarity that can be applied to funding application data. Here we compare the degree to which research proposals span disparate fields by using a biodiversity metric that captures the relative representation of different fields (balance) and their degree of difference (disparity). The Australian Research Council's Discovery Programme provides an ideal test case, because a single annual nationwide competitive grants scheme covers fundamental research in all disciplines, including arts, humanities and sciences. Using data on all 18,476 proposals submitted to the scheme over 5 consecutive years, including successful and unsuccessful applications, we show that the greater the degree of interdisciplinarity, the lower the probability of being funded. The negative impact of interdisciplinarity is significant even when number of collaborators, primary research field and type of institution are taken into account. This is the first broad-scale quantitative assessment of success rates of interdisciplinary research proposals. The interdisciplinary distance metric allows efficient evaluation of trends in research funding, and could be used to identify proposals that require assessment strategies appropriate to interdisciplinary research.
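A metric capturing both balance (the relative representation of fields) and disparity (their degree of difference), as described above, can be sketched with the Rao-Stirling diversity index, a common choice for this purpose; the abstract does not name the exact formula, so this instance is an assumption:

```python
import numpy as np

def interdisciplinarity(field_shares, disparity):
    """Rao-Stirling style diversity: sum_ij p_i p_j d_ij, where p_i is the
    share of field i in a proposal and d_ij the pairwise distance between
    fields (0 = identical, 1 = maximally different)."""
    p = np.asarray(field_shares, dtype=float)
    p = p / p.sum()  # normalize shares to proportions
    d = np.asarray(disparity, dtype=float)
    return float(p @ d @ p)

d = [[0.0, 1.0], [1.0, 0.0]]  # two maximally different fields
print(interdisciplinarity([1.0, 0.0], d))  # single-field proposal -> 0.0
print(interdisciplinarity([0.5, 0.5], d))  # balanced split of distant fields -> 0.5
```

The index rises with both a more even split across fields and a greater distance between them, which is exactly the two-component notion of interdisciplinarity the study relies on.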
Consistency of Repeated Naming in Aphasia
Elizabeth E. Galletta
2015-09-01
Background: People with mild aphasia and healthy elderly adults often exhibit similar impairments on language tests of word retrieval. However, variable practice effects in object naming by three individuals with aphasia compared to young and elderly adults have been reported (Wingfield et al., 2006). Wingfield et al. (2006) found that naming of the same pictures of objects over five trials demonstrated decreasing response latencies over repeated trials for both older and younger adults, but not for individuals with aphasia. In fact, among their three participants with aphasia, response latencies in the consecutive trials differed considerably. The authors suggested that different underlying processes may be involved in word retrieval for people with aphasia compared to adults without brain injuries. In our study we aimed to further consider the effect of practice on both object and action naming in individuals with mild aphasia. Method: One woman with anomic aphasia (age 38 years; WAB Aphasia Quotient = 88) and one healthy woman (age 25 years) participated. Both were native English speakers and reported 18 years of formal education. Participants were tested individually, with a set of 27 object pictures and a set of 27 action pictures presented one at a time on a computer screen. The participants were instructed to name each picture as quickly as possible as soon as it appeared on the screen. There were 10 trials of each set of pictures, with different random orders for each trial. The order of presentation of the object and action picture sets alternated across participants. Naming responses were recorded to computer sound files for later measurement of response latencies. A brief tone was presented simultaneously with the picture onset, allowing later measurement of response latencies from the onset of picture presentation to the onset of the participant's correct response. Results: Our findings resembled those reported in Wingfield et al. (2006)
Current Status Of Velocity Field Surveys: A Consistency Check
Sarkar, D; Watkins, R; Sarkar, Devdeep; Feldman, Hume A.
2006-01-01
We present a statistical analysis comparing the bulk-flow measurements for six recent peculiar velocity surveys, namely ENEAR, SFI, RFGC, SBF and the Mark III singles and group catalogs. We study whether the bulk-flow estimates are consistent with each other and construct the full three-dimensional bulk-flow vectors. The method we discuss could be used to test the consistency of all velocity field surveys. We show that although these surveys differ in their geometry and measurement errors, their bulk-flow vectors are expected to be highly correlated and in fact show impressive agreement in all cases. Our results suggest that even though the surveys we study target galaxies of different morphology and use different distance measures, they all reliably reflect the same underlying large-scale flow.
Consistency and sealing of advanced bipolar tissue sealers.
Chekan, Edward G; Davison, Mark A; Singleton, David W; Mennone, John Z; Hinoul, Piet
2015-01-01
The aim of this study was to evaluate two commonly used advanced bipolar devices (ENSEAL(®) G2 Tissue Sealers and LigaSure™ Blunt Tip) for compression uniformity, vessel sealing strength, and consistency in bench-top analyses. Compression analysis was performed with a foam pad/sensor apparatus inserted between closed jaws of the instruments. Average pressures (psi) were recorded across the entire inside surface of the jaws, and over the distal one-third of jaws. To test vessel sealing strength, ex vivo pig carotid arteries were sealed and transected, and the left and right (sealed) halves of vessels were subjected to burst pressure testing. The maximum bursting pressures of each half of the vessels were averaged to obtain single data points for analysis. The absence or presence of tissue sticking to device jaws was noted for each transected vessel. Statistically higher average compression values were found for ENSEAL(®) instruments (curved jaw and straight jaw) compared to LigaSure™ (P<0.05). In bench-top testing, ENSEAL(®) G2 sealers produced more uniform compression, stronger and more consistent vessel sealing, and reduced tissue sticking relative to LigaSure™.
Structural Consistency: Enabling XML Keyword Search to Eliminate Spurious Results Consistently
Lee, Ki-Hoon; Han, Wook-Shin; Kim, Min-Soo
2009-01-01
XML keyword search is a user-friendly way to query XML data using only keywords. In XML keyword search, to achieve high precision without sacrificing recall, it is important to remove spurious results not intended by the user. Efforts to eliminate spurious results have enjoyed some success by using the concepts of LCA or its variants, SLCA and MLCA. However, existing methods still could find many spurious results. The fundamental cause for the occurrence of spurious results is that the existing methods try to eliminate spurious results locally without global examination of all the query results and, accordingly, some spurious results are not consistently eliminated. In this paper, we propose a novel keyword search method that removes spurious results consistently by exploiting the new concept of structural consistency.
Consistency and sealing of advanced bipolar tissue sealers
Chekan EG
2015-04-01
Full Text Available Edward G Chekan, Mark A Davison, David W Singleton, John Z Mennone, Piet Hinoul Ethicon, Inc., Cincinnati, OH, USA Objectives: The aim of this study was to evaluate two commonly used advanced bipolar devices (ENSEAL® G2 Tissue Sealers and LigaSure™ Blunt Tip) for compression uniformity, vessel sealing strength, and consistency in bench-top analyses. Methods: Compression analysis was performed with a foam pad/sensor apparatus inserted between closed jaws of the instruments. Average pressures (psi) were recorded across the entire inside surface of the jaws, and over the distal one-third of jaws. To test vessel sealing strength, ex vivo pig carotid arteries were sealed and transected, and the left and right (sealed) halves of vessels were subjected to burst pressure testing. The maximum bursting pressures of each half of the vessels were averaged to obtain single data points for analysis. The absence or presence of tissue sticking to device jaws was noted for each transected vessel. Results: Statistically higher average compression values were found for ENSEAL® instruments (curved jaw and straight jaw) compared to LigaSure™, P<0.05. Moreover, the ENSEAL® devices retained full compression at the distal end of jaws. Significantly higher and more consistent median burst pressures were noted for ENSEAL® devices relative to LigaSure™ through 52 firings of each device (P<0.05). LigaSure™ showed a significant reduction in median burst pressure for the final three firings (cycles 50–52) versus the first three firings (cycles 1–3), P=0.027. Tissue sticking was noted for 1.39% and 13.3% of vessels transected with ENSEAL® and LigaSure™, respectively. Conclusion: In bench-top testing, ENSEAL® G2 sealers produced more uniform compression, stronger and more consistent vessel sealing, and reduced tissue sticking relative to LigaSure™. Keywords: ENSEAL, sealing, burst pressure, laparoscopic, compression, LigaSure
Performance and consistency of indicator groups in two biodiversity hotspots.
Joaquim Trindade-Filho
Full Text Available BACKGROUND: In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. METHODOLOGY/PRINCIPAL FINDINGS: We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent those indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. CONCLUSIONS/SIGNIFICANCE: We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures a high diversity of endemic and endangered species.
Consistent SPH Simulations of Protostellar Collapse and Fragmentation
Gabbasov, Ruslan; Sigalotti, Leonardo Di G.; Cruz, Fidel; Klapp, Jaime; Ramírez-Velasquez, José M.
2017-02-01
We study the consistency and convergence of smoothed particle hydrodynamics (SPH) as a function of the interpolation parameters, namely the number of particles N, the number of neighbors n, and the smoothing length h, using simulations of the collapse and fragmentation of protostellar rotating cores. The calculations are made using a modified version of the GADGET-2 code that employs an improved scheme for the artificial viscosity and power-law dependences of n and h on N, as was recently proposed by Zhu et al., which comply with the combined limit N → ∞, h → 0, and n → ∞ with n/N → 0 for full SPH consistency as the domain resolution is increased. We apply this realization to the “standard isothermal test case” in the variant calculated by Burkert & Bodenheimer and the Gaussian cloud model of Boss to investigate the response of the method to adaptive smoothing lengths in the presence of large density and pressure gradients. The degree of consistency is measured by tracking how well the estimates of the consistency integral relations reproduce their continuous counterparts. In particular, C0 and C1 particle consistency is demonstrated, meaning that the calculations are close to second-order accuracy. As long as n is increased with N, mass resolution also improves as the minimum resolvable mass M_min ∼ n^(-1). This aspect allows proper calculation of small-scale structures in the flow associated with the formation and instability of protostellar disks around the growing fragments, which are seen to develop a spiral structure and fragment into close binary/multiple systems as supported by recent observations.
Consistency of Holland Code and Its Relation to Persistence in a College Major.
Latona, Janet R.
1989-01-01
Studied consistency of Holland hexagon code and persistence in college major among 3,612 subjects using the American College Testing Program interest inventory and World of Work Map score reporting instrument. Examined consistency of individual interest scores and consistency of chosen environment. Results indicated no difference in persistence…
Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)
Stadje, M.A.; Pelsser, A.
2014-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Nonlinear smoothing identification algorithm with application to data consistency checks
Idan, M.
1993-01-01
A parameter identification algorithm for nonlinear systems is presented. It is based on smoothing test data with successively improved sets of model parameters. The smoothing, which is iterative, provides all of the information needed to compute the gradients of the smoothing performance measure with respect to the parameters. The parameters are updated using a quasi-Newton procedure, until convergence is achieved. The advantage of this algorithm over standard maximum likelihood identification algorithms is the computational savings in calculating the gradient. This algorithm was used for flight-test data consistency checks based on a nonlinear model of aircraft kinematics. Measurement biases and scale factors were identified. The advantages of the presented algorithm and model are discussed.
On the kernel and particle consistency in smoothed particle hydrodynamics
Sigalotti, Leonardo Di G; Rendón, Otto; Vargas, Carlos A; Peña-Polo, Franklin
2016-01-01
The problem of consistency of smoothed particle hydrodynamics (SPH) has demanded considerable attention in the past few years due to the ever increasing number of applications of the method in many areas of science and engineering. A loss of consistency leads to an inevitable loss of approximation accuracy. In this paper, we revisit the issue of SPH kernel and particle consistency and demonstrate that SPH has a limiting second-order convergence rate. Numerical experiments with suitably chosen test functions validate this conclusion. In particular, we find that when using the root mean square error as a model evaluation statistic, well-known corrective SPH schemes, which were thought to converge to second, or even higher order, are actually first-order accurate, or at best close to second order. We also find that observing the joint limit when $N\to\infty$, $h\to 0$, and $n\to\infty$, as was recently proposed by Zhu et al., where $N$ is the total number of particles, $h$ is the smoothing length, and $n$ is th...
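The kernel-consistency conditions these SPH papers test, the C0 sum (partition of unity) and the C1 sum (vanishing first moment), are easy to evaluate numerically. A minimal 1D sketch with the standard cubic spline kernel on a uniform particle distribution; this toy setup is my own, not the authors' code:

```python
import numpy as np

def cubic_spline_1d(q):
    """Unnormalized 1D cubic spline (M4) kernel with support q < 2;
    the 1D normalization 2/(3h) is applied by the caller."""
    return np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
           np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def consistency_sums(x, h):
    """Discrete C0 and C1 consistency sums evaluated at the central
    particle of a uniform 1D distribution x with smoothing length h."""
    i = len(x) // 2
    dx = x[1] - x[0]                      # uniform spacing = particle volume
    q = np.abs(x - x[i]) / h
    W = cubic_spline_1d(q) * (2.0 / (3.0 * h))
    c0 = np.sum(W) * dx                   # should approach 1 (C0 consistency)
    c1 = np.sum((x - x[i]) * W) * dx      # should approach 0 (C1 consistency)
    return c0, c1

x = np.linspace(0.0, 1.0, 201)            # uniform particle distribution
c0, c1 = consistency_sums(x, h=2.5 * (x[1] - x[0]))
print(c0, c1)
```

On this symmetric, uniform configuration the C1 sum cancels exactly and the C0 sum is close to unity; the consistency problems discussed in the abstract arise when particle disorder and adaptive h break these discrete sums.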
Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control
Y.A. Ahmed
2015-09-01
Full Text Available In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named 'virtual window' is introduced. Such consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution output. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After obtaining acceptable percentages of success, the trained networks are implemented in the free-running experiment system to judge the networks' real-time response for the Esso Osaka 3-m model ship. The networks' behaviour during such experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance is also discussed for the final alignment of the ship with the actual pier.
Consistency Relation and Non-Gaussianity in a Galileon Inflation
Asadi, Kosar
2016-01-01
We study a Galileon inflation model in the light of Planck 2015 observational data in order to constrain the model parameter space. We study the spectrum of the primordial modes of the density perturbations by expanding the action up to second order in perturbations. Then we expand the action up to third order and compute the three-point correlation functions to find the amplitude of the non-Gaussianity of the primordial perturbations. We study the amplitude of the non-Gaussianity in both the equilateral and orthogonal configurations in this setup and test the model with recent observational data. Our analysis shows that for some ranges of the non-minimal coupling parameter, the model is consistent with observation, and it is also possible to have large non-Gaussianity that would be observable with future improvements in experiments. Moreover, we obtain the tilt of the tensor power spectrum and test the standard inflationary consistency relation ($r=-8n_T$) against the latest bounds from the Planck 2015 dataset. We...
Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation
Lindell, Annukka K.
2017-01-01
Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora.
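A consistency coefficient like the α = 0.72 reported here is conventionally computed as Cronbach's alpha over repeated codings per individual; whether the authors used exactly this estimator is an assumption. A generic sketch on a made-up binary matrix (rows = participants, columns = selfie occasions, 1 = left-cheek pose), not the study's data:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha: rows = respondents, cols = items (here, the
    repeated selfie occasions coded per participant)."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical codings for four participants over four occasions.
scores = [[1, 1, 1, 0],
          [1, 1, 0, 0],
          [0, 0, 0, 0],
          [1, 1, 1, 1]]
alpha = cronbach_alpha(scores)
print(round(alpha, 3))   # alpha ≈ 0.84 for this toy matrix
```

Values near 1 indicate that individuals keep producing the same pose across occasions; values near 0 indicate occasion-to-occasion switching.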
Consistency in performance evaluation reports and medical records.
Lu, Mingshan; Ma, Ching-to Albert
2002-12-01
In the health care market, managed care has become the latest innovation for the delivery of services. For efficient implementation, the managed care organization relies on accurate information. So clinicians are often asked to report on patients before referrals are approved, treatments authorized, or insurance claims processed. What are clinicians' responses to solicitation for information by managed care organizations? The existing health literature has already pointed out the importance of provider gaming, sincere reporting, nudging, and dodging the rules. We assess the consistency of clinicians' reports on clients across administrative data and clinical records. For about 1,000 alcohol abuse treatment episodes, we compare clinicians' reports across two data sets. The first one, the Maine Addiction Treatment System (MATS), was an administrative data set; the state government used it for program performance monitoring and evaluation. The second was a set of medical record abstracts, taken directly from the clinical records of treatment episodes. A clinician's reporting practice exhibits an inconsistency if the information reported in MATS differs from the information reported in the medical record in a statistically significant way. We look for evidence of inconsistencies in five categories: admission alcohol use frequency, discharge alcohol use frequency, termination status, admission employment status, and discharge employment status. Chi-square tests, Kappa statistics, and sensitivity and specificity tests are used for hypothesis testing. Multiple imputation methods are employed to address the problem of missing values in the record abstract data set. For the admission and discharge alcohol use frequency measures, we find, respectively, strong and supporting evidence of inconsistencies. We find equally strong evidence for consistency in reports of admission and discharge employment status, and mixed evidence on report consistency for termination status. Patterns of
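One of the agreement measures named above, the Kappa statistic, corrects raw percent agreement for agreement expected by chance. A minimal sketch; the episode labels below are invented, not actual MATS categories:

```python
def cohens_kappa(pairs):
    """Cohen's kappa for agreement between two categorical reports on
    the same episodes (e.g., administrative data vs. record abstract).
    pairs: list of (rating_a, rating_b) tuples."""
    n = len(pairs)
    categories = sorted({c for pair in pairs for c in pair})
    observed = sum(1 for a, b in pairs if a == b) / n
    # Chance agreement: product of the two raters' marginal frequencies.
    expected = sum(
        (sum(1 for a, _ in pairs if a == c) / n) *
        (sum(1 for _, b in pairs if b == c) / n)
        for c in categories
    )
    return (observed - expected) / (1.0 - expected)

# Hypothetical (MATS report, medical-record abstract) pairs.
pairs = [("daily", "daily"), ("daily", "weekly"), ("weekly", "weekly"),
         ("weekly", "weekly"), ("none", "none"), ("none", "weekly")]
print(round(cohens_kappa(pairs), 3))
```

Kappa near 1 indicates consistent reporting across the two data sets; values near 0 indicate agreement no better than chance.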
Measuring consistency of web page design and its effects on performance and satisfaction.
Ozok, A A; Salvendy, G
2000-04-01
This study examines the methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of the world-wide web (WWW) user. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study has tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects of which the features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency, physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contribute to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.
Finding consistent gene transmission patterns on large and complex pedigrees.
Pirinen, Matti; Gasbarra, Dario
2006-01-01
A heuristic algorithm for finding gene transmission patterns on large and complex pedigrees with partially observed genotype data is proposed. The method can be used to generate an initial point for a Markov chain Monte Carlo simulation or to check that the given pedigree and the genotype data are consistent. In small pedigrees, the algorithm is exact by exhaustively enumerating all possibilities, but, in large pedigrees, with a considerable amount of unknown data, only a subset of promising configurations can actually be checked. For that purpose, the configurations are ordered by combining the approximative conditional probability distribution of the unknown genotypes with the information on the relationships between individuals. We also introduce a way to divide the task into subparts, which has been shown to be useful in large pedigrees. The algorithm has been implemented in a program called APE (Allelic Path Explorer) and tested in three different settings with good results.
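For a single parent-offspring trio, the consistency check described here boils down to verifying Mendelian transmission: each child allele must be attributable to one parent. A minimal sketch of that trio-level check, not the APE implementation; the names and genotype data are hypothetical:

```python
def genotypes_consistent(child, mother, father):
    """True if the child's unordered genotype can be formed by taking
    one allele from each parent (simple Mendelian transmission check)."""
    a, b = child
    return ((a in mother and b in father) or
            (b in mother and a in father))

def pedigree_consistent(pedigree, genotypes):
    """pedigree: {individual: (mother, father)}; founders are absent.
    genotypes: {individual: (allele, allele)} for observed individuals.
    Checks every fully observed trio; unobserved members are skipped."""
    for child, (mother, father) in pedigree.items():
        if all(k in genotypes for k in (child, mother, father)):
            if not genotypes_consistent(genotypes[child],
                                        genotypes[mother],
                                        genotypes[father]):
                return False
    return True

pedigree = {"c1": ("m", "f")}
ok = pedigree_consistent(pedigree, {"m": (1, 2), "f": (3, 3), "c1": (1, 3)})
bad = pedigree_consistent(pedigree, {"m": (1, 2), "f": (3, 3), "c1": (1, 1)})
print(ok, bad)
```

The hard part the paper addresses, searching over partially observed genotypes on large pedigrees, sits on top of a trio test like this one.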
GW method with the self-consistent Sternheimer equation
Giustino, Feliciano; Cohen, Marvin L.; Louie, Steven G.
2010-03-01
We propose an approach to quasiparticle GW calculations which does not require the computation of unoccupied electronic states. In our approach the screened Coulomb interaction is evaluated by solving self-consistent linear-response Sternheimer equations and the noninteracting Green’s function is evaluated by solving inhomogeneous linear systems. The frequency dependence of the screened Coulomb interaction is explicitly taken into account. In order to avoid the singularities of the screened Coulomb interaction the calculations are performed along the imaginary axis, and the results are analytically continued to the real axis through Padé approximants. As a proof of concept we implemented the proposed methodology within the empirical pseudopotential formalism and we validated our implementation using silicon as a test case. We examine the advantages and limitations of our method and describe promising future directions.
Wyse, Adam E.; Babcock, Ben
2016-01-01
A common suggestion made in the psychometric literature for fixed-length classification tests is that one should design tests so that they have maximum information at the cut score. Designing tests in this way is believed to maximize the classification accuracy and consistency of the assessment. This article uses simulated examples to illustrate…
Are paleoclimate model ensembles consistent with the MARGO data synthesis?
J. C. Hargreaves
2011-03-01
Full Text Available We investigate the consistency of various ensembles of model simulations with the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) sea surface temperature data synthesis. We discover that while two multi-model ensembles, created through the Paleoclimate Model Intercomparison Projects (PMIP and PMIP2), pass our simple tests of reliability, an ensemble based on parameter variation in a single model does not perform so well. We show that accounting for observational uncertainty in the MARGO database is of prime importance for correctly evaluating the ensembles. Perhaps surprisingly, the inclusion of a coupled dynamical ocean (compared to the use of a slab ocean) does not appear to cause a wider spread in the sea surface temperature anomalies, but rather causes systematic changes with more heat transported north in the Atlantic. There is weak evidence that the sea surface temperature data may be more consistent with meridional overturning in the North Atlantic being similar for the LGM and the present day; however, the small size of the PMIP2 ensemble prevents any statistically significant results from being obtained.
Are paleoclimate model ensembles consistent with the MARGO data synthesis?
J. C. Hargreaves
2011-08-01
Full Text Available We investigate the consistency of various ensembles of climate model simulations with the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) sea surface temperature data synthesis. We discover that while two multi-model ensembles, created through the Paleoclimate Model Intercomparison Projects (PMIP and PMIP2), pass our simple tests of reliability, an ensemble based on parameter variation in a single model does not perform so well. We show that accounting for observational uncertainty in the MARGO database is of prime importance for correctly evaluating the ensembles. Perhaps surprisingly, the inclusion of a coupled dynamical ocean (compared to the use of a slab ocean) does not appear to cause a wider spread in the sea surface temperature anomalies, but rather causes systematic changes with more heat transported north in the Atlantic. There is weak evidence that the sea surface temperature data may be more consistent with meridional overturning in the North Atlantic being similar for the LGM and the present day. However, the small size of the PMIP2 ensemble prevents any statistically significant results from being obtained.
High accuracy and visibility-consistent dense multiview stereo.
Vu, Hoang-Hiep; Labatut, Patrick; Pons, Jean-Philippe; Keriven, Renaud
2012-05-01
Since the initial comparison of Seitz et al., the accuracy of dense multiview stereovision methods has been increasing steadily. A number of limitations, however, make most of these methods not suitable to outdoor scenes taken under uncontrolled imaging conditions. The present work consists of a complete dense multiview stereo pipeline which circumvents these limitations, being able to handle large-scale scenes without sacrificing accuracy. Highly detailed reconstructions are produced within very reasonable time thanks to two key stages in our pipeline: a minimum s-t cut optimization over an adaptive domain that robustly and efficiently filters a quasidense point cloud from outliers and reconstructs an initial surface by integrating visibility constraints, followed by a mesh-based variational refinement that captures small details, smartly handling photo-consistency, regularization, and adaptive resolution. The pipeline has been tested over a wide range of scenes: from classic compact objects taken in a laboratory setting, to outdoor architectural scenes, landscapes, and cultural heritage sites. The accuracy of its reconstructions has also been measured on the dense multiview benchmark proposed by Strecha et al., showing the results to compare more than favorably with the current state-of-the-art methods.
A formulation of consistent particle hydrodynamics in strong form
Yamamoto, Satoko; Makino, Junichiro
2017-03-01
In fluid dynamical simulations in astrophysics, large deformations are common and surface tracking is sometimes necessary. The smoothed particle hydrodynamics (SPH) method has been used in many such simulations. Recently, however, it has been shown that SPH cannot handle contact discontinuities or free surfaces accurately. There are several reasons for this problem. The first one is that SPH requires that the density is continuous and differentiable. The second one is that SPH does not have consistency, and thus the accuracy is of the zeroth-order in space. In addition, we cannot express accurate boundary conditions with SPH. In this paper, we propose a novel, high-order scheme for particle-based hydrodynamics of compressible fluid. Our method is based on a kernel-weighted high-order fitting polynomial for intensive variables. With this approach, we can construct a scheme which solves all of the three problems described above. For shock capturing, we use a tensor form of von Neumann-Richtmyer artificial viscosity. We have applied our method to many test problems and obtained excellent results. Our method is not conservative, since particles do not have mass or energy, but only their densities. However, because of the Lagrangian nature of our scheme, the violation of the conservation laws turned out to be small. We name this method Consistent Particle Hydrodynamics in Strong Form (CPHSF).
Planck 2013 results. XXXI. Consistency of the Planck data
Ade, P A R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A.J; Barreiro, R.B; Battaner, E; Benabed, K; Benoit-Levy, A; Bernard, J.P; Bersanelli, M; Bielewicz, P; Bond, J.R; Borrill, J; Bouchet, F.R; Burigana, C; Cardoso, J.F; Catalano, A; Challinor, A; Chamballu, A; Chiang, H.C; Christensen, P.R; Clements, D.L; Colombi, S; Colombo, L.P.L; Couchot, F; Coulais, A; Crill, B.P; Curto, A; Cuttaia, F; Danese, L; Davies, R.D; Davis, R.J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Desert, F.X; Dickinson, C; Diego, J.M; Dole, H; Donzelli, S; Dore, O; Douspis, M; Dupac, X; Ensslin, T.A; Eriksen, H.K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Gonzalez-Nuevo, J; Gorski, K.M.; Gratton, S.; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F.K; Hanson, D; Harrison, D; Henrot-Versille, S; Herranz, D; Hildebrandt, S.R; Hivon, E; Hobson, M; Holmes, W.A.; Hornstrup, A; Hovest, W.; Huffenberger, K.M; Jaffe, T.R; Jaffe, A.H; Jones, W.C; Keihanen, E; Keskitalo, R; Knoche, J; Kunz, M; Kurki-Suonio, H; Lagache, G; Lahteenmaki, A; Lamarre, J.M; Lasenby, A; Lawrence, C.R; Leonardi, R; Leon-Tavares, J; Lesgourgues, J; Liguori, M; Lilje, P.B; Linden-Vornle, M; Lopez-Caniego, M; Lubin, P.M; Macias-Perez, J.F; Maino, D; Mandolesi, N; Maris, M; Martin, P.G; Martinez-Gonzalez, E; Masi, S; Matarrese, S; Mazzotta, P; Meinhold, P.R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschenes, M.A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Norgaard-Nielsen, H.U; Noviello, F; Novikov, D; Novikov, I; Oxborrow, C.A; Pagano, L; Pajot, F; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Pearson, T.J; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Pratt, G.W; Prunet, S; Puget, J.L; Rachen, J.P; Reinecke, M; Remazeilles, M; 
Renault, C; Ricciardi, S.; Ristorcelli, I; Rocha, G.; Roudier, G; Rubino-Martin, J.A; Rusholme, B; Sandri, M; Scott, D; Stolyarov, V; Sudiwala, R; Sutton, D; Suur-Uski, A.S; Sygnet, J.F; Tauber, J.A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L.A; Wandelt, B.D; Wehus, I K; White, S D M; Yvon, D; Zacchei, A; Zonca, A
2014-01-01
The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse Galactic emission and for strong unresolved sources. Difference maps covering angular scales from 8°...
Creation of Consistent Burn Wounds: A Rat Model
Elijah Zhengyang Cai
2014-07-01
Full Text Available Background Burn infliction techniques are poorly described in rat models. An accurate study can only be achieved with wounds that are uniform in size and depth. We describe a simple reproducible method for creating consistent burn wounds in rats. Methods Ten male Sprague-Dawley rats were anesthetized and their dorsums shaved. A 100 g cylindrical stainless-steel rod (1 cm diameter) was heated to 100°C in boiling water. Temperature was monitored using a thermocouple. We performed two consecutive toe-pinch tests on different limbs to assess the depth of sedation. Burn infliction was limited to the loin. The skin was pulled upwards, away from the underlying viscera, creating a flat surface. The rod rested on its own weight for 5, 10, and 20 seconds at three different sites on each rat. Wounds were evaluated for size, morphology, and depth. Results Average wound size was 0.9957 cm² (standard deviation [SD] 0.1845; n=30). Wounds created with a duration of 5 seconds were pale, with an indistinct margin of erythema. Wounds of 10 and 20 seconds were well-defined and uniformly brown with a rim of erythema. Average depths of tissue damage were 1.30 mm (SD 0.424), 2.35 mm (SD 0.071), and 2.60 mm (SD 0.283) for durations of 5, 10, and 20 seconds, respectively. Burn duration of 5 seconds resulted in full-thickness damage. Burn durations of 10 and 20 seconds resulted in full-thickness damage involving subjacent skeletal muscle. Conclusions This is a simple reproducible method for creating burn wounds consistent in size and depth in a rat burn model.
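As a sketch of how summary statistics of this form (mean, sample SD, n) are computed, the following uses Python's standard library; the depth values are invented for illustration and are not the study's raw data.

```python
import statistics

# Hypothetical wound-depth measurements (mm) for one exposure time;
# these values are illustrative, not the study's raw data.
depths_10s = [2.30, 2.40, 2.35, 2.28, 2.42]

mean_depth = statistics.mean(depths_10s)
sd_depth = statistics.stdev(depths_10s)  # sample SD (n-1 denominator)

print(f"depth: {mean_depth:.2f} mm (SD {sd_depth:.3f}, n={len(depths_10s)})")
```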
Characterization of consistent triggers of migraine with aura
Hauge, Anne Werner; Kirchmann, Malene; Olesen, Jes
2011-01-01
The aim of the present study was to characterize perceived consistent triggers of migraine with aura (MA).
The utility of theory of planned behavior in predicting consistent ...
admin
outcomes of the behavior and the evaluations of these outcomes (behavioral beliefs) ... belief towards consistent condom use and motivation for compliance with .... consistency of the items used before constructing a scale. Results. All of the ...
Longitudinal tDCS: Consistency across Working Memory Training Studies
Marian E. Berryhill
2017-04-01
Full Text Available There is great interest in enhancing and maintaining cognitive function. In recent years, advances in noninvasive brain stimulation devices, such as transcranial direct current stimulation (tDCS), have targeted working memory in particular. Despite controversy surrounding the outcomes of single-session studies, a growing field of working memory training studies incorporates multiple sessions of tDCS. It is useful to take stock of these findings because of the diversity of paradigms employed and outcomes observed between research groups. This will be important in assessing cognitive training programs paired with stimulation techniques and in identifying the more useful and less effective approaches. Here, we treat the tDCS + working memory training field as a case example, but also survey training benefits from other neuromodulatory techniques (e.g., tRNS, tACS). There are challenges associated with the broad parameter space, including individual differences, stimulation intensity, duration, montage, session number, session spacing, training task selection, timing of follow-up testing, and near and far transfer tasks. In summary, although the field of assisted cognitive training is young, some design choices are more favorable than others. By way of heuristic, the current evidence supports including more training/tDCS sessions (5+), applying anodal tDCS targeting prefrontal regions, and including follow-up testing on trained and transfer tasks after a period of no contact. What remains unclear, but is important for future translational value, is continuing work to pinpoint optimal values for the tDCS parameters on a per-cognitive-task basis. Importantly, the emerging literature shows notable consistency in the application of tDCS for working memory across various participant populations compared to single-session experimental designs.
Research on consistency of identifying solitary pulmonary masses with CT
Qiuping Wang; Gang Niu; Yun Zhang; Yongqian Qiang; Zicheng Li; Youmin Guo
2008-01-01
Objective: To study the consistency of identifying solitary pulmonary masses with CT. Methods: Three observers with different working backgrounds in imaging diagnosis independently interpreted the same set of solitary pulmonary mass images using 12 indexes of objective signs. Differences between the ante- and post-interpretation results were assessed with the χ² test. The agreement between the two interpretations from the same observer was assessed with the kappa test. A double-blind method was adopted for the analysis. Results: The agreement rates between ante- and post-interpretation for the three observers were 82.65% (486/588), 80.27% (472/588), and 84.86% (499/588), respectively; their interpreting results were generally concordant without significant difference (χ² = 4.975, df = 2, P = 0.083), although there was a difference between observers 2 and 3 (χ² = 4.875, df = 1, P = 0.027). Five indexes had k > 0.40 between the ante- and post-interpretation results of the three observers: clarity of the nodule borderline, presence of spiculation, uniformity of density, existence of cavity, and calcification in the pathological region. Among these, agreement on borderline and cavity was higher (k > 0.70), agreement on blood vessel convergence poorer (0 < k ≤ 0.40), and the interpretations of the other six CT signs differed slightly. Conclusion: The ability to identify solitary pulmonary masses was inconsistent and needs further improvement.
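The kappa statistic used for intra-observer agreement above can be computed directly from two ratings of the same items; a minimal sketch with made-up binary ratings (both the data and the two-category coding are illustrative assumptions, not the study's):

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two ratings."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Two readings of the same 10 CT signs (1 = present, 0 = absent);
# the ratings are invented for illustration.
first  = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
second = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
print(round(cohen_kappa(first, second), 3))  # -> 0.6
```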
Generalized contexts and consistent histories in quantum mechanics
Losada, Marcelo; Laura, Roberto
2014-05-01
We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions that involve properties at different times.
Study on consistent query answering in inconsistent databases
XIE Dong; YANG Luming
2007-01-01
Consistent query answering is an approach to retrieving consistent answers over databases that might be inconsistent with respect to some given integrity constraints. The approach is based on a concept of repair. This paper surveys several recent studies on obtaining consistent information from inconsistent databases, covering the underlying semantic model, a number of approaches to computing consistent query answers, and the computational complexity of this problem. Furthermore, the work outlines potential research directions in this area.
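The repair-based semantics can be made concrete: under a primary-key constraint, each repair keeps exactly one tuple per key value, and the consistent answers to a query are the tuples it returns in every repair. A minimal sketch over a hypothetical relation (the relation, constraint, and queries are invented for illustration):

```python
from itertools import product

# Inconsistent relation Emp(id -> name): id 1 has two conflicting names.
emp = [(1, "Ann"), (1, "Bob"), (2, "Cy")]

def repairs(rows):
    """All maximal subsets satisfying the key constraint on the first column."""
    groups = {}
    for row in rows:
        groups.setdefault(row[0], []).append(row)
    # Choosing one tuple per key value yields one repair.
    return [set(choice) for choice in product(*groups.values())]

def consistent_answers(rows, query):
    """Tuples returned by `query` in every repair."""
    results = [query(rep) for rep in repairs(rows)]
    return set.intersection(*results)

ids = lambda rep: {i for i, _ in rep}
names = lambda rep: {n for _, n in rep}

print(consistent_answers(emp, ids))    # every repair keeps both ids
print(consistent_answers(emp, names))  # only the unconflicted name survives
```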
CONSISTENCY OF LS ESTIMATOR IN SIMPLE LINEAR EV REGRESSION MODELS
Liu Jixue; Chen Xiru
2005-01-01
Consistency of the LS estimate of the simple linear EV model is studied. It is shown that, under some common assumptions of the model, weak and strong consistency of the estimate are equivalent, but this is not so for quadratic-mean consistency.
Checking Consistency of Pedigree Information is NP-complete
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...
Non-Numeric Intrajudge Consistency Feedback in an Angoff Procedure
Harrison, George M.
2015-01-01
The credibility of standard-setting cut scores depends in part on two sources of consistency evidence: intrajudge and interjudge consistency. Although intrajudge consistency feedback has often been provided to Angoff judges in practice, more evidence is needed to determine whether it achieves its intended effect. In this randomized experiment with…
Multi-wavelength constraints on the inflationary consistency relation
Meerburg, P Daniel; Hadzhiyska, Boryana; Meyers, Joel
2015-01-01
We present the first attempt to use a combination of CMB, LIGO, and PPTA data to constrain both the tilt and the running of the primordial tensor power spectrum through constraints on the gravitational wave energy density generated in the early universe. Combining measurements at different cosmological scales highlights how complementary data can be used to test the predictions of early universe models, including the inflationary consistency relation. Current data prefer a slightly positive tilt ($n_t = 0.13^{+0.54}_{-0.75}$) and a negative running ($n_{t, {\rm run}} < -0.25$) for the tensor power spectrum. Interestingly, the addition of direct gravitational wave detector data puts strong bounds on the tensor-to-scalar ratio $r < 0.2$ since the large positive tensor tilt preferred by the Planck temperature power spectrum is no longer allowed. We comment on possible effects of a large positive tilt on the background expansion and show that depending on the assumptions regarding the UV cutoff ($k_{\...
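The tilt and running constrained above are commonly defined through a power-law-with-running parameterization of the tensor spectrum; a sketch of one standard convention (the pivot scale $k_*$ and the factor of $1/2$ are assumptions about the convention, which the abstract does not specify):

```latex
P_t(k) = A_t \left(\frac{k}{k_*}\right)^{\,n_t + \frac{1}{2}\, n_{t,\mathrm{run}} \ln(k/k_*)}
```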
Nonlinear cosmological consistency relations and effective matter stresses
Ballesteros, Guillermo [Museo Storico della Fisica e Centro Studi e Ricerche ' ' Enrico Fermi' ' , Piazza del Viminale 1, I-00184 Rome (Italy); Hollenstein, Lukas; Jain, Rajeev Kumar; Kunz, Martin, E-mail: guillermo.ballesteros@pd.infn.it, E-mail: lukas.hollenstein@unige.ch, E-mail: rajeev.jain@unige.ch, E-mail: martin.kunz@unige.ch [Département de Physique Théorique and Center for Astroparticle Physics, Université de Genève, Quai E. Ansermet 24, CH-1211 Genève 4 (Switzerland)
2012-05-01
We propose a fully nonlinear framework to construct consistency relations for testing generic cosmological scenarios using the evolution of large scale structure. It is based on the covariant approach in combination with a frame that is purely given by the metric, the normal frame. As an example, we apply this framework to the ΛCDM model, by extending the usual first order conditions on the metric potentials to second order, where the two potentials start to differ from each other. We argue that working in the normal frame is not only a practical choice but also helps with the physical interpretation of nonlinear dynamics. In this frame, effective pressures and anisotropic stresses appear at second order in perturbation theory, even for ''pressureless'' dust. We quantify their effect and compare them, for illustration, to the pressure of a generic clustering dark energy fluid and the anisotropic stress in the DGP model. Besides, we also discuss the effect of a mismatch of the potentials on the determination of galaxy bias.
Globfit: Consistently fitting primitives by discovering global relations
Li, Yangyan
2011-07-01
Given a noisy and incomplete point set, we introduce a method that simultaneously recovers a set of locally fitted primitives along with their global mutual relations. We operate under the assumption that the data corresponds to a man-made engineering object consisting of basic primitives, possibly repeated and globally aligned under common relations. We introduce an algorithm to directly couple the local and global aspects of the problem. The local fit of the model is determined by how well the inferred model agrees to the observed data, while the global relations are iteratively learned and enforced through a constrained optimization. Starting with a set of initial RANSAC based locally fitted primitives, relations across the primitives such as orientation, placement, and equality are progressively learned and conformed to. In each stage, a set of feasible relations are extracted among the candidate relations, and then aligned to, while best fitting to the input data. The global coupling corrects the primitives obtained in the local RANSAC stage, and brings them to precise global alignment. We test the robustness of our algorithm on a range of synthesized and scanned data, with varying amounts of noise, outliers, and non-uniform sampling, and validate the results against ground truth, where available. © 2011 ACM.
A framework to facilitate consistent characterization of read across uncertainty.
Blackburn, Karen; Stuard, Sharon B
2014-04-01
A process for evaluating analogues for use in structure activity relationship (SAR) assessments was previously published (Wu et al., 2010) and tested using a series of case studies (Blackburn et al., 2011). SAR-based "read across" approaches continue to be broadly used to address toxicological data gaps. The potential additional uncertainty introduced into risk assessments as a result of application of read across approaches to fill data gaps has been widely discussed (OECD, 2007; ECETOC, 2012; Patlewicz et al., 2013), but to date a systematic framework to guide the characterization of uncertainty in read across assessments has not been proposed. The current manuscript presents both a systematic framework to describe potential areas of additional uncertainty that may arise in read across (evaluated based on the number and suitability of analogues contributing data, severity of the critical effect, and effects and potency concordance), as well as a questionnaire for evaluating and documenting consideration of these potential additional sources of uncertainty by risk assessors. Application of this framework represents a next step in standardizing the read across process, both by providing a means to transparently assign a level of uncertainty to a SAR-based read across assessment and by facilitating consistency in read across conclusions drawn by different risk assessors.
An internally consistent gamma ray burst time history phenomenology
Cline, T. L.
1985-01-01
A phenomenology for gamma ray burst time histories is outlined. An ordering of their generally chaotic appearance is attempted, based on the speculation that any one burst event can be represented above 150 keV as a superposition of similarly shaped increases of varying intensity. The increases can generally overlap, confusing the picture, but a given event must at least exhibit its own limiting characteristic rise and decay times if the measurements are made with instruments having adequate temporal resolution. Most catalogued observations may be of doubtful or marginal utility for testing this hypothesis, but some time histories from Helios-2, Pioneer Venus Orbiter, and other instruments with one- to several-millisecond capabilities appear to provide consistency. Also, recent studies of temporally resolved Solar Maximum Mission burst energy spectra are entirely compatible with this picture. The phenomenology suggested here, if correct, may assist as an analytic tool for the modelling of burst processes and possibly in the definition of burst source populations.
Classical and Quantum Consistency of the DGP Model
Nicolis, A; Nicolis, Alberto; Rattazzi, Riccardo
2004-01-01
We study the Dvali-Gabadadze-Porrati model by the method of the boundary effective action. The truncation of this action to the bending mode \\pi consistently describes physics in a wide range of regimes both at the classical and at the quantum level. The Vainshtein effect, which restores agreement with precise tests of general relativity, follows straightforwardly. We give a simple and general proof of stability, i.e. absence of ghosts in the fluctuations, valid for most of the relevant cases, like for instance the spherical source in asymptotically flat space. However we confirm that around certain interesting self-accelerating cosmological solutions there is a ghost. We consider the issue of quantum corrections. Around flat space \\pi becomes strongly coupled below a macroscopic length of 1000 km, thus impairing the predictivity of the model. Indeed the tower of higher dimensional operators which is expected by a generic UV completion of the model limits predictivity at even larger length scales. We outline ...
Consistency check of photon beam physical data after recommissioning process
Kadman, B.; Chawapun, N.; Ua-apisitwong, S.; Asakit, T.; Chumpu, N.; Rueansri, J.
2016-03-01
In radiotherapy, the medical linear accelerator (linac) is the key system used for radiation treatment delivery. Although recommissioning after major machine modifications is recommended by AAPM TG-53, it might not be practical in radiotherapy centers with heavy workloads. The main purpose of this study was to compare photon beam physical data between the initial commissioning and the recommissioning of a 6 MV Elekta Precise linac. The parameters compared were the percentage depth dose (PDD) and beam profiles. The clinical commissioning test cases, following IAEA-TECDOC-1583, were planned on a REF 91230 IMRT Dose Verification Phantom with Philips' Pinnacle treatment planning system. The Delta4PT was used for dose distribution verification with a 90% passing criterion for the gamma index (3%/3 mm). Our results revealed that the PDDs and beam profiles agreed within the tolerance limits recommended by TRS-430. Most of the point doses and dose distribution verifications passed the acceptance criteria. This study showed the consistency of photon beam physical data after the recommissioning process. The good agreement between initial commissioning and recommissioning within the tolerance limits demonstrated that the full recommissioning process might not be required. However, for complex treatment planning geometries, the initial data should be applied with great caution.
The self consistent expansion applied to the factorial function
Cohen, Alon; Bialy, Shmuel; Schwartz, Moshe
2016-12-01
Most of the interesting systems in statistical physics can be described as nonlinear stochastic field theories. A common feature in the theoretical study of such systems is that ordinary perturbation theory seldom works. On the other hand, there exists a useful tool for the study of systems of that generic nature. That tool, the Self Consistent Expansion (SCE) is technically similar to the ordinary perturbation expansion, in the sense that it is an expansion around a solvable problem. The key point which distinguishes the SCE from an ordinary perturbation expansion, is that the small parameter of the expansion is adjustable and determined inherently by optimization of the expansion. Therefore, it allows the adaptive SCE to remain accurate relative to the inflexible ordinary expansion. The goal of the present paper is to present the SCE by applying it to a well-known zero dimensional problem. We choose the evaluation of the factorial function, x!, as the test case for the SCE, because the Stirling approximation for that function is one of the best known asymptotic expansions, with a very wide use in statistical physics. We show that the SCE approximation holds for small and even negative arguments of the factorial function, where the Stirling expansion fails miserably. It does so without paying any penalty at high values of the argument, where the Stirling formula is excellent. We present numerical as well as analytic SCE approximations of the factorial function.
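The abstract's point about Stirling's formula can be checked numerically: the leading-order approximation is excellent for large arguments but degrades for small ones. A short sketch (the sample points are arbitrary, and only the leading Stirling term is used):

```python
import math

def stirling(x):
    """Leading-order Stirling approximation to x! = Gamma(x + 1)."""
    return math.sqrt(2 * math.pi * x) * (x / math.e) ** x

# Relative error shrinks as the argument grows.
for x in (0.5, 1.0, 5.0, 10.0):
    exact = math.gamma(x + 1)
    rel_err = abs(stirling(x) - exact) / exact
    print(f"x = {x:>4}: relative error {rel_err:.1%}")
```

At x = 1 the leading term is off by several percent, while at x = 10 the error is well under 1%, matching the abstract's observation that Stirling fails for small arguments.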
On consistency of the weighted arithmetical mean complex judgement matrix
Anonymous
2007-01-01
The weighted arithmetical mean complex judgement matrix (WAMCJM) is the most common method for aggregating group opinions, but it has a shortcoming: the WAMCJM of perfectly consistent judgement matrices given by experts cannot guarantee its own perfect consistency. An upper bound on the WAMCJM's consistency is presented. Simultaneously, a compatibility index for judging the aggregating extent of group opinions is also introduced. The WAMCJM is proved to be of acceptable consistency provided the compatibilities of all judgement matrices given by experts are smaller than the threshold value of acceptable consistency. These conclusions are important for group decision making.
On multidimensional consistent systems of asymmetric quad-equations
Boll, Raphael
2012-01-01
Multidimensional consistency is becoming increasingly important in the theory of discrete integrable systems. Recently, we gave a classification of all 3D consistent 6-tuples of equations with the tetrahedron property, in which several novel asymmetric systems were found. In the present paper we discuss higher-dimensional consistency for the 3D consistent systems arising from this classification. In addition, we give a classification of certain 4D consistent systems of quad-equations. The results of this paper allow for a proof of Bianchi permutability, among other applications.
Consistency and Raw Scores Survive Another Test: A Last Response to Prediger and His Colleagues
Holland, John L.
1976-01-01
Prediger confuses observations about the data with Holland's theoretical statement, performs some uninterpretable analyses, omits much relevant data, and provides an incomplete account of what psychometric authorities have said about raw scores in interest inventories. (Author)
Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing
Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.
2003-01-01
In this paper we analyze requirements for a tool that supports the integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...
A Consistency Test of Spectroscopic Gravities for Late-Type Stars
Allende-Prieto, C; Lambert, D L; Gustafsson, B; Prieto, Carlos Allende; Lopez, Ramon J. Garcia; Lambert, David L.; Gustafsson, Bengt
1999-01-01
Chemical analyses of late-type stars are usually carried out following the classical recipe: LTE line formation and homogeneous, plane-parallel, flux-constant, and LTE model atmospheres. We review different results in the literature that have suggested significant inconsistencies in the spectroscopic analyses, pointing out the difficulties in deriving independent estimates of the stellar fundamental parameters and hence, detecting systematic errors. The trigonometric parallaxes measured by the HIPPARCOS mission provide accurate appraisals of the stellar surface gravity for nearby stars, which are used here to check the gravities obtained from the photospheric iron ionization balance. We find an approximate agreement for stars in the metallicity range -1 <= [Fe/H] <= 0, but the comparison shows that the differences between the spectroscopic and trigonometric gravities decrease towards lower metallicities for more metal-deficient dwarfs (-2.5 <= [Fe/H] <= -1.0), which casts a shadow upon the abundanc...
Visualizing Changes in Pretest and Post-Test Student Responses with Consistency Plots
Wittmann, Michael C.; Black, Katrina E.
2014-01-01
Tabular presentations of student data often hide information about the switches in responses by individual students over the course of a semester. We extend unpublished work by Kanim on "escalator diagrams," which show changes in student responses from correct to incorrect (and vice versa) while representing pre- and postinstruction…
Evaluating the hydrological consistency of satellite based water cycle components
Lopez Valencia, Oliver M.
2016-06-15
Advances in multi-satellite based observations of the earth system have provided the capacity to retrieve information across a wide-range of land surface hydrological components and provided an opportunity to characterize terrestrial processes from a completely new perspective. Given the spatial advantage that space-based observations offer, several regional-to-global scale products have been developed, offering insights into the multi-scale behaviour and variability of hydrological states and fluxes. However, one of the key challenges in the use of satellite-based products is characterizing the degree to which they provide realistic and representative estimates of the underlying retrieval: that is, how accurate are the hydrological components derived from satellite observations? The challenge is intrinsically linked to issues of scale, since the availability of high-quality in-situ data is limited, and even where it does exist, is generally not commensurate to the resolution of the satellite observation. Basin-scale studies have shown considerable variability in achieving water budget closure with any degree of accuracy using satellite estimates of the water cycle. In order to assess the suitability of this type of approach for evaluating hydrological observations, it makes sense to first test it over environments with restricted hydrological inputs, before applying it to more hydrological complex basins. Here we explore the concept of hydrological consistency, i.e. the physical considerations that the water budget impose on the hydrologic fluxes and states to be temporally and spatially linked, to evaluate the reproduction of a set of large-scale evaporation (E) products by using a combination of satellite rainfall (P) and Gravity Recovery and Climate Experiment (GRACE) observations of storage change, focusing on arid and semi-arid environments, where the hydrological flows can be more realistically described. Our results indicate no persistent hydrological
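The water-budget closure idea underlying this hydrological consistency check can be sketched as a simple residual test, P − E − R − ΔS ≈ 0; all numbers below are hypothetical basin averages, and the 5 mm uncertainty threshold is an assumption, not a value from the study:

```python
# Terrestrial water budget over a basin: P - E - R = dS/dt.
# Hypothetical monthly basin averages (mm/month), for illustration only.
precip = 42.0         # satellite rainfall (P)
evap = 35.0           # satellite evaporation (E)
runoff = 4.0          # gauged discharge (R)
storage_change = 2.5  # GRACE storage change (dS/dt)

residual = precip - evap - runoff - storage_change
print(f"closure residual: {residual:+.1f} mm/month")

# A common consistency check: flag the evaporation product if the
# residual exceeds the combined retrieval uncertainty (assumed 5 mm).
is_consistent = abs(residual) <= 5.0
print("hydrologically consistent:", is_consistent)
```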
Incompatible multiple consistent sets of histories and measures of quantumness
Halliwell, J. J.
2017-07-01
In the consistent histories approach to quantum theory probabilities are assigned to histories subject to a consistency condition of negligible interference. The approach has the feature that a given physical situation admits multiple sets of consistent histories that cannot in general be united into a single consistent set, leading to a number of counterintuitive or contrary properties if propositions from different consistent sets are combined indiscriminately. An alternative viewpoint is proposed in which multiple consistent sets are classified according to whether or not there exists any unifying probability for combinations of incompatible sets which replicates the consistent histories result when restricted to a single consistent set. A number of examples are exhibited in which this classification can be made, in some cases with the assistance of the Bell, Clauser-Horne-Shimony-Holt, or Leggett-Garg inequalities together with Fine's theorem. When a unifying probability exists logical deductions in different consistent sets can in fact be combined, an extension of the "single framework rule." It is argued that this classification coincides with intuitive notions of the boundary between classical and quantum regimes and in particular, the absence of a unifying probability for certain combinations of consistent sets is regarded as a measure of the "quantumness" of the system. The proposed approach and results are closely related to recent work on the classification of quasiprobabilities and this connection is discussed.
Saraiva, Fabio Petersen; Costa, Patricia Grativol; Inomata, Daniela Lumi; Melo, Carlos Sergio Nascimento; Helal Junior, John; Nakashima, Yoshitaka [Universidade de Sao Paulo (USP), SP (Brazil). Hospital das Clinicas. Dept. de Oftalmologia]. E-mail: fabiopetersen@yahoo.com.br
2007-07-01
Objectives: To investigate optical coherence tomography consistency on foveal thickness, foveal volume, and macular volume measurements in patients with and without diffuse diabetic macular edema. Introduction: Optical coherence tomography represents an objective technique that provides cross-sectional tomographs of retinal structure in vivo. However, it is expected that poor fixation ability, as seen in diabetic macular edema, could alter its results. Several authors have discussed the reproducibility of optical coherence tomography, but only a few have addressed the topic with respect to diabetic maculopathy. Methods: The study recruited diabetic patients without clinically evident retinopathy (control group) and with diffuse macular edema (case group). Only one eye of each patient was evaluated. Five consecutive fast macular scans were taken using Optical Coherence Tomography 3; the 6 mm macular map was chosen. The consistency in measurements of foveal thickness, foveal volume, and total macular volume for both groups was evaluated using Pearson's coefficient of variation. The t-test for independent samples was used in order to compare measurements of both groups. Results: Each group consisted of 20 patients. All measurements had a coefficient of variation less than 10%. The most consistent parameter for both groups was the total macular volume. Discussion: Consistency in measurement is a mainstay of any test. A test is unreliable if its measurements cannot be correctly repeated. We found a good index of consistency, even considering patients with an unstable gaze. Conclusions: Optical coherence tomography is a consistent method for diabetic subjects with diffuse macular edema. (author)
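The repeatability criterion used above (coefficient of variation below 10%) can be sketched directly; the scan values are invented for illustration and are not from the study:

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = sample SD / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Five repeated foveal-thickness scans (micrometres) of one eye;
# the numbers are hypothetical.
scans = [402, 398, 405, 399, 401]
cv = coefficient_of_variation(scans)
print(f"CV = {cv:.2f}%  ->  consistent: {cv < 10}")
```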
Integrable Heisenberg Ferromagnet Equations with self-consistent potentials
Zhunussova, Zh Kh; Tungushbaeva, D I; Mamyrbekova, G K; Nugmanova, G N; Myrzakulov, R
2013-01-01
In this paper, we consider some integrable Heisenberg Ferromagnet Equations with self-consistent potentials. We study their Lax representations. In particular, we give their equivalent counterparts, which are nonlinear Schrödinger type equations. We present the integrable reductions of the Heisenberg Ferromagnet Equations with self-consistent potentials. These integrable Heisenberg Ferromagnet Equations with self-consistent potentials describe nonlinear waves in ferromagnets with magnetic fields.
Behavioural consistency and life history of Rana dalmatina tadpoles
Urszán, Tamás Janós; Török, János; Hettyey, Attila; Garamszegi, László Z; Herczeg, Gábor
2015-01-01
The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or as a behavioural syndrome (consistency across several behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, w...
Consistency of muscle synergies during pedaling across different mechanical constraints.
Hug, François; Turpin, Nicolas A; Couturier, Antoine; Dorel, Sylvain
2011-07-01
The purpose of the present study was to determine whether muscle synergies are constrained by changes in the mechanics of pedaling. The decomposition algorithm used to identify muscle synergies was based on two components: "muscle synergy vectors," which represent the relative weighting of each muscle within each synergy, and "synergy activation coefficients," which represent the relative contribution of muscle synergy to the overall muscle activity pattern. We hypothesized that muscle synergy vectors would remain fixed but that synergy activation coefficients could vary, resulting in observed variations in individual electromyographic (EMG) patterns. Eleven cyclists were tested during a submaximal pedaling exercise and five all-out sprints. The effects of torque, maximal torque-velocity combination, and posture were studied. First, muscle synergies were extracted from each pedaling exercise independently using non-negative matrix factorization. Then, to cross-validate the results, muscle synergies were extracted from the entire data pooled across all conditions, and muscle synergy vectors extracted from the submaximal exercise were used to reconstruct EMG patterns of the five all-out sprints. Whatever the mechanical constraints, three muscle synergies accounted for the majority of variability [mean variance accounted for (VAF) = 93.3 ± 1.6%, VAF (muscle) > 82.5%] in the EMG signals of 11 lower limb muscles. In addition, there was a robust consistency in the muscle synergy vectors. This high similarity in the composition of the three extracted synergies was accompanied by slight adaptations in their activation coefficients in response to extreme changes in torque and posture. Thus, our results support the hypothesis that these muscle synergies reflect a neural control strategy, with only a few timing adjustments in their activation regarding the mechanical constraints.
Bootstrap-Based Inference for Cube Root Consistent Estimators
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.
Affective-cognitive consistency and thought-induced attitude polarization.
Chaiken, S; Yates, S
1985-12-01
Subjects whose preexperimental attitudes toward either capital punishment or censorship were high or low in affective-cognitive consistency were identified. These four groups thought about their attitudes by writing two essays, one on the topic for which consistency had been assessed (relevant essay) and one on the unassessed topic (distractor essay). In accord with the hypothesis that thought-induced attitude polarization requires the presence of a well-developed knowledge structure, high-consistency subjects evidenced greater polarization than low-consistency subjects only on the relevant topic after writing the relevant essay. Content analyses of subjects' relevant essays yielded additional data confirming Tesser's ideas regarding mediation: High (vs. low) consistency subjects expressed a greater proportion of cognitions that were evaluatively consistent with their prior affect toward the attitude object and a smaller proportion of evaluatively inconsistent and neutral cognitions. Moreover, although high- and low-consistency subjects did not differ in the amount of attitudinally relevant information they possessed or their awareness of inconsistent cognitions, their method of dealing with discrepant information diverged: High-consistency subjects evidenced a greater tendency to assimilate discrepant information by generating refutational thoughts that discredited or minimized the importance of inconsistent information.
Philosophical and Methodological Problem of Consistency of Mathematical Theories
Michailova N. V.
2013-01-01
The increased abstraction of modern mathematical theories has revived interest in the traditional philosophical and methodological problem of an internally consistent system of axioms, one from which mutually contradictory statements cannot be deduced. If the axioms describe a well-known domain of mathematical objects, the problem appears less pressing from the standpoint of local consistency, but it is bound up with the formalists' various attempts to explain mathematical existence through consistency. Yet the problem of establishing the consistency of mathematical analysis, whose solution would settle the fate of Hilbert's proof theory, remains unsolved, as does the problem of the consistency of axiomatic set theory. It can therefore be assumed that the criterion of consistency, despite its essential role in axiomatic systems of both formal and substantive character, is an auxiliary logical criterion in the same way that mathematical provability is. An adequate solution to the problem of the consistency of mathematics can be achieved through methodological and substantive arguments that reveal the mechanism by which contradictions appear in a mathematical theory. The paper shows that, from a systemic point of view, in the context of a philosophical and methodological synthesis of the various directions in the justification of modern mathematics, one cannot insist on the rationale for the consistency of mathematical theories alone.
MANUFACTURE OF THE FERMENTED SAUSAGES WITH THE SMEARED CONSISTENCE
Nesterenko A. A.
2014-10-01
Smoked sausage products with a smeared (spreadable) consistency are in great demand abroad. The article presents the basic aspects of manufacturing smoked sausages with a smeared consistency: the choice of spices, starter cultures, and the method of preparing the forcemeat.
Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series
Gao, Jiti; Kanaya, Shin; Li, Degui
2015-01-01
This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when the time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series.
Delimiting Coefficient α from Internal Consistency and Unidimensionality
Sijtsma, Klaas
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, and Love (2015), in which they relate reliability, represented by coefficient α, to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that the concepts of internal consistency and…
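The coefficient under discussion is computed from the item variances and the variance of the sum score. A minimal sketch on synthetic item scores (the sample size, item count, and noise level are illustrative assumptions):

```python
# Sketch: computing coefficient alpha (Cronbach, 1951) from an
# persons-by-items score matrix; the data are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# 200 persons, 5 items driven by one common factor plus noise
true_score = rng.normal(size=(200, 1))
items = true_score + rng.normal(scale=0.8, size=(200, 5))

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)       # per-item variances
total_var = items.sum(axis=1).var(ddof=1)   # variance of the sum score
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"coefficient alpha = {alpha:.3f}")
```

Because alpha is a lower bound to reliability (the point argued above), a high value here does not by itself establish internal consistency or unidimensionality.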
Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.
Edwards, H. P.; And Others
1982-01-01
Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)
Decentralized Consistency Checking in Cross-organizational Workflows
Wombacher, Andreas
Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which
The Self-Consistency Model of Subjective Confidence
Koriat, Asher
2012-01-01
How do people monitor the correctness of their answers? A self-consistency model is proposed for the process underlying confidence judgments and their accuracy. In answering a 2-alternative question, participants are assumed to retrieve a sample of representations of the question and base their confidence on the consistency with which the chosen…
Dynamic Consistency between Value and Coordination Models - Research Issues.
Bodenstaff, L.; Wombacher, Andreas; Reichert, M.U.; Meersman, R.; Tari, Z.; Herrero, P.
Inter-organizational business cooperations can be described from different viewpoints each fulfilling a specific purpose. Since all viewpoints describe the same system they must not contradict each other, thus, must be consistent. Consistency can be checked based on common semantic concepts of the
Decentralized Consistency Checking in Cross-organizational Workflows
Wombacher, A.
2006-01-01
Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which utiliz
Logical consistency and sum-constrained linear models
van Perlo -ten Kleij, Frederieke; Steerneman, A.G.M.; Koning, Ruud H.
2006-01-01
A topic that has received quite some attention in the seventies and eighties is logical consistency of sum-constrained linear models. Loosely defined, a sum-constrained model is logically consistent if the restrictions on the parameters and explanatory variables are such that the sum constraint is a
Consistent Sets and Contrary Inferences: Reply to Griffiths and Hartle
Kent, A
1998-01-01
It was pointed out recently [A. Kent, Phys. Rev. Lett. 78 (1997) 2874] that the consistent histories approach allows contrary inferences to be made from the same data, corresponding to commuting orthogonal projections in different consistent sets. To many, this seems undesirable in a theory of physical inferences. It also raises a specific problem for the consistent histories formalism, since that formalism is set up so as to eliminate contradictory inferences, yet there seems to be no sensible physical distinction between contradictory and contrary inferences. It seems particularly hard to defend this asymmetry, since (i) there is a well-defined quantum histories formalism which admits both contradictory and contrary inferences, and (ii) there is also a well-defined formalism, based on ordered consistent sets of histories, which excludes both. In a recent comment, Griffiths and Hartle, while accepting the validity of the examples given in the above paper, restate their own preference for the consistent hist...
The construction and combined operation for fuzzy consistent matrixes
YAO Min; SHEN Bin; LUO Jian-hua
2005-01-01
Fuzziness is one of the general characteristics of human thinking and of objective things. Introducing fuzzy techniques into decision-making yields very good results. A fuzzy consistent matrix has many excellent characteristics, especially centre-division transitivity, which conforms to the reality of the human thinking process in decision-making. This paper presents a new approach for creating a fuzzy consistent matrix from a mutually supplementary matrix in fuzzy decision-making. At the same time, based on the distance between an individual fuzzy consistent matrix and the average fuzzy consistent matrix, a combined operation for several fuzzy consistent matrixes is presented which reflects the majority opinion of experienced experts. Finally, a practical example shows its flexibility and practicability.
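One widely cited way to obtain a fuzzy consistent matrix from a complementary (mutually supplementary) judgement matrix is the row-sum transformation reported in the AHP literature; whether it matches this paper's exact construction is an assumption, and the example matrix is made up:

```python
# Sketch: building a fuzzy consistent matrix R from a complementary
# judgement matrix B (b_ij + b_ji = 1) via the row-sum transformation
# r_ij = (s_i - s_j) / (2n) + 0.5, where s_i is the i-th row sum.
import numpy as np

B = np.array([
    [0.5, 0.7, 0.8],
    [0.3, 0.5, 0.6],
    [0.2, 0.4, 0.5],
])
n = B.shape[0]
s = B.sum(axis=1)                                  # row sums
R = (s[:, None] - s[None, :]) / (2 * n) + 0.5

# Additive consistency check: r_ij = r_ik - r_jk + 0.5 for all i, j, k
for i in range(n):
    for j in range(n):
        for k in range(n):
            assert abs(R[i, j] - (R[i, k] - R[j, k] + 0.5)) < 1e-12
print(R)
```

By construction R is again complementary (R + Rᵀ = 1) and additively consistent, which is the transitivity property the abstract highlights.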
Repeatability and consistency of individual behaviour in juvenile and adult Eurasian harvest mice
Schuster, Andrea C.; Carl, Teresa; Foerster, Katharina
2017-04-01
Knowledge on animal personality has provided new insights into evolutionary biology and animal ecology, as behavioural types have been shown to affect fitness. Animal personality is characterized by repeatable and consistent between-individual behavioural differences throughout time and across different situations. Behavioural repeatability within life history stages and consistency between life history stages should be checked for independence of sex and age, as recent data have shown that males and females in some species may differ in the repeatability of behavioural traits, as well as in their consistency. We measured the repeatability and consistency of three behavioural traits and one cognitive trait in juvenile and adult Eurasian harvest mice (Micromys minutus). We found that exploration, activity and boldness were repeatable in juveniles and adults. Spatial recognition measured in a Y-maze was only repeatable in adult mice. Exploration, activity and boldness were consistent before and after maturation, as well as before and after first sexual contact. Data on spatial recognition provided little evidence for consistency. Further, we found some evidence for a litter effect on behaviours by comparing different linear mixed models. We concluded that harvest mice express animal personality traits, as behaviours were repeatable across sexes and consistent across life history stages. The tested cognitive trait showed low repeatability and was less consistent across life history stages. Given the rising interest in individual variation in cognitive performance, and in its relationship to animal personality, we suggest that it is important to gather more data on the repeatability and consistency of cognitive traits.
Consistency of flashbulb memories of September 11 over long delays: Implications for consolidation and wrong time slice hypotheses
Kvavilashvili, Lia; Mirani, Jennifer; Schlagman, Simone; Foley, Kerry; Kornbrot, Diana E.
2009-01-01
The consistency of flashbulb memories over long delays provides a test of theories of memory for highly emotional events. This study used September 11, 2001 as the target event, with test-retest delays of 2 and 3 years. The nature and consistency of flashbulb memories were examined as a function of delay between the target event and an initial…
Consistent assignment of nurse aides: association with turnover and absenteeism.
Castle, Nicholas G
2013-01-01
Consistent assignment refers to the same caregivers consistently caring for the same residents almost every time the caregivers are on duty. This article examines the association of consistent assignment of nurse aides with turnover and absenteeism. Data came from a survey of nursing home administrators, the Online Survey Certification and Reporting data, and the Area Resource File. The measures were from 2007 and came from 3,941 nursing homes. Multivariate logistic regression models were used to examine turnover and absenteeism. An average of 68% of nursing homes reported using consistent assignment, with 28% of nursing homes using nurse aide consistent assignment at the often recommended level of 85% (or more). Nursing homes using recommended levels of consistent assignment had significantly lower rates of turnover and absenteeism. In the multivariate analyses, consistent assignment was significantly associated with both lower turnover and lower absenteeism (p < .05). Consistent assignment is a practice recommended by many policy makers, government agencies, and industry advocates. The findings presented here provide some evidence that the use of this staffing practice can be beneficial.
THE CONSISTENCY OF STATISTICAL ESTIMATES OF THURSTONE-MOSTELLER
Y. V. Bugaev
2015-01-01
The traditional analysis of collective choice procedures involves three approaches: examining the voting operator against characteristic conditions, examining the properties of the choice function, and analysing the possibility of manipulation (verifying the stability of the voting process under negative influence from voters or the organizer). The research team of the ITMU department of VSUET has proposed and implemented a fourth approach: studying the probabilistic characteristics of the procedures' results (the bias of the estimate of an alternative's utility from its true value, the standard deviation of that estimate, the probability of correctly ranking the alternatives at the output of the choice procedure, and so on). This article analyses the consistency of the utility estimates used to compare alternatives, obtained at the output of the traditional Thurstone-Mosteller procedure and of its generalizations created by the authors. In general, consistency of a statistical estimator means that the estimation error tends to zero as the sample size increases. Depending on the interpretation of the error, the main types of consistency are: weak consistency, based on convergence in probability of a random quantity; strong consistency, based on convergence with probability one; and consistency in mean square, in which the variance of the estimate tends to zero. The article proves a theorem according to which, under rather general assumptions, the utility estimates of the ranked alternatives obtained with the Thurstone-Mosteller procedure are consistent in mean square.
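The classical Thurstone-Mosteller (Case V) estimates that the article analyses can be sketched in a few lines: convert pairwise win proportions to unit-normal deviates and average the rows. The proportions below are invented, and this is the textbook estimator, not the authors' generalization:

```python
# Sketch: Thurstone-Mosteller (Case V) scale values from pairwise
# win proportions; higher value = more preferred alternative.
import numpy as np
from statistics import NormalDist

# p[i, j] = proportion of judges preferring alternative i over j
P = np.array([
    [0.50, 0.65, 0.80],
    [0.35, 0.50, 0.70],
    [0.20, 0.30, 0.50],
])
Z = np.vectorize(NormalDist().inv_cdf)(P)   # unit-normal deviates
utilities = Z.mean(axis=1)                  # Case V least-squares solution
print(utilities)
```

Because P + Pᵀ = 1, the deviate matrix Z is antisymmetric and the estimated utilities are centred on zero; only their differences are identified.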
Effect of word-list consistency on the correlation between group memory and group polarization.
Arima, Yoshiko
2013-04-01
Two studies investigated the effect of shared knowledge, manipulated using associated or randomly ordered word lists, on the correlation between group remembering and group polarization. Group polarization due to the accumulation of information was expected only if that information was consistent with the knowledge shared among group members before discussion. Consistency of information with shared knowledge was manipulated by lists of words that were ordered either randomly or in a manner consistent with four stereotype categories. In Experiment 1, 159 college students answered a questionnaire about the common stereotype that blood type determines personality; half were given lists of words that were consistent with the stereotype (consistent condition) and the other half, randomly ordered word lists (inconsistent condition). After completion of the questionnaire, they were given a surprise free-recall test including words from the lists that had appeared in the questionnaire; the test was administered in a group (group condition) or individual (individual condition) setting. The results indicated that stereotype-consistency of the word list reduced the groups' ability to detect incorrect answers compared with the individual condition. In Experiment 2 (N = 132), the divergence of memory among group members was manipulated by altering the constitution of each group with regard to members' blood type. The results showed that the shift in the score representing belief in the blood-type stereotype correlated with the number of words recalled in the stereotype-consistent word-list condition.
The Consistent Preferences Approach to Deductive Reasoning in Games
Asheim, Geir B
2006-01-01
"The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif
On the Consistency of ZFn in ZFn+3
李旭华
1993-01-01
By restricting the usual replacement axiom schema of ZF to Σn-formulae, Professor Zhang Jinwen constructed a series of subsystems of Zermelo-Fraenkel set theory ZF, which he called ZFn. Zhao Xishun showed that the consistency of ZFn can be deduced in ZF. Professor Zhang Jinwen raised the question of whether the consistency of ZFn can be deduced in ZFn+m(n) for some m(n) ≥ 1. In this paper, we give a positive solution to Professor Zhang's problem. Moreover, we show that the consistency of ZFn can be deduced in ZFn+3.
Model Checking Data Consistency for Cache Coherence Protocols
Hong Pan; Hui-Min Lin; Yi Lv
2006-01-01
A method for automatic verification of cache coherence protocols is presented, in which cache coherence protocols are modeled as concurrent value-passing processes, and control and data consistency requirements are described as formulas in first-order μ-calculus. A model checker is employed to check whether the protocol under investigation satisfies the required properties. Using this method, a data consistency error has been revealed in a well-known cache coherence protocol. The error has been corrected, and the revised protocol has been shown to be free from data consistency errors for any data domain size, by appealing to the data independence technique.
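The flavour of such verification can be conveyed by a toy example: exhaustive reachability search over a deliberately simplified two-cache MSI-like protocol, checking the coherence invariant that a Modified copy is exclusive. The transition rules below are illustrative assumptions, not the paper's value-passing model or μ-calculus machinery:

```python
# Sketch: explicit-state reachability check of a coherence invariant
# for a toy two-cache protocol. States are (cache0, cache1) pairs.
from collections import deque

I, S, M = "I", "S", "M"   # Invalid, Shared, Modified

def tuple_with(state, i, vi, j, vj):
    out = list(state)
    out[i], out[j] = vi, vj
    return tuple(out)

def successors(state):
    """Each cache may read (-> S) or write (-> M); on a read a Modified
    peer is downgraded to S, on a write the peer is invalidated."""
    for i in (0, 1):
        j = 1 - i
        peer = state[j]
        yield tuple_with(state, i, S, j, S if peer == M else peer)  # read
        yield tuple_with(state, i, M, j, I)                         # write

def check(initial=(I, I)):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        # coherence invariant: a Modified copy is the only valid copy
        if M in state:
            assert state.count(M) == 1 and S not in state, state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

reachable = check()
print(f"{len(reachable)} reachable states, invariant holds in all")
```

A real model checker does the same exploration symbolically and over value-passing processes, which is what lets the paper's result extend to any data domain size via data independence.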
Consistency of assertive, aggressive, and submissive behavior for children.
Deluty, R H
1985-10-01
The interpersonal behavior of 50 third- through fifth-grade children was assessed over an 8-month period in a wide variety of naturally occurring school activities. The consistency of the children's behavior was found to vary as a function of the child's sex, the class of behavior examined, and the similarity/dissimilarity of the contexts in which the behaviors occurred. Boys demonstrated remarkable consistency in their aggressive expression; 46 of 105 intercorrelations for the aggressiveness dimensions were statistically significant. In general, the consistency of assertive behavior for both boys and girls was unexpectedly high.
On exact triangles consisting of stable vector bundles on tori
Kobayashi, Kazushi
2016-01-01
In this paper, we consider the exact triangles consisting of stable holomorphic vector bundles on one-dimensional complex tori, and discuss their relations with the corresponding Fukaya category via the homological mirror symmetry.
A new insight into the consistency of smoothed particle hydrodynamics
Sigalotti, Leonardo Di G; Klapp, Jaime; Vargas, Carlos A; Campos, Kilver
2016-01-01
In this paper the problem of consistency of smoothed particle hydrodynamics (SPH) is solved. A novel error analysis is developed in $n$-dimensional space using the Poisson summation formula, which enables the treatment of the kernel and particle approximation errors in combined fashion. New consistency integral relations are derived for the particle approximation which correspond to the cosine Fourier transform of the classically known consistency conditions for the kernel approximation. The functional dependence of the error bounds on the SPH interpolation parameters, namely the smoothing length $h$ and the number of particles within the kernel support $\mathcal{N}$, is demonstrated explicitly, from which consistency conditions are seen to follow naturally. As $\mathcal{N}\to\infty$, the particle approximation converges to the kernel approximation independently of $h$ provided that the particle mass scales with $h$ as $m\propto h^{\beta}$, with $\beta >n$. This implies that as $h\to 0$, the joint limit $m\to 0$, ...
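The zeroth-order particle-approximation consistency condition at issue, sum_j (m_j/rho_j) W(x_i - x_j, h) = 1, can be checked numerically. The sketch below uses a 1D cubic-spline kernel on a uniform particle line and shows the error shrinking as more neighbours fall inside the kernel support, echoing the N -> infinity behaviour described above; the kernel choice, grid, and support ratios are illustrative assumptions:

```python
# Sketch: numerical check of the zeroth-order (partition of unity)
# particle-approximation consistency condition for a 1D cubic spline.
import numpy as np

def cubic_spline_1d(q):
    """Unnormalized 1D cubic spline; the 2/(3h) normalization is
    applied by the caller."""
    q = np.abs(q)
    return np.where(q < 1, 1 - 1.5 * q**2 + 0.75 * q**3,
           np.where(q < 2, 0.25 * (2 - q) ** 3, 0.0))

def partition_of_unity_error(support_ratio, n=400):
    x = np.linspace(0.0, 1.0, n)
    dx = x[1] - x[0]
    h = support_ratio * dx                 # more neighbours inside the support
    W = cubic_spline_1d((x[:, None] - x[None, :]) / h) / (1.5 * h)
    sums = (W * dx).sum(axis=1)            # m_j / rho_j = dx on a uniform line
    interior = slice(n // 4, 3 * n // 4)   # ignore kernel-deficient boundaries
    return np.abs(sums[interior] - 1.0).max()

errors = [partition_of_unity_error(r) for r in (1.5, 2.5, 3.5)]
print(errors)
```

The decreasing errors illustrate, in the crudest possible setting, the paper's point that particle-approximation consistency is governed by the number of particles within the kernel support rather than by h alone.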
Island of Stability for Consistent Deformations of Einstein's Gravity
Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan;
2012-01-01
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...
Consistency of Trend Break Point Estimator with Underspecified Break Number
Jingjing Yang
2017-01-01
This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
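The single-break trend estimator whose behaviour the paper studies is just a least-squares grid search over candidate break dates. A minimal sketch on a correctly specified one-break process (the data-generating parameters are invented; demonstrating the paper's inconsistency result would require a two-break process and is not attempted here):

```python
# Sketch: estimating a single trend-break date by minimizing the sum
# of squared residuals over candidate break points.
import numpy as np

rng = np.random.default_rng(2)
T, true_break = 200, 120
t = np.arange(T)
# linear trend with a slope shift of 0.15 at t = 120, plus noise
y = 0.05 * t + 0.15 * np.maximum(t - true_break, 0) + rng.normal(0, 0.5, T)

def ssr_at(k):
    """SSR from regressing y on [1, t, (t - k)^+] with break at k."""
    X = np.column_stack([np.ones(T), t, np.maximum(t - k, 0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

candidates = range(10, T - 10)             # trim the sample ends
k_hat = min(candidates, key=ssr_at)
print("estimated break:", k_hat)
```

With two breaks in the data but only one break term in the regression, the paper shows this same grid-search estimator need not converge to either true break date.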
Consistency in experiments on multistable driven delay systems
Oliver, Neus; Larger, Laurent; Fischer, Ingo
2016-10-01
We investigate the consistency properties in the responses of a nonlinear delay optoelectronic intensity oscillator subject to different drives, in particular, harmonic and self-generated waveforms. This system, an implementation of the Ikeda oscillator, is operating in a closed-loop configuration, exhibiting its autonomous dynamics while the drive signals are additionally introduced. Applying the same drive multiple times, we compare the dynamical responses of the optoelectronic oscillator and quantify the degree of consistency among them via their correlation. Our results show that consistency is not restricted to conditions close to the first Hopf bifurcation but can be found in a broad range of dynamical regimes, even in the presence of multistability. Finally, we discuss the dependence of consistency on the nature of the drive signal.
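Consistency in this sense is measured by correlating the responses of the system to repeated presentations of the same drive. The sketch below does this for a noisy delayed-feedback map driven twice by the same harmonic waveform; the map is a toy stand-in chosen to sit in a contracting (consistent) regime, not a model of the optoelectronic experiment, and all parameters are invented:

```python
# Sketch: quantifying consistency as the correlation between two
# responses of a noisy nonlinear delay map to an identical drive.
import numpy as np

rng = np.random.default_rng(3)
n, delay = 5000, 20
drive = 0.8 * np.sin(2 * np.pi * np.arange(n) / 97)   # harmonic drive

def response(noise_scale=0.02):
    x = np.zeros(n)
    for t in range(delay, n):
        # weak delayed feedback + drive + private noise (kept contracting)
        x[t] = (0.6 * np.sin(x[t - delay] + drive[t]) ** 2
                - 0.3 * x[t - 1]
                + noise_scale * rng.normal())
    return x[1000:]                                    # discard transients

r1, r2 = response(), response()                        # same drive, new noise
consistency = np.corrcoef(r1, r2)[0, 1]
print(f"consistency correlation: {consistency:.3f}")
```

In this weak-feedback regime the drive dominates the private noise, so repeated responses stay highly correlated; in strongly chaotic regimes the same measure can collapse, which is the kind of dependence on the dynamical regime the abstract explores.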
Energy-Consistent Multiscale Algorithms for Granular Flows
2014-08-07
AFOSR Young Investigator Program (YIP) final report, covering 01-MAY-2011 to 30-APR-2014. The project developed multiscale, energy-consistent algorithms to simulate and capture flow phenomena in granular flows.
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Hong Li; Haifei Zhuang; Weihao Geng
2012-01-01
The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with the functions of fluidizing the medium consistency pulp and separating gas from the liquid. The structural dimensions of the generator affect the hydraulic performance; the radius and the blade laying angle are two important ones. Starting with research on the flow inside the generator and the shearing characteristics of the MC pulp, a simple mathematical model at the flow section of the sh...
Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images
Li, Gang; Nie, Jingxin; Shen, Dinggang
2011-01-01
Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying subtle morphological changes of the cerebral cortex. This paper presents a new deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstructed ...
Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images
Li, Gang; Nie, Jingxin; Wu, Guorong; Wang, Yaping; Shen, Dinggang
2011-01-01
Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying longitudinal subtle change of the cerebral cortex. This paper presents a novel deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal brain MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstr...
On the consistency of coset space dimensional reduction
Chatzistavrakidis, A. [Institute of Nuclear Physics, NCSR DEMOKRITOS, GR-15310 Athens (Greece); Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: cthan@mail.ntua.gr; Manousselis, P. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece); Department of Engineering Sciences, University of Patras, GR-26110 Patras (Greece)], E-mail: pman@central.ntua.gr; Prezas, N. [CERN PH-TH, 1211 Geneva (Switzerland)], E-mail: nikolaos.prezas@cern.ch; Zoupanos, G. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: george.zoupanos@cern.ch
2007-11-15
In this Letter we consider higher-dimensional Yang-Mills theories and examine their consistent coset space dimensional reduction. Utilizing a suitable ansatz and imposing a simple set of constraints we determine the four-dimensional gauge theory obtained from the reduction of both the higher-dimensional Lagrangian and the corresponding equations of motion. The two reductions yield equivalent results and hence they constitute an example of a consistent truncation.
Truncations driven by constraints: consistency and conditions for correct upliftings
Pons, J M; Pons, Josep M.; Talavera, Pere
2004-01-01
We discuss the mechanism of truncations driven by the imposition of constraints. We show how the consistency of such truncations is controlled, and give general theorems that establish conditions for the correct uplifting of solutions. We show in some particular examples how one can get correct upliftings from 7d supergravities to 10d type IIB supergravity, even in cases where the truncation is not initially consistent on its own.
S Matrix Proof of Consistency Condition Derived from Mixed Anomaly
Bhansali, Vineer
For a confining quantum field theory with conserved current J and stress tensor T, the ⟨JJJ⟩ and ⟨TJJ⟩ anomalies computed in terms of elementary quanta must be precisely equal to the same anomalies computed in terms of the exact physical spectrum if the conservation law corresponding to J is unbroken. These conditions strongly constrain the allowed representations of the low energy spectrum. We present a proof of the latter consistency condition based on the proof by Coleman and Grossman of the former consistency condition.
Consistent histories, quantum truth functionals, and hidden variables
Griffiths, Robert B.
2000-01-01
A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of `truth functionals' defined on a Boolean algebra of classical or quantum properties.
Consistent histories, quantum truth functionals, and hidden variables
Griffiths, R B
1999-01-01
A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of ``truth functionals'' defined on a Boolean algebra of classical or quantum properties.
TWO APPROACHES TO IMPROVING THE CONSISTENCY OF COMPLEMENTARY JUDGEMENT MATRIX
Xu Zeshui
2002-01-01
By the transformation relations between a complementary judgement matrix and a reciprocal judgement matrix, this paper proposes two methods for improving the consistency of a complementary judgement matrix and gives two simple, practical iterative algorithms. These two algorithms are easy to implement on a computer, and the modified complementary judgement matrices retain most of the information that the original matrix contains. Thus the methods supplement and develop the theory and methodology for improving the consistency of complementary judgement matrices.
Autonomous Navigation with Constrained Consistency for C-Ranger
Shujing Zhang
2014-06-01
Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their synthetic ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to have truly autonomous navigation. However, the consistency problem of the SLAM system has been greatly ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which was developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and in sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
Measuring consistency of autobiographical memory recall in depression.
Semkovska, Maria
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.
Zhang, Rui; Noels, Kimberly A.; Lalonde, Richard N.; Salas, S. J.
2017-01-01
Prior research differentiates dialectical (e.g., East Asian) from non-dialectical cultures (e.g., North American and Latino) and attributes cultural differences in self-concept consistency to naïve dialecticism. In this research, we explored the effects of managing two cultural identities on consistency within the bicultural self-concept via the role of dialectical beliefs. Because the challenge of integrating more than one culture within the self is common to biculturals of various heritage backgrounds, the effects of bicultural identity integration should not depend on whether the heritage culture is dialectical or not. In four studies across diverse groups of bicultural Canadians, we showed that having an integrated bicultural identity was associated with being more consistent across roles (Studies 1–3) and making less ambiguous self-evaluations (Study 4). Furthermore, dialectical self-beliefs mediated the effect of bicultural identity integration on self-consistency (Studies 2–4). Finally, Latino biculturals reported being more consistent across roles than did East Asian biculturals (Study 2), revealing the ethnic heritage difference between the two groups. We conclude that both the content of heritage culture and the process of integrating cultural identities influence the extent of self-consistency among biculturals. Thus, consistency within the bicultural self-concept can be understood, in part, to be a unique psychological product of bicultural experience. PMID:28326052
Does object view influence the scene consistency effect?
Sastyin, Gergo; Niimi, Ryosuke; Yokosawa, Kazuhiko
2015-04-01
Traditional research on the scene consistency effect only used clearly recognizable object stimuli to show mutually interactive context effects for both the object and background components on scene perception (Davenport & Potter in Psychological Science, 15, 559-564, 2004). However, in real environments, objects are viewed from multiple viewpoints, including an accidental, hard-to-recognize one. When the observers named target objects in scenes (Experiments 1a and 1b, object recognition task), we replicated the scene consistency effect (i.e., there was higher accuracy for the objects with consistent backgrounds). However, there was a significant interaction effect between consistency and object viewpoint, which indicated that the scene consistency effect was more important for identifying objects in the accidental view condition than in the canonical view condition. Therefore, the object recognition system may rely more on the scene context when the object is difficult to recognize. In Experiment 2, the observers identified the background (background recognition task) while the scene consistency and object views were manipulated. The results showed that object viewpoint had no effect, while the scene consistency effect was observed. More specifically, the canonical and accidental views both equally provided contextual information for scene perception. These findings suggested that the mechanism for conscious recognition of objects could be dissociated from the mechanism for visual analysis of object images that were part of a scene. The "context" that the object images provided may have been derived from its view-invariant, relatively low-level visual features (e.g., color), rather than its semantic information.
Inter-laboratory consistency of gait analysis measurements.
Benedetti, M G; Merlo, A; Leardini, A
2013-09-01
The dissemination of gait analysis as a clinical assessment tool requires the results to be consistent, irrespective of the laboratory. In this work a baseline assessment of between-site consistency for one healthy subject examined at 7 different laboratories is presented. Anthropometric and spatio-temporal parameters, pelvis and lower limb joint rotations, joint sagittal moments and powers, and ground reaction forces were compared. The consistency between laboratories was assessed by the median absolute deviation and maximum difference for single parameters, and by linear regression for curves. Twenty-one lab-to-lab comparisons were performed and averaged. Large differences were found between the characteristics of the laboratories (i.e. motion capture systems and protocols). Different values for the anthropometric parameters were found, with the largest variability for a pelvis measurement. The spatio-temporal parameters were in general consistent. Segment and joint kinematics consistency was in general high (R2>0.90), except for hip and knee joint rotations. The main difference among curves was a vertical shift associated with the corresponding value in the static position. The consistency between joint sagittal moments ranged from R2=0.90 at the ankle to R2=0.66 at the hip; the latter increased when laboratories using the same protocol were compared separately. Pattern similarity was good for ankle power but not satisfactory for knee and hip power. The ground reaction force was found to be the most consistent, as expected. The differences found were in general lower than the established minimum detectable changes for gait kinematics and kinetics in healthy adults.
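The two consistency measures named in this abstract, median absolute deviation for single parameters and R2 of a lab-to-lab linear regression for curves, can be sketched as follows (illustrative helper functions, not the study's code):

```python
import numpy as np

def median_absolute_deviation(values):
    # spread of a single gait parameter across laboratories
    v = np.asarray(values, dtype=float)
    return np.median(np.abs(v - np.median(v)))

def curve_r2(curve_a, curve_b):
    # R^2 of regressing one lab's gait curve on another's, sampled
    # at the same points of the gait cycle
    a, b = np.asarray(curve_a, dtype=float), np.asarray(curve_b, dtype=float)
    r = np.corrcoef(a, b)[0, 1]
    return r ** 2
```

A purely vertical shift between two curves leaves `curve_r2` at 1.0, which is why the study reports such shifts separately from the regression fit.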
Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa;
2001-01-01
Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown...
Martial arts striking hand peak acceleration, accuracy and consistency.
Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A
2013-01-01
The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
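The accuracy and consistency estimates defined in this abstract (radial distance of the strike centroid from the target, and RMS distance of strikes from their centroid) amount to the following sketch; it is a minimal illustration, not the authors' analysis code, and coordinates are in arbitrary units:

```python
import numpy as np

def accuracy_and_consistency(strike_points, target):
    # accuracy: radial distance between the centroid of the strikes
    # and the target; consistency: RMS distance of the strikes from
    # their own centroid (smaller is better for both)
    pts = np.asarray(strike_points, dtype=float)
    centroid = pts.mean(axis=0)
    accuracy = np.linalg.norm(centroid - np.asarray(target, dtype=float))
    consistency = np.sqrt(((pts - centroid) ** 2).sum(axis=1).mean())
    return accuracy, consistency
```

Note that the two measures are independent: strikes scattered symmetrically around the target are perfectly accurate yet can be arbitrarily inconsistent.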
Analysis of Consistency of Printing Blankets using Correlation Technique
Lalitha Jayaraman
2010-01-01
Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi [Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics, Stanford University, Stanford, CA 94305 (United States); Busha, Michael T. [Institute for Theoretical Physics, University of Zurich, CH-8006 Zurich (Switzerland); Klypin, Anatoly A. [Astronomy Department, New Mexico State University, Las Cruces, NM 88003 (United States); Primack, Joel R., E-mail: behroozi@stanford.edu, E-mail: rwechsler@stanford.edu [Department of Physics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States)
2013-01-20
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
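A toy version of the consistency test this abstract describes, predicting halo positions across snapshots from their velocities and flagging implausible progenitor links, might look like the sketch below. This is schematic only; the published algorithm (available at the URL above) also enforces consistency of masses and velocities:

```python
import numpy as np

def flag_inconsistent_links(pos0, vel0, pos1, dt, tol):
    # Predict each halo's position at the next snapshot from its
    # current position and velocity, then flag progenitor links whose
    # actual position deviates from the prediction by more than tol.
    # pos0, vel0, pos1: (N, 3) arrays; dt and tol in matching units.
    predicted = pos0 + vel0 * dt
    err = np.linalg.norm(pos1 - predicted, axis=1)
    return err > tol
```

Flagged links are candidates for repair, e.g. inserting an otherwise missing halo between the snapshots or removing a spurious detection, as the abstract outlines.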
Self-consistent generalized Langevin equation for colloidal mixtures.
Chávez-Rojo, Marco Antonio; Medina-Noyola, Magdaleno
2005-09-01
A self-consistent theory of collective and tracer diffusion in colloidal mixtures is presented. This theory is based on exact results for the partial intermediate scattering functions derived within the framework of the generalized Langevin equation formalism, plus a number of conceptually simple and sensible approximations. The first of these consists of a Vineyard-like approximation between collective and tracer diffusion, which writes the collective dynamics in terms of the memory function related to tracer diffusion. The second consists of interpolating this only unknown memory function between its two exact limits at small and large wave vectors; for this, a phenomenologically determined, but not arbitrary, interpolating function is introduced: a Lorentzian with its inflection point located at the first minimum of the partial static structure factor. The small wave-vector exact limit involves a time-dependent friction function, for which we take a general approximate result, previously derived within the generalized Langevin equation formalism. This general result expresses the time-dependent friction function in terms of the partial intermediate scattering functions, thus closing the system of equations into a fully self-consistent scheme. This extends to mixtures a recently proposed self-consistent theory developed for monodisperse suspensions [Yeomans-Reyna and Medina-Noyola, Phys. Rev. E 64, 066114 (2001)]. As an illustration of its quantitative accuracy, its application to a simple model of a binary dispersion in the absence of hydrodynamic interactions is reported.
Pulsed laser photoacoustic monitoring of paper pulp consistency
Zhao, Zuomin; Törmänen, Matti; Myllylä, Risto
2008-06-01
This study involves measurements of pulp consistency in a cuvette and by an online apparatus, using an innovative scattering photoacoustic (SPA) method. The theoretical aspects are described first. Then, a few kinds of wood fiber suspensions with consistencies from 0.5% to 5% were studied in the cuvette. After that, a pilot online apparatus was built to measure suspensions with fiber consistency lower than 1% and filler content up to 3%. The results showed that although there were many fiber flocs in the cuvette, which strongly affected the measurement accuracy of the sample consistencies, the apparatus can sense fiber types with different optical and acoustic properties. The measurement accuracy can be greatly improved in the online apparatus by pumping suspension fluids through a circulating system to improve the suspension homogeneity. The results demonstrated that wood fibers cause larger attenuation of acoustic waves but fillers do not; on the other hand, fillers cause stronger scattering of incident light. Therefore, our SPA apparatus has the potential to simultaneously determine fiber and filler fractions in pulp suspensions with consistency up to 5%.
Consistency of Scalar Potentials from Quantum de Sitter Space
Espinosa, José R; Trépanier, Maxime
2015-01-01
We derive constraints on the scalar potential of a quantum field theory in de Sitter space. The constraints, which we argue should be understood as consistency conditions for quantum field theories in dS space, originate from a consistent interpretation of quantum de Sitter space through its Coleman-De Luccia tunneling rate. Indeed, consistency of de Sitter space as a quantum theory of gravity with a finite number of degrees of freedom suggests the tunneling rates to vacua with negative cosmological constants be interpreted as Poincaré recurrences. Demanding the tunneling rate to be a Poincaré recurrence imposes two constraints, or consistency conditions, on the scalar potential. Although the exact consistency conditions depend on the shape of the scalar potential, generically they correspond to: the distance in field space between the de Sitter vacuum and any other vacuum with negative cosmological constant must be of the order of the reduced Planck mass or larger; and the fourth root of the vacuum energ...
Analysis of Consistency of Printing Blankets using Correlation Technique
Balaraman Kumar
2010-06-01
Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.
The internal consistency and validity of the Self-assessment Parkinson's Disease Disability Scale.
Biemans, M.A.J.E.; Dekker, J.; Woude, L.H.V. van der
2001-01-01
OBJECTIVE: To test the consistency and validity of the Self-assessment Parkinson's Disease Disability Scale in patients with Parkinson's disease living at home. DESIGN: Patients with Parkinson's disease responded to a set of questionnaires. In addition, an observation of the performance of daily
Classification Consistency and Accuracy for Complex Assessments Using Item Response Theory
Lee, Won-Chan
2010-01-01
In this article, procedures are described for estimating single-administration classification consistency and accuracy indices for complex assessments using item response theory (IRT). This IRT approach was applied to real test data comprising dichotomous and polytomous items. Several different IRT model combinations were considered. Comparisons…
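A single-administration classification consistency index of the general kind discussed in this article can be illustrated under a Rasch model: compute the examinee's raw-score distribution from the item response probabilities, then the probability that two independent administrations yield the same pass/fail decision. This is a simplified sketch, not the article's procedure (which handles polytomous items and several IRT model combinations):

```python
import math

def rasch_p(theta, b):
    # probability of a correct response under the Rasch model
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def classification_consistency(theta, difficulties, cut):
    # probability that an examinee with ability theta receives the
    # same pass/fail classification (raw score >= cut) on two
    # independent administrations of the same items
    ps = [rasch_p(theta, b) for b in difficulties]
    # raw-score distribution via dynamic programming
    dist = [1.0]
    for p in ps:
        new = [0.0] * (len(dist) + 1)
        for s, pr in enumerate(dist):
            new[s] += pr * (1 - p)
            new[s + 1] += pr * p
        dist = new
    p_pass = sum(dist[cut:])
    return p_pass ** 2 + (1 - p_pass) ** 2
```

Consistency is lowest (0.5) for examinees whose pass probability is exactly 0.5, i.e. those located at the cut score, and approaches 1.0 far from it.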
Leinonen, Risto; Asikainen, Mervi A.; Hirvonen, Pekka E.
2015-01-01
This study concentrates on evaluating the consistency of upper-division students' use of the second law of thermodynamics at macroscopic and microscopic levels. Data were collected by means of a paper and pencil test (N = 48) focusing on the macroscopic and microscopic features of the second law concerned with heat transfer processes. The data…
Remvig, L; Duhn, P H; Ullman, S
2009-01-01
methods. METHODS: Six EDS, 11 BJHS, and 19 controls completed the trial. We analysed the overall inter-examiner agreement on clinical tests for skin extensibility and consistency, in addition to analyses on suction cup (SC) and soft tissue stiffness meter (STSM) methods. RESULTS: Overall agreement...
L.S. Ferreira
2016-02-01
Full Text Available Proton radioactivity from deformed nuclei is described for the first time by a self-consistent calculation based on covariant relativistic density functionals derived from meson exchange and point coupling models. The calculation provides an important new test to these interactions at the limits of stability, since the mixing of different angular momenta in the single particle wave functions is probed.
The Hierarchy Consistency Index: Evaluating Person Fit for Cognitive Diagnostic Assessment
Cui, Ying; Leighton, Jacqueline P.
2009-01-01
In this article, we introduce a person-fit statistic called the hierarchy consistency index (HCI) to help detect misfitting item response vectors for tests developed and analyzed based on a cognitive model. The HCI ranges from -1.0 to 1.0, with values close to -1.0 indicating that students respond unexpectedly or differently from the responses…
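A simplified reading of the index, assuming a misfit means answering a dependent item correctly while missing one of its prerequisites, can be sketched as follows. The published HCI is defined over all item pairs implied by the attribute hierarchy; this toy version only checks explicitly listed prerequisite pairs:

```python
def hierarchy_consistency_index(responses, prerequisites):
    # responses: dict item -> 0/1; prerequisites: list of (a, b) pairs
    # meaning item a is prerequisite to item b. A misfit occurs when a
    # dependent item is correct while its prerequisite is missed.
    # HCI = 1 - 2 * misfits / comparisons, so it lies in [-1, 1].
    misfits = 0
    comparisons = 0
    for a, b in prerequisites:
        if responses[b] == 1:          # dependent item answered correctly
            comparisons += 1
            if responses[a] == 0:      # prerequisite item missed
                misfits += 1
    if comparisons == 0:
        return 1.0
    return 1 - 2 * misfits / comparisons
```

As in the article, a value near -1.0 signals a response vector that contradicts the hypothesized cognitive model.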
Ferreira, L.S., E-mail: flidia@tecnico.ulisboa.pt [Center of Physics and Engineering of Advanced Materials, CeFEMA, and Departamento de Física, Instituto Superior Técnico, Universidade de Lisboa, Avenida Rovisco Pais, P1049-001 Lisbon (Portugal); Maglione, E. [Dipartimento di Fisica e Astronomia “G. Galilei”, Via Marzolo 8, I-35131 Padova (Italy); Istituto Nazionale di Fisica Nucleare, Padova (Italy); Ring, P. [Physik Department der Technischen Universität München, D-85748 Garching (Germany)
2016-02-10
Proton radioactivity from deformed nuclei is described for the first time by a self-consistent calculation based on covariant relativistic density functionals derived from meson exchange and point coupling models. The calculation provides an important new test to these interactions at the limits of stability, since the mixing of different angular momenta in the single particle wave functions is probed.
Time perspective and attitude-behaviour consistency in future-oriented behaviours
Rabinovich, Anna; Morton, Thomas; Postmes, Tom
2010-01-01
The authors propose that the salience of a distant-future time perspective, compared to a near-future time perspective, should increase attitude-behaviour and attitude-intention consistency for future-oriented behaviours. To test this prediction, time perspective was experimentally manipulated in th
Numerical investigation of degas performance on impeller of medium-consistency pump
Hong Li
2015-12-01
Full Text Available Medium-consistency technology is known as a process with high efficiency and low pollution. The gas distribution was simulated in a medium-consistency pump with different degas hole positions. The rheological behaviors of the pulp suspension were obtained by experimental test. A modified Herschel–Bulkley model and the Eulerian gas–liquid two-phase flow model were utilized to approximately represent the behaviors of the medium-consistency pulp suspension. The results show that when the relative position is 0.53, the gas volume ratio is less than 0.1% at the pump outlet and 9.8% at the vacuum inlet, and the pump head is at its maximum. Because of the different numbers of impeller blades and turbulence blades and the asymmetric volute structure, the gas is distributed unevenly in the impeller. In addition, the pump performance was measured experimentally, and the results were used to validate the computational fluid dynamics outcomes.
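The Herschel–Bulkley constitutive law mentioned in this abstract relates shear stress to shear rate through a yield stress tau_y, a consistency coefficient K, and a flow index n. A one-line sketch with illustrative parameter values (the paper's modified model and fitted constants are not reproduced here):

```python
def herschel_bulkley_stress(gamma_dot, tau_y, K, n):
    # shear stress of a Herschel-Bulkley fluid above the yield point:
    # tau = tau_y + K * gamma_dot**n  (gamma_dot > 0 assumed;
    # below tau_y the material does not flow)
    return tau_y + K * gamma_dot ** n
```

With n < 1 the model captures the shear-thinning behavior typical of pulp suspensions, which is why a (modified) form of it is used for the medium-consistency pulp in the simulation.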
A Dynamical Mechanism for Large Volumes with Consistent Couplings
Abel, Steven
2016-01-01
A mechanism for addressing the 'decompactification problem' is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because th...
Consistent group selection in high-dimensional linear regression
Wei, Fengrong; 10.3150/10-BEJ252
2010-01-01
In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selects a model whose dimension is comparable with the underlying model with high probability and is estimation consistent. However, the group Lasso is, in general, not selection consistent and also tends to select groups that are not important in the model. To improve the selection results, we propose an adaptive group Lasso method which is a generalization of the adaptive Lasso and requires an initial estimator. We show that the adaptive group Lasso is consistent in group selection under certain conditions if the group Lasso is used as the initial estimator.
Lightness constancy through transparency: internal consistency in layered surface representations.
Singh, Manish
2004-01-01
Asymmetric lightness matching was employed to measure how the visual system assigns lightness to surface patches seen through partially-transmissive surfaces. Observers adjusted the luminance of a comparison patch seen through transparency, in order to match the lightness of a standard patch seen in plain view. Plots of matched-to-standard luminance were linear, and their slopes were consistent with Metelli's alpha. A control experiment confirmed that these matches were indeed transparency based. Consistent with recent results, however, when observers directly matched the transmittance of transparent surfaces, their matches deviated strongly and systematically from Metelli's alpha. Although the two sets of results appear to be contradictory, formal analysis reveals a deeper mutual consistency in the representation of the two layers. A ratio-of-contrasts model is shown to explain both the success of Metelli's model in predicting lightness through transparency, and its failure to predict perceived transmittance--and hence is seen to play the primary role in perceptual transparency.
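Metelli's episcotister model underlying this analysis can be sketched directly: a filter with transmittance alpha and additive component t maps a luminance a to alpha*a + (1 - alpha)*t, and alpha is recoverable as a ratio of luminance differences seen through and outside the filter. The numeric values below are illustrative, not the study's stimuli:

```python
def luminance_through(a, alpha, t):
    # forward model: luminance of a surface of luminance a viewed
    # through a filter with transmittance alpha and additive term t
    return alpha * a + (1 - alpha) * t

def metelli_alpha(a, b, p, q):
    # invert the model from two surfaces: p = alpha*a + (1-alpha)*t
    # and q = alpha*b + (1-alpha)*t, so the unknown t cancels and
    # alpha = (p - q) / (a - b), a ratio of luminance differences
    return (p - q) / (a - b)
```

This difference-ratio form is why, as the abstract notes, lightness matches through transparency can agree with Metelli's alpha even when direct transmittance matches do not.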
A Consistent Semantics of Self-Adjusting Computation
Acar, Umut A; Donham, Jacob
2011-01-01
This paper presents a semantics of self-adjusting computation and proves that the semantics is correct and consistent. The semantics integrates change propagation with the classic idea of memoization to enable reuse of computations under mutation to memory. During evaluation, reuse of a computation via memoization triggers a change propagation that adjusts the reused computation to reflect the mutated memory. Since the semantics integrates memoization and change propagation, it involves both non-determinism (due to memoization) and mutation (due to change propagation). Our consistency theorem states that the non-determinism is not harmful: any two evaluations of the same program starting at the same state yield the same result. Our correctness theorem states that mutation is not harmful: self-adjusting programs are consistent with purely functional programming. We formalize the semantics and its meta-theory in the LF logical framework and machine-check our proofs using Twelf.
Consistency and Reconciliation Model In Regional Development Planning
Dina Suryawati
2016-10-01
Full Text Available The aim of this study was to identify the problems in, and determine a conceptual model of, regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models that were successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves both a technocratic (top-down) and a participatory (bottom-up) system. The two must be balanced, and should neither overlap nor dominate each other. Keywords: regional, development, planning, consistency, reconciliation
Self-consistent modelling of resonant tunnelling structures
Fiig, T.; Jauho, A.P.
1992-01-01
We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated ... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges ...
Model-Consistent Sparse Estimation through the Bootstrap
Bach, Francis
2009-01-01
We consider the least-square linear regression problem with regularization by the $\\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
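The Bolasso procedure described in the abstract (run the Lasso on several bootstrap replications, then intersect the estimated supports) can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the minimal ISTA solver, the toy data, and the regularization level are all our own assumptions.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize (1/2n)||y - Xw||^2 + lam*||w||_1 by proximal gradient (ISTA)."""
    n, p = X.shape
    w = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        z = w - grad / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return w

def bolasso_support(X, y, lam, n_boot=32, seed=0):
    """Intersect Lasso supports over bootstrap replications (the Bolasso idea)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    support = np.ones(X.shape[1], dtype=bool)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)       # bootstrap resample of the rows
        w = lasso_ista(X[idx], y[idx], lam)
        support &= np.abs(w) > 1e-6            # keep only variables selected every time
    return support

# toy data: only the first two of five predictors are relevant
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.standard_normal(200)
print(bolasso_support(X, y, lam=0.1))
```

Intersecting supports suppresses the irrelevant variables that, per the abstract, each enter a single Lasso run with strictly positive probability.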
Consistency of the group Lasso and multiple kernel learning
Bach, Francis
2007-01-01
We consider the least-square regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm where all spaces have dimension one, where it is commonly referred to as the Lasso. In this paper, we study the asymptotic model consistency of the group Lasso. We derive necessary and sufficient conditions for the consistency of group Lasso under practical assumptions, such as model misspecification. When the linear predictors and Euclidean norms are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for nonlinear variable selection. Using tools from functional analysis, and in particular covariance operators, we extend the consistency results to this infinite dimensional case and also propose an adaptive scheme to obtain...
The consistent histories approach to loop quantum cosmology
Craig, David A
2016-01-01
We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on "measurements" to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories -- as determined by the system's decoherence functional -- that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with ...
One-particle-irreducible consistency relations for cosmological perturbations
Goldberger, Walter D; Nicolis, Alberto
2013-01-01
We derive consistency relations for correlators of scalar cosmological perturbations which hold in the "squeezed limit" in which one or more of the external momenta become soft. Our results are formulated as relations between suitably defined one-particle irreducible N-point and (N-1)-point functions that follow from residual spatial conformal diffeomorphisms of the unitary gauge Lagrangian. As such, some of these relations are exact to all orders in perturbation theory, and do not rely on approximate deSitter invariance or other dynamical assumptions (e.g., properties of the operator product expansion or the behavior of modes at horizon crossing). The consistency relations apply model-independently to cosmological scenarios where the time evolution is driven by a single scalar field. Besides reproducing the known results for single-field inflation in the slow roll limit, we verify that our consistency relations hold more generally, for instance in ghost condensate models in flat space. We comment on possible...
Compositional based testing with ioco
van der Bijl, H.M.; Rensink, Arend; Tretmans, G.J.; Petrenko, A.; Ulrich, A.
2004-01-01
Compositional testing concerns the testing of systems that consist of communicating components which can also be tested in isolation. Examples are component based testing and interoperability testing. We show that, with certain restrictions, the ioco-test theory for conformance testing is suitable
Consistent Parameter and Transfer Function Estimation using Context Free Grammars
Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten
2017-04-01
This contribution presents a method for the inference of transfer functions for rainfall-runoff models. Here, transfer functions are defined as parametrized (functional) relationships between a set of spatial predictors (e.g. elevation, slope or soil texture) and model parameters. They are ultimately used for estimation of consistent, spatially distributed model parameters from a limited amount of lumped global parameters. Additionally, they provide a straightforward method for parameter extrapolation from one set of basins to another and can even be used to derive parameterizations for multi-scale models [see: Samaniego et al., 2010]. Yet, an actual knowledge of the transfer functions is currently often implicitly assumed. As a matter of fact, for most cases these hypothesized transfer functions can rarely be measured and often remain unknown. Therefore, this contribution presents a general method for the concurrent estimation of the structure of transfer functions and their respective (global) parameters. Note that, as a consequence, the distributed parameters of the rainfall-runoff model are estimated as well. The method combines two steps to achieve this. The first generates different possible transfer functions. The second then estimates the respective global transfer function parameters. The structural estimation of the transfer functions is based on the context-free grammar concept. Chomsky first introduced context-free grammars in linguistics [Chomsky, 1956]. Since then, they have been widely applied in computer science. But, to the knowledge of the authors, they have so far not been used in hydrology. Therefore, the contribution gives an introduction to context-free grammars and shows how they can be constructed and used for the structural inference of transfer functions. This is enabled by new methods from evolutionary computation, such as grammatical evolution [O'Neill, 2001], which make it possible to exploit the constructed grammar as a
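To make the grammar-based structural search concrete, here is a hedged sketch of a grammatical-evolution-style mapping: an integer genome selects production rules of a toy context-free grammar for transfer functions. The grammar, the genome, and the predictor name `slope` are invented for illustration and are not taken from the paper.

```python
import math

# Toy context-free grammar for transfer functions (illustrative assumption):
# a model parameter is expressed as a function of a spatial predictor "slope".
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<func>", "(", "<var>", ")"], ["<var>"], ["<const>"]],
    "<op>": [["+"], ["*"]],
    "<func>": [["exp"], ["log1p"]],
    "<var>": [["slope"]],
    "<const>": [["0.5"], ["2.0"]],
}

def derive(genome, symbol="<expr>", cursor=None, depth=0):
    """Map an integer genome to an expression: each codon picks one production."""
    if cursor is None:
        cursor = [0]
    if symbol not in GRAMMAR:                # terminal token: emit as-is
        return symbol
    options = GRAMMAR[symbol]
    if depth > 6:                            # avoid runaway recursion: drop self-recursive rules
        options = [o for o in options if symbol not in o] or options
    codon = genome[cursor[0] % len(genome)]  # wrap around the genome if needed
    cursor[0] += 1
    rule = options[codon % len(options)]
    return "".join(derive(genome, t, cursor, depth + 1) for t in rule)

expr = derive([0, 1, 0, 0, 1, 3, 1])         # one hypothetical genome
print(expr)                                  # -> exp(slope)*2.0
f = eval("lambda slope: " + expr, {"exp": math.exp, "log1p": math.log1p})
print(f(0.0))                                # exp(0)*2.0 = 2.0
```

An evolutionary search (as in grammatical evolution) would then mutate and recombine such genomes, scoring each derived transfer function by the resulting rainfall-runoff model fit.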
Multiscale Parameter Regionalization for consistent global water resources modelling
Wanders, Niko; Wood, Eric; Pan, Ming; Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Sutanudjaja, Edwin; van Beek, Rens; Bierkens, Marc F. P.
2017-04-01
Due to an increasing demand for high- and hyper-resolution water resources information, it has become increasingly important to ensure consistency in model simulations across scales. This consistency can be ensured by scale independent parameterization of the land surface processes, even after calibration of the water resource model. Here, we use the Multiscale Parameter Regionalization technique (MPR, Samaniego et al. 2010, WRR) to allow for a novel, spatially consistent, scale independent parameterization of the global water resource model PCR-GLOBWB. The implementation of MPR in PCR-GLOBWB allows for calibration at coarse resolutions and subsequent parameter transfer to the hyper-resolution. In this study, the model was calibrated at 50 km resolution over Europe and validation was carried out at resolutions of 50 km, 10 km and 1 km. MPR allows for a direct transfer of the calibrated transfer function parameters across scales and we find that we can maintain consistent land-atmosphere fluxes across scales. Here we focus on the 2003 European drought and show that the new parameterization allows for high-resolution calibrated simulations of water resources during the drought. For example, we find a reduction from 29% to 9.4% in the percentile difference in the annual evaporative flux across scales when compared against default simulations. Soil moisture errors are reduced from 25% to 6.9%, clearly indicating the benefits of the MPR implementation. This new parameterization allows us to show more spatial detail in water resources simulations that are consistent across scales and also allows validation of discharge for smaller catchments, even with calibration at a coarse 50 km resolution. The implementation of MPR allows for novel high-resolution calibrated simulations of a global water resources model, providing calibrated high-resolution model simulations with transferred parameter sets from coarse resolutions. The applied methodology can be transferred to other
Neighborhood consistency in mental arithmetic: Behavioral and ERP evidence
Verguts Tom
2007-12-01
Background: Recent cognitive and computational models (e.g., the Interacting Neighbors Model) state that in simple multiplication the decade and unit digits of the candidate answers (including the correct result) are represented separately. Thus, these models challenge holistic views of number representation as well as traditional accounts of the classical problem size effect in simple arithmetic (i.e., the finding that large problems are answered more slowly and less accurately than small problems). Empirical data supporting this view are still scarce. Methods: Data of 24 participants who performed a multiplication verification task with Arabic digits (e.g., 8 × 4 = 36: true or false?) are reported. Behavioral (i.e., RT and errors) and EEG (i.e., ERP) measures were recorded in parallel. Results: We provide evidence for neighborhood-consistency effects in the verification of simple multiplication problems (e.g., 8 × 4). Behaviorally, we find that decade-consistent lures, which share their decade digit with the correct result (e.g., 36), are harder to reject than matched inconsistent lures, which differ in both digits from the correct result (e.g., 28). This neighborhood consistency effect in product verification is similar to recent observations in the production of multiplication results. With respect to event-related potentials, we find significant differences for consistent compared to inconsistent lures in the N400 (increased negativity) and Late Positive Component (reduced positivity). In this respect, consistency effects in our paradigm resemble lexico-semantic effects earlier found in simple arithmetic and in orthographic input processing. Conclusion: Our data suggest that neighborhood consistency effects in simple multiplication stem at least partly from central (lexico-semantic) stages of processing. These results are compatible with current models of the representation of simple multiplication facts, in particular with the Interacting Neighbors Model.
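The decade/unit lure classification used in the study above can be sketched in a few lines of code; the function name and the label strings are our own, but the example lures (36 vs. 28 for 8 × 4 = 32) come straight from the abstract.

```python
def classify_lure(operand1, operand2, lure):
    """Classify a candidate answer relative to the correct two-digit product."""
    correct = operand1 * operand2
    if lure == correct:
        return "correct"
    same_decade = lure // 10 == correct // 10   # shared decade digit
    same_unit = lure % 10 == correct % 10       # shared unit digit
    if same_decade:
        return "decade-consistent"
    if same_unit:
        return "unit-consistent"
    return "inconsistent"

# 8 x 4 = 32: lure 36 shares the decade digit 3, lure 28 shares neither digit
print(classify_lure(8, 4, 36))   # decade-consistent
print(classify_lure(8, 4, 28))   # inconsistent
```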
Agent-Based Context Consistency Management in Smart Space Environments
Jih, Wan-Rong; Hsu, Jane Yung-Jen; Chang, Han-Wen
Context-aware systems in smart space environments must be aware of the context of their surroundings and adapt to changes in highly dynamic environments. Data management of contextual information is different from traditional approaches because the contextual information is dynamic, transient, and fallible in nature. Consequently, the capability to detect context inconsistency and maintain consistent contextual information are two key issues for context management. We propose an ontology-based model for representing, deducing, and managing consistent contextual information. In addition, we use ontology reasoning to detect and resolve context inconsistency problems, which will be described in a Smart Alarm Clock scenario.
Consistency among integral measurements of aggregate decay heat power
Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)]
1998-03-01
Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of 232Th, 233U, 235U, 238U and 239Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component and fairly well for the γ component, except for cooling times longer than 4000 s. (author)
The consistency service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2011-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
The Consistency Service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2010-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures, is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
Towards consistent nuclear models and comprehensive nuclear data evaluations
Bouland, O [Los Alamos National Laboratory]; Hale, G M [Los Alamos National Laboratory]; Lynn, J E [Los Alamos National Laboratory]; Talou, P [Los Alamos National Laboratory]; Bernard, D [FRANCE]; Litaize, O [FRANCE]; Noguere, G [FRANCE]; De Saint Jean, C [FRANCE]; Serot, O [FRANCE]
2010-01-01
The essence of this paper is to highlight the consistency achieved nowadays in nuclear data and uncertainty assessments in terms of compound nucleus reaction theory, from the neutron separation energy to the continuum. By making continuous the theories used in the resolved resonance (R-matrix theory), unresolved resonance (average R-matrix theory) and continuum (optical model) ranges through the generalization of the so-called SPRT method, consistent average parameters are extracted from observed measurements and associated covariances are therefore calculated over the whole energy range. This paper recalls, in particular, recent advances on fission cross section calculations and suggests some hints for future developments.
Standard Model Vacuum Stability and Weyl Consistency Conditions
Antipin, Oleg; Gillioz, Marc; Krog, Jens;
2013-01-01
At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions.
Remark on the Consistent Gauge Anomaly in Supersymmetric Theories
Ohshima, Yoshihisa; Okuyama, Kiyoshi; Suzuki, Hiroshi; Yasuta, Hirofumi
1999-01-01
We present a direct field theoretical calculation of the consistent gauge anomaly in the superfield formalism, on the basis of a definition of the effective action through the covariant gauge current. The scheme is conceptually and technically simple and the gauge covariance in intermediate steps reduces calculational labor considerably. The resultant superfield anomaly, being proportional to the anomaly coefficient $d^{abc}=\operatorname{tr} T^a\{T^b,T^c\}$, is minimal even without supplementing any counterterms. Our anomaly coincides with the anomaly obtained by Marinković as the solution of the Wess-Zumino consistency condition.
A Van Atta reflector consisting of half-wave dipoles
Appel-Hansen, Jørgen
1966-01-01
The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern is first deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...
HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES
Chunhui Zhang; Menghua Qin
2004-01-01
The mechanical and chemical effect on the pulping properties of the old newsprint was studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and the flotation conditions such as velocity of air flow, air pressure and flotation time were also discussed with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide and deinking agent are the key factors in the chemical effect, and pulping consistency is more important than pulping time and rotation speed in the mechanical effect during the high consistency pulping of the ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.
Island of Stability for Consistent Deformations of Einstein's Gravity
Berkhahn, Felix; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin
2011-01-01
We construct explicitly deformations of Einstein's theory of gravity that are consistent and phenomenologically viable since they respect, in particular, cosmological backgrounds. We show that these deformations have unique symmetries in accordance with unitarity requirements, and give rise to a curvature induced self-stabilizing mechanism. As a consequence, any nonlinear completed deformation must incorporate self-stabilization on generic spacetimes already at lowest order in perturbation theory. Furthermore, our findings include the possibility of consistent and phenomenologically viable deformations of general relativity that are solely operative on curved spacetime geometries, reducing to Einstein's theory on the Minkowski background.
Quantum monadology: a consistent world model for consciousness and physics.
Nakagomi, Teruaki
2003-04-01
The NL world model presented in the previous paper is embodied by use of relativistic quantum mechanics, which reveals the significance of the reduction of quantum states and the relativity principle, and locates consciousness and the concept of flowing time consistently in physics. This model provides a consistent framework to solve apparent incompatibilities between consciousness (as our interior experience) and matter (as described by quantum mechanics and relativity theory). Does matter have an inside? What is the flowing time now? Does physics allow the indeterminism by volition? The problem of quantum measurement is also resolved in this model.
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference.
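The resampling scheme above (draw whole clusters with replacement, then re-estimate) can be sketched as follows. This is a minimal illustration using a clustered mean rather than a full GEE fit, and the toy longitudinal data are our own assumption.

```python
import numpy as np

def cluster_bootstrap(clusters, estimator, n_boot=1000, seed=0):
    """Resample whole clusters (with replacement) and re-estimate each time."""
    rng = np.random.default_rng(seed)
    k = len(clusters)
    stats = []
    for _ in range(n_boot):
        # draw k clusters with replacement, keeping each cluster intact
        sample = [clusters[i] for i in rng.integers(0, k, size=k)]
        stats.append(estimator(np.concatenate(sample)))
    return np.array(stats)

# toy longitudinal data: 30 subjects, 5 correlated observations each
rng = np.random.default_rng(42)
clusters = [rng.normal(1.0, 1.0) + rng.normal(0.0, 0.3, size=5) for _ in range(30)]

boots = cluster_bootstrap(clusters, np.mean)
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"mean = {np.mean(np.concatenate(clusters)):.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

Resampling individual observations instead would ignore the within-subject correlation and understate the interval width, which is exactly what the cluster bootstrap is designed to avoid.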
Consistent Deformed Bosonic Algebra in Noncommutative Quantum Mechanics
Zhang, Jian-Zu
2009-01-01
In two-dimensional noncommutative space for the case of both position-position and momentum-momentum noncommuting, the consistent deformed bosonic algebra at the non-perturbation level described by the deformed annihilation and creation operators is investigated. A general relation between noncommutative parameters is fixed from the consistency of the deformed Heisenberg-Weyl algebra with the deformed bosonic algebra. A Fock space is found, in which all calculations can be similarly developed as if in commutative space and all effects of spatial noncommutativity are simply represented by parameters.
Dynamically Consistent Nonlinear Evaluations with Their Generating Functions in Lp
Feng HU
2013-01-01
In this paper, we study dynamically consistent nonlinear evaluations in Lp (1 < p < 2). One of our aims is to obtain the following result: under a domination condition, an Ft-consistent evaluation is an ℰg-evaluation in Lp. Furthermore, without the assumption that the generating function g(t, ω, y, z) is continuous with respect to t, we provide some useful characterizations of an ℰg-evaluation by g and give some applications. These results include and extend some existing results.
Single-Field Inflation and the Local Ansatz: Distinguishability and Consistency
de Putter, Roland; Green, Daniel; Meyers, Joel
2016-01-01
The single-field consistency conditions and the local ansatz have played separate but important roles in characterizing the non-Gaussian signatures of single- and multifield inflation respectively. We explore the precise relationship between these two approaches and their predictions. We demonstrate that the predictions of the single-field consistency conditions can never be satisfied by a general local ansatz with deviations necessarily arising at order $(n_s-1)^2$. This implies that there is, in principle, a minimum difference between single- and (fully local) multifield inflation in observables sensitive to the squeezed limit such as scale-dependent halo bias. We also explore some potential observational implications of the consistency conditions and its relationship to the local ansatz. In particular, we propose a new scheme to test the consistency relations. In analogy with delensing of the cosmic microwave background, one can deproject the coupling of the long wavelength modes with the short wavelength ...
Internal Consistency and Power When Comparing Total Scores from Two Groups.
Barchard, Kimberly A; Brouwers, Vincent
2016-01-01
Researchers now know that when theoretical reliability increases, power can increase, decrease, or stay the same. However, no analytic research has examined the relationship of power to the most commonly used type of reliability, internal consistency, and the most commonly used measures of internal consistency, coefficient alpha and ICC(A,k). We examine the relationship between the power of independent samples t tests and internal consistency. We explicate the mathematical model upon which researchers usually calculate internal consistency, one in which total scores are calculated as the sum of observed scores on K measures. Using this model, we derive a new formula for effect size to show that power and internal consistency are influenced by many of the same parameters but not always in the same direction. Changing an experiment in one way (e.g., lengthening the measure) is likely to influence multiple parameters simultaneously; thus, there are no simple relationships between such changes and internal consistency or power. If researchers revise measures to increase internal consistency, this might not increase power. To increase power, researchers should increase sample size, select measures that assess areas where group differences are largest, and use more powerful statistical procedures (e.g., ANCOVA).
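The internal-consistency model described above (total scores as the sum of K observed item scores) leads to the standard coefficient alpha formula. The sketch below is a generic textbook computation, not code from the article; the toy data are our own.

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_persons, K_items) matrix of observed scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total scores
    return k / (k - 1.0) * (1.0 - item_vars / total_var)

# perfectly parallel items give alpha = 1
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))   # 1.0

# noisy items sharing a common trait: more noise lowers internal consistency
rng = np.random.default_rng(0)
trait = rng.normal(size=(500, 1))
low_noise = trait + 0.5 * rng.normal(size=(500, 4))
high_noise = trait + 1.5 * rng.normal(size=(500, 4))
print(cronbach_alpha(low_noise) > cronbach_alpha(high_noise))   # True
```

As the article argues, raising alpha (e.g., by lengthening the measure) changes several of these variance components at once, so it need not raise the power of a between-groups t test.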
Brief Report: Consistency of Search Engine Rankings for Autism Websites
Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.
2012-01-01
The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
Consistency Within Diversity: Guidelines for Programs to Honor Exemplary Teaching.
Svinicki, Marilla D.; Menges, Robert J.
1996-01-01
Good programs for recognizing exemplary college teaching are consistent with institutional mission and values, are grounded in research-based competencies and practices, recognize all significant facets of instruction, reward both collaborative and individual achievements, neither preclude nor replace the institutional reward system, call on those…
Weakly time consistent concave valuations and their dual representations
Roorda, Berend; Schumacher, Johannes M.
2016-01-01
We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal representations...
A Consistent Procedure for Pseudo-Component Delumping
Leibovici, Claude; Stenby, Erling Halfdan; Knudsen, Kim
1996-01-01
... Thereby infinite dilution K-values can be obtained exactly without any further computation. Based on these results a consistent procedure for the estimation of equilibrium constants in the more classical cases of finite dilution has been developed. It can be used when moderate binary interaction parameters...
Hippocampography Guides Consistent Mesial Resections in Neocortical Temporal Lobe Epilepsy
Marcus C. Ng
2016-01-01
Background. The optimal surgery in lesional neocortical temporal lobe epilepsy is unknown. Hippocampal electrocorticography maximizes seizure freedom by identifying normal-appearing epileptogenic tissue for resection and minimizes neuropsychological deficit by limiting resection to demonstrably epileptogenic tissue. We examined whether standardized hippocampal electrocorticography (hippocampography) guides resection for more consistent hippocampectomy than unguided resection in conventional electrocorticography focused on the lesion. Methods. We retrospectively reviewed charts involving any kind of electrocorticography (including hippocampography) as part of combined lesionectomy, anterolateral temporal lobectomy, and hippocampectomy over 8 years. Patients were divided into mesial (i.e., hippocampography) and lateral electrocorticography groups. Primary outcome was deviation from mean hippocampectomy length. Results. Of 26 patients, fourteen underwent hippocampography-guided mesial temporal resection. Hippocampography was associated with 2.6 times more consistent resection. The range of hippocampal resection was 0.7 cm in the mesial group and 1.8 cm in the lateral group (p=0.01). 86% of mesial group versus 42% of lateral group patients achieved seizure freedom (p=0.02). Conclusions. By rationally tailoring excision to demonstrably epileptogenic tissue, hippocampography significantly reduces resection variability for more consistent hippocampectomy than unguided resection in conventional electrocorticography. More consistent hippocampal resection may avoid overresection, which poses greater neuropsychological risk, and underresection, which jeopardizes postoperative seizure freedom.
Discrete anomalies in supergravity and consistency of string backgrounds
Minasian, Ruben; Sasmal, Soumya; Savelli, Raffaele
2017-02-01
We examine SL(2, ℤ) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important rôle in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.
Robust Visual Tracking Via Consistent Low-Rank Sparse Learning
Zhang, Tianzhu
2014-06-19
Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.
Checking Consistency of Pedigree Information is NP-complete
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
arose originally from the geneticists' need to filter their input data from erroneous information, and is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even in the presence of three alleles. Several other results...
An algebraic method for constructing stable and consistent autoregressive filters
Harlim, John, E-mail: jharlim@psu.edu [Department of Mathematics, the Pennsylvania State University, University Park, PA 16802 (United States); Department of Meteorology, the Pennsylvania State University, University Park, PA 16802 (United States); Hong, Hoon, E-mail: hong@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Robbins, Jacob L., E-mail: jlrobbi3@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States)
2015-02-15
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR-models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
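The stability condition invoked above is the classical one for AR models: every root of the characteristic polynomial must lie strictly inside the unit circle. As a minimal illustrative sketch of that standard criterion (not the authors' algebraic construction, which derives parameters from long-time statistics):

```python
import numpy as np

def ar_is_stable(phi):
    """Classical stability check for an AR(p) model
    x_t = phi[0]*x_{t-1} + ... + phi[p-1]*x_{t-p} + noise.

    Stable iff every root z of
        z^p - phi[0]*z^{p-1} - ... - phi[p-1] = 0
    lies strictly inside the unit circle.
    """
    # Characteristic polynomial coefficients, highest degree first.
    poly = np.concatenate(([1.0], -np.asarray(phi, dtype=float)))
    roots = np.roots(poly)
    return bool(np.all(np.abs(roots) < 1.0))

# AR(1) with coefficient 0.5 is stable; with 1.1 it is explosive.
print(ar_is_stable([0.5]))   # True
print(ar_is_stable([1.1]))   # False
```

The consistency constraints of the paper (matching Adams–Bashforth order-two conditions) are additional algebraic restrictions on `phi`; the sketch above checks stability only.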
Consistency of the Takens estimator for the correlation dimension
Borovkova, S; Burton, R; Dehling, H
1999-01-01
Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, thereby extending earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We
Body saccades of Drosophila consist of stereotyped banked turns
Muijres, F.T.; Elzinga, M.J.; Iwasaki, N.A.; Dickinson, M.H.
2015-01-01
The flight pattern of many fly species consists of straight flight segments interspersed with rapid turns called body saccades, a strategy that is thought to minimize motion blur. We analyzed the body saccades of fruit flies (Drosophila hydei), using high-speed 3D videography to track body and wing
Assessing atmospheric bias correction for dynamical consistency using potential vorticity
Rocheta, Eytan; Evans, Jason P.; Sharma, Ashish
2014-12-01
Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases as dynamical consistency between the ‘corrected’ fields is not maintained. Use of these bias corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency—the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques—an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exists between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications.
Discrete anomalies in supergravity and consistency of string backgrounds
Minasian, Ruben; Savelli, Raffaele
2016-01-01
We examine SL(2, Z) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important role in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.
New sequential quadratic programming algorithm with consistent subproblems
贺国平; 高自友; 赖炎连
1997-01-01
One of the most interesting topics related to sequential quadratic programming (SQP) algorithms is how to guarantee the consistency of all quadratic programming subproblems. In this decade, much work has been done on changing the form of the constraints to obtain consistency of the subproblems. The method proposed by De O. Pantoja J F A and coworkers solves the consistency problem of the SQP method and is, to the authors' knowledge, the best. However, the scale and complexity of the subproblems in De O. Pantoja's work increase greatly, since all equality constraints have to be changed into absolute form. A new sequential quadratic programming type algorithm is presented by means of a special ε-active set scheme and a special penalty function. The subproblems of the new algorithm are all consistent, and the form of the constraints of the subproblems is as simple as that of general SQP type algorithms. It can be proved that the new method retains global convergence and local superlinear convergence.
Personalities in great tits, Parus major : stability and consistency
Carere, C; Drent, Piet J.; Privitera, Lucia; Koolhaas, Jaap M.; Groothuis, TGG
2005-01-01
We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations
Self-Consistence of Semi-Classical Gravity
Suen, W M
1992-01-01
Simon argued that the semi-classical theory of gravity, unless some of its solutions are excluded, is unacceptable for reasons of both self-consistency and experiment, and that it has to be replaced by a constrained semi-classical theory. We examine whether the evidence is conclusive.
SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.
MORSE, STANLEY J.; GERGEN, KENNETH J.
To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) self-consistency scale and half the Coopersmith Self-Esteem Inventory, were administered to 78 undergraduate men who had answered an ad for work…
Plant functional traits have globally consistent effects on competition
Kunstler, Georges; Falster, Daniel; Coomes, David A.; Poorter, Lourens
2016-01-01
Phenotypic traits and their associated trade-offs have been shown to have globally consistent effects on individual plant physiological functions, but how these effects scale up to influence competition, a key driver of community assembly in terrestrial vegetation, has remained unclear. Here we
Fully self-consistent GW calculations for molecules
Rostgaard, Carsten; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer
2010-01-01
We calculate single-particle excitation energies for a series of 34 molecules using fully self-consistent GW, one-shot G0W0, Hartree-Fock (HF), and hybrid density-functional theory (DFT). All calculations are performed within the projector-augmented wave method using a basis set of Wannier...
On ZRP wind input term consistency in Hasselmann equation
Zakharov, Vladimir; Pushkarev, Andrei
2016-01-01
The new ZRP wind input source term (Zakharov et al. 2012) is checked for its consistency via numerical simulation of Hasselmann equation. The results are compared to field experimental data, collected at different sites around the world, and theoretical predictions of self-similarity analysis. Good agreement is obtained for limited fetch and time domain statements
Consistency in behavior of the CEO regarding corporate social responsibility
Elving, W.J.L.; Kartal, D.
2012-01-01
Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was
An Intuitionistic Epistemic Logic for Sequential Consistency on Shared Memory
Hirai, Yoichi
In the celebrated Gödel Prize winning papers, Herlihy, Shavit, Saks and Zaharoglou gave topological characterization of waitfree computation. In this paper, we characterize waitfree communication logically. First, we give an intuitionistic epistemic logic k∨ for asynchronous communication. The semantics for the logic k∨ is an abstraction of Herlihy and Shavit's topological model. In the same way Kripke model for intuitionistic logic informally describes an agent increasing its knowledge over time, the semantics of k∨ describes multiple agents passing proofs around and developing their knowledge together. On top of the logic k∨, we give an axiom type that characterizes sequential consistency on shared memory. The advantage of intuitionistic logic over classical logic then becomes apparent as the axioms for sequential consistency are meaningless for classical logic because they are classical tautologies. The axioms are similar to the axiom type for prelinearity (ϕ ⊃ ψ) ∨ (ψ ⊃ ϕ). This similarity reflects the analogy between sequential consistency for shared memory scheduling and linearity for Kripke frames: both require total order on schedules or models. Finally, under sequential consistency, we give soundness and completeness between a set of logical formulas called waitfree assertions and a set of models called waitfree schedule models.
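The remark that the sequential-consistency axioms are "meaningless for classical logic because they are classical tautologies" can be verified mechanically for the prelinearity schema (ϕ ⊃ ψ) ∨ (ψ ⊃ ϕ): under classical two-valued semantics it holds in every valuation. A small illustrative truth-table check (classical semantics only, not the paper's intuitionistic Kripke-style semantics):

```python
from itertools import product

def implies(a, b):
    # Classical material implication: a -> b.
    return (not a) or b

def prelinearity(phi, psi):
    # (phi -> psi) v (psi -> phi)
    return implies(phi, psi) or implies(psi, phi)

# Exhaustively check all classical valuations: the schema is always
# true, so classically it carries no information. Intuitionistically
# it fails on non-linear Kripke frames, which is what the paper exploits.
tautology = all(prelinearity(p, q) for p, q in product([False, True], repeat=2))
print(tautology)  # True
```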
Noncommuting Electric Fields and Algebraic Consistency in Noncommutative Gauge theories
Banerjee, R
2003-01-01
We show that noncommuting electric fields occur naturally in noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a hamiltonian generalisation of the Seiberg-Witten Map, the algebraic consistency in the lagrangian and hamiltonian formulations of these theories, is established. The stability of the Poisson algebra, under this generalised map, is studied.
Efficient self-consistent quantum transport simulator for quantum devices
Gao, X., E-mail: xngao@sandia.gov; Mamaluy, D.; Nielsen, E.; Young, R. W.; Lilly, M. P.; Bishop, N. C.; Carroll, M. S.; Muller, R. P. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); Shirkhorshidian, A. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); University of New Mexico, Albuquerque, New Mexico 87131 (United States)
2014-04-07
We present a self-consistent one-dimensional (1D) quantum transport simulator based on the Contact Block Reduction (CBR) method, aiming for very fast and robust transport simulation of 1D quantum devices. Applying the general CBR approach to 1D open systems results in a set of very simple equations that are derived and given in detail for the first time. The charge self-consistency of the coupled CBR-Poisson equations is achieved by using the predictor-corrector iteration scheme with the optional Anderson acceleration. In addition, we introduce a new way to convert an equilibrium electrostatic barrier potential calculated from an external simulator to an effective doping profile, which is then used by the CBR-Poisson code for transport simulation of the barrier under non-zero biases. The code has been applied to simulate the quantum transport in a double barrier structure and across a tunnel barrier in a silicon double quantum dot. Extremely fast self-consistent 1D simulations of the differential conductance across a tunnel barrier in the quantum dot show better qualitative agreement with experiment than non-self-consistent simulations.
Context-dependent individual behavioral consistency in Daphnia
Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe
2017-01-01
, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...
FINITE DEFORMATION ELASTO-PLASTIC THEORY AND CONSISTENT ALGORITHM
Liu Xuejun; Li Mingrui; Huang Wenbin
2001-01-01
By using the logarithmic strain, a finite deformation plastic theory corresponding to the infinitesimal plastic theory is established. A plastic consistent algorithm with first order accuracy for the finite element method (FEM) is developed. Numerical examples are presented to illustrate the validity of the theory and the effectiveness of the algorithm.
Consistent measurements comparing the drift features of noble gas mixtures
Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y
1999-01-01
We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.
Evaluating Reflective Writing for Appropriateness, Fairness, and Consistency.
Kennison, Monica Metrick; Misselwitz, Shirley
2002-01-01
Samples from 17 reflective journals of nursing students were evaluated by 6 faculty. Results indicate a lack of consistency in grading reflective writing, lack of consensus regarding evaluation, and differences among faculty regarding their view of such exercises. (Contains 26 references.) (JOW)
Improving consistency in student evaluation at affiliated family practice centers.
Rabinowitz, H K
1986-01-01
The Department of Family Medicine at Jefferson Medical College has since 1974 been successful in administering a required third-year family medicine clerkship, providing students with a structured, didactic, and experiential curriculum in six affiliated family practice centers. Prior analysis (1976-1981) had indicated, however, that variation existed in evaluating similar students, depending on the clerkship training site, i.e., three sites graded students in a significantly different fashion than the three other sites. Utilizing these data to focus on the evaluation process, a comprehensive and specific six-point plan was developed to improve consistency in evaluations at the different training sites. This plan consisted of a yearly meeting of affiliate faculty, assigning predoctoral training administrative responsibility to one faculty member at each training site, increased telephone communication, affiliate-faculty attendance at the university site evaluation session, faculty rotation to spend time at other training sites, and financial reimbursement to the affiliate training sites. After intervention, analysis (1981-1983) indicated that five of the six clerkship sites now grade students in a consistent fashion, with only one affiliate using different grading standards. The intervention was therefore judged to be successful for five of the six training sites, allowing for better communication and more critical and consistent evaluation of medical students.
[Consistent presentation of medical images based on CPI integration profile].
Jiang, Tao; An, Ji-ye; Chen, Zhong-yong; Lu, Xu-dong; Duan, Hui-long
2007-11-01
Because of different display parameters and other factors, digital medical images are displayed differently in different departments of a hospital. Based on the CPI integration profile of IHE, this paper implements the consistent presentation of medical images, which helps doctors carry out medical treatment as a team.
Measures of Consistency for Holland-Type Codes.
Strahan, Robert F.
1987-01-01
Describes two new measures of consistency which refer to the extent to which more closely related scale types are found together in Holland's Self-Directed Search sort. One measure is based on the hexagonal model for use with three-point codes. The other is based on conditional probabilities for use with two-point codes. (Author/ABL)
Consistent Data Assimilation of Isotopes: 242Pu and 105Pd
G. Palmiotti; H. Hiruta; M. Salvatores
2012-09-01
In this annual report we illustrate the methodology of consistent data assimilation, which makes it possible to use information coming from integral experiments to improve the basic nuclear parameters used in cross section evaluation. A series of integral experiments are analyzed using the EMPIRE evaluated files for 242Pu and 105Pd. In particular, irradiation experiments (PROFIL-1 and -2, TRAPU-1, -2 and -3) provide information about capture cross sections, and a critical configuration, COSMO, where fission spectral indexes were measured, provides information about the fission cross section. The observed discrepancies between calculated and experimental results are used in conjunction with the computed sensitivity coefficients and covariance matrix for nuclear parameters in a consistent data assimilation. The results obtained indicate that modest modifications of some key identified nuclear parameters suffice to obtain reasonable C/E. However, for some parameters such variations are outside the range of 1 σ of their initial standard deviation. This can indicate a possible conflict between differential measurements (used to calculate the initial standard deviations) and the integral measurements used in the statistical data adjustment. Moreover, an inconsistency between the C/E of two sets of irradiation experiments (PROFIL and TRAPU) is observed for 242Pu. This is the end of this project, funded by the Nuclear Physics Program of the DOE Office of Science. A proof of principle has been demonstrated for a few isotopes of this innovative methodology. However, we are still far from having explored all the possibilities and established the methodology as proven and robust. In particular, many issues are worth further investigation: • Non-linear effects • Flexibility of nuclear parameters in describing cross sections • Multi-isotope consistent assimilation • Consistency between differential and integral
The internal consistency of the North Sea carbonate system
Salt, Lesley A.; Thomas, Helmuth; Bozec, Yann; Borges, Alberto V.; de Baar, Hein J. W.
2016-05-01
In 2002 (February) and 2005 (August), the full suite of carbonate system parameters (total alkalinity (AT), dissolved inorganic carbon (DIC), pH, and partial pressure of CO2 (pCO2)) were measured on two re-occupations of the entire North Sea basin, with three parameters (AT, DIC, pCO2) measured on four additional re-occupations, covering all four seasons, allowing an assessment of the internal consistency of the carbonate system. For most of the year, there is a similar level of internal consistency, with AT being calculated to within ± 6 μmol kg−1 using DIC and pH, DIC to ± 6 μmol kg−1 using AT and pH, pH to ± 0.008 using AT and pCO2, and pCO2 to ± 8 μatm using DIC and pH, with the dissociation constants of Millero et al. (2006). In spring, however, we observe a significant decline in the ability to accurately calculate the carbonate system. Lower consistency is observed with an increasing fraction of Baltic Sea water, caused by the high contribution of organic alkalinity in this water mass, which is not accounted for in the carbonate system calculations. Attempts to improve the internal consistency by accounting for the unconventional salinity-borate relationships in freshwater and the Baltic Sea, and through application of the new North Atlantic salinity-boron relationship (Lee et al., 2010), resulted in no significant difference in the internal consistency.
Consistency of accuracy assessment indices for soft classification: Simulation analysis
Chen, Jin; Zhu, Xiaolin; Imura, Hidefumi; Chen, Xuehong
Accuracy assessment plays a crucial role in the implementation of soft classification. Even though many indices of accuracy assessment for soft classification have been proposed, the consistencies among these indices are not clear, and the impact of sample size on these consistencies has not been investigated. This paper examines two kinds of indices: map-level indices, including root mean square error (rmse), kappa, and overall accuracy (oa) from the sub-pixel confusion matrix (SCM); and category-level indices, including crmse, user accuracy (ua) and producer accuracy (pa). A careful simulation was conducted to investigate the consistency of these indices and the effect of sample size. The major findings were as follows: (1) The map-level indices are highly consistent with each other, whereas the category-level indices are not. (2) The consistency among map-level and category-level indices becomes weaker when the sample size decreases. (3) The rmse is more affected by error distribution among classes than are kappa and oa. Based on these results, we recommend that rmse can be used for map-level accuracy due to its simplicity, although kappa and oa may be better alternatives when the sample size is limited because the two indices are affected less by the error distribution among classes. We also suggest that crmse should be provided when map users are not concerned about the error source, whereas ua and pa are more useful when the complete information about different errors is required. The results of this study will be of benefit to the development and application of soft classifiers.
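As a rough illustration of the map-level indices discussed in the abstract, the sketch below computes a per-pixel soft-classification rmse and an overall accuracy after hardening each pixel to its dominant class. These are simplified stand-ins, not the paper's sub-pixel confusion matrix (SCM) formulation:

```python
import numpy as np

def soft_rmse(reference, predicted):
    """Root mean square error between soft class-fraction maps.
    reference, predicted: arrays of shape (n_pixels, n_classes),
    each row summing to 1."""
    return float(np.sqrt(np.mean((reference - predicted) ** 2)))

def hardened_oa(reference, predicted):
    """Overall accuracy after hardening each pixel to its dominant
    class (a simplification of the SCM-based oa)."""
    return float(np.mean(reference.argmax(axis=1) == predicted.argmax(axis=1)))

# Tiny hypothetical 3-pixel, 2-class fraction maps.
ref  = np.array([[0.8, 0.2], [0.3, 0.7], [0.5, 0.5]])
pred = np.array([[0.7, 0.3], [0.4, 0.6], [0.6, 0.4]])
print(round(soft_rmse(ref, pred), 3))  # 0.1
print(hardened_oa(ref, pred))          # 1.0
```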
Stable functional networks exhibit consistent timing in the human brain.
Chapeton, Julio I; Inati, Sara K; Zaghloul, Kareem A
2017-03-01
Despite many advances in the study of large-scale human functional networks, the question of timing, stability, and direction of communication between cortical regions has not been fully addressed. At the cellular level, neuronal communication occurs through axons and dendrites, and the time required for such communication is well defined and preserved. At larger spatial scales, however, the relationship between timing, direction, and communication between brain regions is less clear. Here, we use a measure of effective connectivity to identify connections between brain regions that exhibit communication with consistent timing. We hypothesized that if two brain regions are communicating, then knowledge of the activity in one region should allow an external observer to better predict activity in the other region, and that such communication involves a consistent time delay. We examine this question using intracranial electroencephalography captured from nine human participants with medically refractory epilepsy. We use a coupling measure based on time-lagged mutual information to identify effective connections between brain regions that exhibit a statistically significant increase in average mutual information at a consistent time delay. These identified connections result in sparse, directed functional networks that are stable over minutes, hours, and days. Notably, the time delays associated with these connections are also highly preserved over multiple time scales. We characterize the anatomic locations of these connections, and find that the propagation of activity exhibits a preferred posterior to anterior temporal lobe direction, consistent across participants. Moreover, networks constructed from connections that reliably exhibit consistent timing between anatomic regions demonstrate features of a small-world architecture, with many reliable connections between anatomically neighbouring regions and few long range connections. Together, our results demonstrate
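The coupling measure described above, time-lagged mutual information with a consistent delay, can be sketched with a simple histogram estimator: compute MI between one signal and a delayed copy of the other at each candidate lag, and look for a peak at a stable delay. This is an illustrative reconstruction under assumed details (histogram binning, lag range), not the authors' exact estimator or significance test:

```python
import numpy as np

def lagged_mutual_information(x, y, lag, bins=8):
    """Histogram estimate of I(x_{t-lag}; y_t) in nats."""
    xs, ys = x[:len(x) - lag], y[lag:]
    joint, _, _ = np.histogram2d(xs, ys, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 3)  # y_t = x_{t-3}: a fixed, "anatomically preserved" 3-step delay
mi = [lagged_mutual_information(x, y, lag) for lag in range(1, 7)]
print(int(np.argmax(mi)) + 1)  # 3 -- MI peaks at the true delay
```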
Alberto Cargnelutti Filho
2011-09-01
Full Text Available The objective of this work was to evaluate the consistency of the clustering pattern obtained from the combination of two dissimilarity measures and four clustering methods, in scenarios formed by combinations of number of cultivars and number of variables, with real data from corn cultivars (Zea mays L.) and with simulated data. Real data were used from five variables measured in 69 corn cultivar competition trials, in which the number of cultivars evaluated ranged between 9 and 40. In order to investigate the results with larger numbers of cultivars and variables, 1,000 experiments were simulated, under the standard normal distribution, for each of the 54 scenarios formed by combining the number of cultivars (20, 30, 40, 50, 60, 70, 80, 90 and 100) and the number of variables (5, 6, 7, 8, 9 and 10). Correlation, multicollinearity diagnostic, and clustering analyses were performed. The consistency of the clustering pattern was evaluated by means of the cophenetic correlation coefficient. The consistency of the clustering pattern decreases as the number of cultivars and variables increases. The Euclidean distance provides greater consistency in the clustering pattern than the Manhattan distance. The consistency of the clustering pattern among the methods increases in the following order: Ward, complete linkage, single linkage, and average linkage between groups.
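The cophenetic correlation coefficient used as the consistency measure above compares the dendrogram's implied distances with the original dissimilarities. A minimal sketch with SciPy, using synthetic data of the "cultivars × variables" shape studied in the abstract (the data and sizes here are illustrative, not the paper's):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
# Hypothetical 30 cultivars x 5 variables, standard normal as in the simulation.
data = rng.standard_normal((30, 5))
dist = pdist(data, metric='euclidean')

# Cophenetic correlation for each clustering method: higher means the
# dendrogram preserves the original dissimilarity structure better.
for method in ('ward', 'complete', 'single', 'average'):
    c, _ = cophenet(linkage(dist, method=method), dist)
    print(method, round(float(c), 3))
```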
Pasi Nieminen
2012-05-01
Full Text Available Previous physics education research has raised the question of “hidden variables” behind students’ success in learning certain concepts. In the context of the force concept, it has been suggested that students’ reasoning ability is one such variable. Strong positive correlations between students’ preinstruction scores for reasoning ability (measured by Lawson’s Classroom Test of Scientific Reasoning) and their learning of forces [measured by the Force Concept Inventory (FCI)] have been reported in high school and university introductory courses. However, there is no published research concerning the relation between students’ ability to interpret multiple representations consistently (i.e., representational consistency) and their learning of forces. To investigate this, we collected 131 high school students’ pre- and post-test data of the Representational Variant of the Force Concept Inventory (for representational consistency) and the FCI. The students’ Lawson pretest data were also collected. We found that the preinstruction level of students’ representational consistency correlated strongly with student learning gain of forces. The correlation (0.51) was almost equal to the correlation between Lawson prescore and learning gain of forces (0.52). Our results support earlier findings which suggest that scientific reasoning ability is a hidden variable behind the learning of forces. In addition, we suggest that students’ representational consistency may also be such a factor, and that this should be recognized in physics teaching.
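Correlations between pretest scores and "learning gain" of the kind reported above are commonly computed as Pearson's r between the pretest measure and the Hake-style normalized gain g = (post − pre)/(max − pre). The sketch below assumes that gain measure and uses invented scores; it only illustrates the computation, not the study's data:

```python
import numpy as np

def normalized_gain(pre, post, max_score=30.0):
    """Hake-style normalized gain per student: (post - pre) / (max - pre).
    max_score=30 assumes the 30-item FCI."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    return (post - pre) / (max_score - pre)

# Hypothetical pre-instruction consistency scores and FCI pre/post scores.
consistency = np.array([10, 14, 18, 22, 26, 28])
fci_pre     = np.array([ 8, 10, 12, 15, 18, 20])
fci_post    = np.array([12, 16, 20, 24, 27, 29])

gain = normalized_gain(fci_pre, fci_post)
r = np.corrcoef(consistency, gain)[0, 1]
print(round(float(r), 2))
```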
Violation of consistency relations and the protoinflationary transition
Giovannini, Massimo
2014-01-01
If we posit the validity of the consistency relations, the tensor spectral index and the relative amplitude of the scalar and tensor power spectra are both fixed by a single slow roll parameter. The physics of the protoinflationary transition can break explicitly the consistency relations causing a reduction of the inflationary curvature scale in comparison with the conventional lore. After a critical scrutiny, we argue that the inflationary curvature scale, the total number of inflationary efolds and, ultimately, the excursion of the inflaton across its Planckian boundary are all characterized by a computable theoretical error. While these considerations ease some of the tensions between the Bicep2 data and the other satellite observations, they also demand an improved understanding of the protoinflationary transition whose physical features may be assessed, in the future, through a complete analysis of the spectral properties of the B mode autocorrelations.
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Hong Li
2012-01-01
Full Text Available The turbulence generator is a key component of medium consistency (MC) centrifugal pulp pumps, whose functions are to fluidize the medium consistency pulp and to separate gas from the liquid. The structural dimensions of the generator affect hydraulic performance; the radius and the blade laying angle are two important ones. Starting from research on the internal flow and shearing characteristics of MC pulp, a simple mathematical model of the flow section of the shearing chamber is built, and a formula and procedure for calculating the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a shape similar to the turbulence generator, and CFD simulation is applied to study the flow fields with different blade laying angles. The recommended blade laying angle of the turbulence generator is found to be between 60° and 75°.
Non-trivial checks of novel consistency relations
Berezhiani, Lasha; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Wang, Junpu, E-mail: lashaber@gmail.com, E-mail: jkhoury@sas.upenn.edu, E-mail: jwang217@jhu.edu [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States)
2014-06-01
Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.
Non-Trivial Checks of Novel Consistency Relations
Berezhiani, Lasha; Wang, Junpu
2014-01-01
Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of slow-roll single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.
Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation
Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans
1995-01-01
We present multiconfigurational self-consistent reaction field theory and implementation for solvent effects on a solute molecular system that is not in equilibrium with the outer solvent. The approach incorporates two different polarization vectors for studying the influence of the solvent...... states influenced by the two types of polarization vectors. The general treatment of the correlation problem through the use of complete and restricted active space methodologies makes the present multiconfigurational self-consistent reaction field approach general in that it can handle any type of state......, open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity...
Branch dependence in the "consistent histories" approach to quantum mechanics
Müller, T
2005-01-01
In the consistent histories formalism one specifies a family of histories as an exhaustive set of pairwise exclusive descriptions of the dynamics of a quantum system. We define branching families of histories, which strike a middle ground between the two available mathematically precise definitions of families of histories, viz., product families and Isham's history projector operator formalism. The former are too narrow for applications, and the latter's generality comes at a certain cost, barring an intuitive reading of the ``histories''. Branching families retain the intuitiveness of product families, they allow for the interpretation of a history's weight as a probability, and they allow one to distinguish two kinds of coarse-graining. It is shown that for branching families, the ``consistency condition'' is not a precondition for assigning probabilities, but for a specific kind of coarse-graining.
Structures, profile consistency, and transport scaling in electrostatic convection
Bian, N.H.; Garcia, O.E.
2005-01-01
that for interchange modes, profile consistency is in fact due to mixing by persistent large-scale convective cells. This mechanism is not a turbulent diffusion, cannot occur in collisionless systems, and is the analog of the well-known laminar "magnetic flux expulsion" in magnetohydrodynamics. This expulsion process...... involves a "pinch" across closed streamlines and further results in the formation of pressure fingers along the separatrix of the convective cells. By nature, these coherent structures are dissipative, because the mixing process that leads to their formation relies on a finite amount of collisional...... diffusion. Numerical simulations of two-dimensional interchange modes confirm the role of laminar expulsion by convective cells for profile consistency and structure formation. They also show that the fingerlike pressure structures ultimately control the rate of heat transport across the plasma layer...
Turbulent MHD transport coefficients - An attempt at self-consistency
Chen, H.; Montgomery, D.
1987-01-01
In this paper, some multiple scale perturbation calculations of turbulent MHD transport coefficients begun in earlier papers are first completed. These generalize 'alpha effect' calculations by treating the velocity field and magnetic field on the same footing. Then the problem of rendering such calculations self-consistent is addressed, generalizing an eddy-viscosity hypothesis similar to that of Heisenberg for the Navier-Stokes case. The method also borrows from Kraichnan's direct interaction approximation. The output is a set of integral equations relating the spectra and the turbulent transport coefficients. Previous 'alpha effect' and 'beta effect' coefficients emerge as limiting cases. A treatment of the inertial range can also be given, consistent with a -5/3 energy spectrum power law. In the Navier-Stokes limit, a value of 1.72 is extracted for the Kolmogorov constant. Further applications to MHD are possible.
Consistent return mapping algorithm for plane stress elastoplasticity
Simo, J.C.; Taylor, R.L.
1985-05-01
An unconditionally stable algorithm for plane stress elastoplasticity is developed, based upon the notion of an elastic predictor/return mapping (plastic corrector). Enforcement of the consistency condition is shown to reduce to the solution of a simple nonlinear equation. Consistent elastoplastic tangent moduli are obtained by exact linearization of the algorithm. Use of these moduli is essential in order to preserve the asymptotic rate of quadratic convergence of Newton methods. An exact solution for constant strain rate over the typical time step is derived. On the basis of this solution the accuracy of the algorithm is assessed by means of iso-error maps. The excellent performance of the algorithm for large time steps is illustrated in numerical experiments.
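The elastic predictor / plastic corrector structure described in the abstract above can be sketched in one dimension, where the consistency condition admits a closed-form solution (in the paper's plane stress setting it becomes the scalar nonlinear equation solved iteratively). A minimal illustrative sketch assuming linear isotropic hardening; the modulus names `E`, `H`, `sigma_y` and their values are illustrative, not taken from the paper:

```python
def return_map_1d(eps, eps_p, alpha, E=200e3, H=10e3, sigma_y=250.0):
    """One step of 1D return mapping with linear isotropic hardening.
    Returns (stress, plastic strain, hardening variable, algorithmic tangent)."""
    sig_trial = E * (eps - eps_p)                 # elastic predictor
    f_trial = abs(sig_trial) - (sigma_y + H * alpha)
    if f_trial <= 0.0:                            # elastic step: trial state admissible
        return sig_trial, eps_p, alpha, E
    dgamma = f_trial / (E + H)                    # consistency condition, closed form in 1D
    sign = 1.0 if sig_trial > 0 else -1.0
    sig = sig_trial - E * dgamma * sign           # plastic corrector: return to yield surface
    C_ep = E * H / (E + H)                        # consistent (algorithmic) tangent modulus
    return sig, eps_p + dgamma * sign, alpha + dgamma, C_ep
```

The consistent tangent `C_ep` is what preserves quadratic Newton convergence when the routine is embedded in a global equilibrium iteration.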
Strong Consistency of the Empirical Martingale Simulation Option Price Estimator
Zhu-shun Yuan; Ge-mai Chen
2009-01-01
A simulation technique known as empirical martingale simulation (EMS) was proposed to improve simulation accuracy. By an adjustment to the standard Monte Carlo simulation, EMS ensures that the simulated price satisfies the rational option pricing bounds and that the estimated derivative contract price is strongly consistent for payoffs that satisfy the Lipschitz condition. However, for some currently used contracts, such as self-quanto options and asymmetric or symmetric power options, it has been open whether the above asymptotic result holds. In this paper, we prove that the strong consistency of the EMS option price estimator holds for a wider class of univariate payoffs than those restricted by the Lipschitz condition. Numerical experiments demonstrate that EMS can also substantially increase simulation accuracy in the extended setting.
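The EMS adjustment itself is simple to sketch: rescale the simulated terminal prices so that the discounted empirical mean equals the spot price exactly, then average the discounted payoffs. A minimal sketch for a European call under geometric Brownian motion (the model and parameter names are illustrative; the paper's setting is more general):

```python
import math
import random

def ems_call_price(S0, K, r, sigma, T, n=20000, seed=0):
    """European call price by Monte Carlo with the empirical martingale adjustment."""
    rng = random.Random(seed)
    # simulate terminal prices under risk-neutral GBM
    ST = [S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * rng.gauss(0, 1))
          for _ in range(n)]
    # EMS: rescale so the discounted empirical mean of S_T equals S0 exactly
    disc_mean = math.exp(-r * T) * sum(ST) / n
    ST_adj = [s * S0 / disc_mean for s in ST]
    # discounted average payoff on the adjusted sample
    return math.exp(-r * T) * sum(max(s - K, 0.0) for s in ST_adj) / n
```

The rescaling enforces the martingale property in-sample, which both removes discretization bias in the mean and typically reduces variance relative to plain Monte Carlo.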
A correlation consistency based multivariate alarm thresholds optimization approach.
Gao, Huihui; Liu, Feifei; Zhu, Qunxiong
2016-11-01
Different alarm thresholds generate different alarm data, resulting in different correlations. A new multivariate alarm thresholds optimization methodology based on the correlation consistency between process data and alarm data is proposed in this paper. Interpretative structural modeling is adopted to select the key variables. For the key variables, the correlation coefficients of the process data are calculated by Pearson correlation analysis, while the correlation coefficients of the alarm data are calculated by kernel density estimation. To ensure correlation consistency, the objective function is established as the sum of the absolute differences between these two types of correlations. The optimal thresholds are obtained using the particle swarm optimization algorithm. A case study of the Tennessee Eastman process is given to demonstrate the effectiveness of the proposed method.
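The correlation-consistency objective can be sketched directly: binarize each process variable at its candidate threshold and compare pairwise correlations of the process data with those of the resulting alarm data. Note the paper estimates the alarm correlations via kernel density estimation; this sketch substitutes plain Pearson correlation on the binary alarm series, and all names are illustrative:

```python
def pearson(a, b):
    """Pearson correlation; returns 0.0 for a degenerate (constant) series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def consistency_cost(process, thresholds):
    """Sum over variable pairs of |corr(process_i, process_j) - corr(alarm_i, alarm_j)|,
    where alarm_i is the binary series 1{x > threshold_i}."""
    alarms = [[1.0 if x > t else 0.0 for x in series]
              for series, t in zip(process, thresholds)]
    cost, p = 0.0, len(process)
    for i in range(p):
        for j in range(i + 1, p):
            cost += abs(pearson(process[i], process[j]) - pearson(alarms[i], alarms[j]))
    return cost
```

A swarm (or any derivative-free optimizer) would then search the threshold vector minimizing this cost.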
Stochastic multi-configurational self-consistent field theory
Thomas, Robert E; Alavi, Ali; Booth, George H
2015-01-01
The multi-configurational self-consistent field theory is considered the standard starting point for almost all multireference approaches required for strongly-correlated molecular problems. The limitation of the approach is generally set by the number of strongly-correlated orbitals in the molecule, as its cost grows exponentially with this number. We present a new multi-configurational self-consistent field approach, wherein the linear determinant coefficients of a multi-configurational wavefunction are optimized via the stochastic full configuration interaction quantum Monte Carlo technique at greatly reduced computational cost, with the non-linear orbital rotation parameters updated variationally based on this sampled wavefunction. This extends the approach to strongly-correlated systems with far larger active spaces than can be treated by conventional means. By comparison with the traditional approach, we demonstrate that the introduction of stochastic noise in both the determinant amplitudes an...
Consistency Across Standards or Standards in a New Business Model
Russo, Dane M.
2010-01-01
Presentation topics include: standards in a changing business model, the new National Space Policy as a driver of change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the balance needed between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides cover NASA Procedural Requirements 8705.2B, which identifies human rating standards and requirements; draft health and medical standards for human rating; what has been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of governmental and non-governmental human factors standards.
Exceptional generalised geometry for massive IIA and consistent reductions
Cassani, Davide; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel
2016-01-01
We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p,7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d=4,3,2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.
Exceptional generalised geometry for massive IIA and consistent reductions
Cassani, Davide; de Felice, Oscar; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel
2016-08-01
We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p, 7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d = 4, 3, 2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.
Bolasso: model consistent Lasso estimation through the bootstrap
Bach, Francis
2008-01-01
We consider the least-squares linear regression problem with regularization by the l1-norm, a problem usually referred to as the Lasso. In this paper, we present a detailed asymptotic analysis of model consistency of the Lasso. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection). For a specific rate of decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection algorithm, referred to as the Bolasso, compares favorably to other linear regression methods on synthetic data and datasets from the UCI machine learning rep...
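The Bolasso idea sketches naturally: fit the Lasso on bootstrap resamples of the data and intersect the selected supports. A self-contained sketch with a tiny coordinate-descent Lasso; the data generation, regularization value, and iteration counts below are illustrative choices, not from the paper:

```python
import random

def lasso_cd(X, y, lam, iters=30):
    """Tiny coordinate-descent Lasso for 0.5*||y - Xw||^2 + lam*||w||_1."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # correlation of feature j with the partial residual (w_j held out)
            rho = sum(X[i][j] * (y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j))
                      for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-thresholding update
            w[j] = (rho + lam) / z if rho < -lam else (rho - lam) / z if rho > lam else 0.0
    return w

def bolasso_support(X, y, lam, n_boot=16, seed=0):
    """Bolasso: intersect Lasso supports across bootstrap replications."""
    rng = random.Random(seed)
    n, support = len(X), None
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        w = lasso_cd([X[i] for i in idx], [y[i] for i in idx], lam)
        s = {j for j, wj in enumerate(w) if abs(wj) > 1e-8}
        support = s if support is None else support & s
    return sorted(support)
```

Intersection works because, per the abstract, relevant variables are selected with probability tending to one while each irrelevant variable survives any single replication only with probability bounded away from one.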
Consistency Relations for an Implicit n-dimensional Regularization Scheme
Scarpelli, A P B; Nemes, M C
2001-01-01
We extend an implicit regularization scheme to be applicable in $n$-dimensional space-time. Within this scheme, divergences involving parity-violating objects can be consistently treated without recourse to dimensional continuation. Special attention is paid to differences between integrals of the same degree of divergence, typical of one-loop calculations, which are in principle undetermined. We show how to use symmetries in order to fix these quantities consistently. We illustrate with examples in which regularization plays a delicate role, in order to both corroborate and elucidate results in the literature for the cases of CPT violation in extended $QED_4$, topological mass generation in 3-dimensional gauge theories, and the Schwinger model and its chiral version.
Collisional decoherence of tunneling molecules: a consistent histories treatment
Coles, Patrick J; Griffiths, Robert B
2012-01-01
The decoherence of a two-state tunneling molecule, such as a chiral molecule or ammonia, due to collisions with a buffer gas is analyzed in terms of a succession of quantum states of the molecule satisfying the conditions for a consistent family of histories. With $\\hbar \\omega$ the separation in energy of the levels in the isolated molecule and $\\gamma$ a decoherence rate proportional to the rate of collisions, we find for $\\gamma \\gg \\omega$ (strong decoherence) a consistent family in which the molecule flips randomly back and forth between the left- and right-handed chiral states in a stationary Markov process. Analogous families are found for $\\gamma \\sim \\omega$ and for $\\gamma < \\omega$. In addition we relate the speed with which chiral information is transferred to the environment to the rate of decrease of complementary types of information (e.g., parity information) remaining in the molecule itself.
A New Heuristic for Feature Selection by Consistent Biclustering
Mucherino, Antonio
2010-01-01
Given a set of data, biclustering aims at finding simultaneous partitions into biclusters of its samples and of the features used for representing the samples. Consistent biclusterings make it possible to obtain correct classifications of the samples from the known classification of the features, and vice versa, and they are very useful for performing supervised classification. The problem of finding consistent biclusterings can be seen as a feature selection problem, in which the features that are not relevant for classification purposes are removed from the data set, while the total number of retained features is maximized in order to preserve information. This feature selection problem can be formulated as a linear fractional 0-1 optimization problem. We propose a reformulation of this problem as a bilevel optimization problem, and we present a heuristic algorithm for an efficient solution of the reformulated problem. Computational experiments show that the presented algorithm is able to find better solutions with re...
Viscoelastic models with consistent hypoelasticity for fluids undergoing finite deformations
Altmeyer, Guillaume; Rouhaud, Emmanuelle; Panicaud, Benoit; Roos, Arjen; Kerner, Richard; Wang, Mingchuan
2015-08-01
Constitutive models of viscoelastic fluids are written with rate-form equations when considering finite deformations. Trying to extend the approach used to model these effects from an infinitesimal deformation to a finite transformation framework, one has to ensure that the tensors and their rates are indifferent with respect to the change of observer and to the superposition with rigid body motions. Frame-indifference problems can be solved with the use of an objective stress transport, but the choice of such an operator is not obvious and the use of certain transports usually leads to physically inconsistent formulation of hypoelasticity. The aim of this paper is to present a consistent formulation of hypoelasticity and to combine it with a viscosity model to construct a consistent viscoelastic model. In particular, the hypoelastic model is reversible.
Sparse motion segmentation using multiple six-point consistencies
Zografos, Vasileios; Ellis, Liam
2010-01-01
We present a method for segmenting an arbitrary number of moving objects in image sequences, using the geometry of 6 points in 2D to infer motion consistency. The method has been evaluated on the Hopkins 155 database and surpasses current state-of-the-art methods such as SSC, both in terms of overall performance on two and three motions and in terms of maximum errors. The method works by finding initial clusters in the spatial domain, and then classifying each remaining point as belonging to the cluster that minimizes a motion consistency score. In contrast to most other motion segmentation methods, which are based on an affine camera model, the proposed method is fully projective.
Consistency and axiomatization of a natural extensional combinatory logic
蒋颖
1996-01-01
In the light of a question of J. L. Krivine about the consistency of an extensional λ-theory, an extensional combinatory logic ECL+U(G)+RU_∞+ is established, its consistency proved model-theoretically, and it is shown that it is not equivalent to any system of universal axioms. The theory expresses, in first order logic, that for every given group G of order n there simultaneously exist infinitely many universal retractions and a surjective n-tuple notion, such that each element of G acts as a permutation of the components of the n-tuple and as an Ap-automorphism of the model; further, each of the universal retractions is invariant under the action of the Ap-automorphisms induced by G. The difference between this theory and that of Krivine is that G need not be a symmetric group.
A minimal model of self-consistent partial synchrony
Clusella, Pau; Politi, Antonio; Rosenblum, Michael
2016-09-01
We show that self-consistent partial synchrony in globally coupled oscillatory ensembles is a general phenomenon. We analyze in detail appearance and stability properties of this state in possibly the simplest setup of a biharmonic Kuramoto-Daido phase model as well as demonstrate the effect in limit-cycle relaxational Rayleigh oscillators. Such a regime extends the notion of splay state from a uniform distribution of phases to an oscillating one. Suitable collective observables such as the Kuramoto order parameter allow detecting the presence of an inhomogeneous distribution. The characteristic and most peculiar property of self-consistent partial synchrony is the difference between the frequency of single units and that of the macroscopic field.
Planck 2013 results. XXXI. Consistency of the Planck data
Ade, P. A. R.; Arnaud, M.; Ashdown, M.
2014-01-01
by deviation of the ratio from unity) between 70 and 100 GHz power spectra averaged over 70 ≤ ℓ ≤ 390 at the 0.8% level, and agreement between 143 and 100 GHz power spectra of 0.4% over the same ℓ range. These values are within and consistent with the overall uncertainties in calibration given in the Planck 2013...... foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse....../100 ratio. Correcting for this, the 70, 100, and 143 GHz power spectra agree to 0.4% over the first two acoustic peaks. The likelihood analysis that produced the 2013 cosmological parameters incorporated uncertainties larger than this. We show explicitly that correction of the missing near sidelobe power...
Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)
Pelsser, A.; Stadje, M.A.
2012-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)
Pelsser, A.; Stadje, M.A.
2011-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati
Non-autonomous discrete Boussinesq equation: Solutions and consistency
Nong, Li-Juan; Zhang, Da-Juan
2014-07-01
A non-autonomous 3-component discrete Boussinesq equation is discussed. Its spacing parameters p_n and q_m are related to the independent variables n and m, respectively. We derive its bilinear form and solutions in Casoratian form. The plane wave factor is defined through the cubic roots of unity. The plane wave factor also leads to an extended non-autonomous discrete Boussinesq equation which contains a parameter δ. Three-dimensional consistency and the Lax pair of the obtained equation are discussed.
An Algebraic Characterization of Inductive Soundness in Proof by Consistency
邵志清; 宋国新
1995-01-01
Kapur and Musser studied the theoretical basis for proof by consistency and obtained an inductive completeness result: p = q if and only if p = q is true in every inductive model. However, there is a loophole in their proof of the soundness part: p = q implies that p = q is true in every inductive model. The aim of this paper is to give a correct characterization of inductive soundness from an algebraic view by introducing strong inductive models.
Consistency analysis of a nonbirefringent Lorentz-violating planar model
Casana, Rodolfo; Moreira, Roemir P M
2011-01-01
In this work we analyze the physical consistency of a nonbirefringent Lorentz-violating planar model via the pole structure of its Feynman propagators. The nonbirefringent planar model, obtained from the dimensional reduction of the CPT-even gauge sector of the standard model extension, is composed of a gauge field and a scalar field, affected by Lorentz-violating (LIV) coefficients encoded in the symmetric tensor $\\kappa_{\\mu\
Incomplete Lineage Sorting: Consistent Phylogeny Estimation From Multiple Loci
Mossel, Elchanan
2008-01-01
We introduce a simple algorithm for reconstructing phylogenies from multiple gene trees in the presence of incomplete lineage sorting, that is, when the topology of the gene trees may differ from that of the species tree. We show that our technique is statistically consistent under standard stochastic assumptions, that is, it returns the correct tree given sufficiently many unlinked loci. We also show that it can tolerate moderate estimation errors.
Consistent 4D Brain Extraction of Serial Brain MR Images
Wang, Yaping; Li, Gang; Nie, Jingxin; Yap, Pew-Thian; Guo, Lei; Shen, Dinggang
2013-01-01
Accurate and consistent skull stripping of serial brain MR images is of great importance in longitudinal studies that aim to detect subtle brain morphological changes. To avoid inconsistency and the potential bias introduced by independently performing skull-stripping for each time-point image, we propose an effective method that is capable of skull-stripping serial brain MR images simultaneously. Specifically, all serial images of the same subject are first affine aligned in a groupwise mann...
Consistency argument and classification problem in λ-calculus
王驹; 赵希顺; 黄且圆; 蒋颖
1999-01-01
Enlightened by Mal’cev’s theorem in universal algebra, a new criterion for consistency argument in λ-calculus is introduced. It is equivalent to Jacopini’s and Baeten-Boerboom’s, but more convenient to use. Based on the new criterion, an enhanced technique is used to show several results that provide deeper insight into the classification problem of λ-terms with no normal forms.
Security Policy:Consistency,Adjustments and Restraining Factors
Yang Jiemian
2004-01-01
In the 2004 U.S. presidential election, despite sharply divided domestic opinion and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. It is obvious, based on the author's analysis, that security agendas such as counter-terrorism and the Iraq issue contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will basically be consistent.
Measuring Consistent Poverty in Ireland with EU SILC Data
Whelan, Christopher T.; Nolan, Brian; Maitre, Bertrand
2006-01-01
In this paper we seek to make use of the newly available Irish component of the European Union Statistics on Income and Living Conditions (EU-SILC) in order to develop a measure of consistent poverty that overcomes some of the difficulties associated with the original indicators employed as targets in the Irish National Anti-Poverty Strategy. Our analysis leads us to propose a set of economic strain indicators that cover a broader range than the original basic deprivation set. The accumulated...