WorldWideScience

Sample records for videodisc system consisted

  1. Videodisc technology

    Energy Technology Data Exchange (ETDEWEB)

    Marsh, F.E. Jr.

    1981-03-01

    An overview of the technology of videodiscs is given. The emphasis is on systems that use reflection or transmission of laser light. Possible use of videodiscs for storage of bibliographic information is considered. 6 figures, 3 tables. (RWR)

  2. Preliminary evaluation of learning via the AI/LEARN/Rheumatology interactive videodisc system.

    OpenAIRE

    Mitchell, J. A.; Bridges, A. J.; Reid, J. C.; Cutts, J. H.; Hazelwood, S.; Sharp, G. C.

    1992-01-01

    AI/LEARN/Rheumatology is a level three videodisc system to teach clinical observational skills in three important diseases: rheumatoid arthritis, osteoarthritis, and ankylosing spondylitis. The AI/LEARN software was developed on an independent authoring system called GALE designed for MS-DOS based computers. The purpose of this paper is to present preliminary data about the efficacy of teaching by the use of an interactive videodisc system as evaluated by examinations centered upon disease-or...

  3. Videodisc Technology.

    Science.gov (United States)

    Marsh, Fred E., Jr.

    1982-01-01

    Identifies and describes the major areas of videodisc technology; discusses the operation, reliability, storage capacities, and applications of two types of laser systems; and illustrates the versatility of the optical digital disc through a description of its ability to digitize large bodies of data. Included are six figures and three tables.…

  4. Micro-Based Optical Videodisc Applications.

    Science.gov (United States)

    Chen, Ching-chih

    1985-01-01

    This overview of optical videodisc technology focuses on microcomputer-based applications for information processing. Topics discussed include fundamentals of videodisc technology, interactive videodisc technology, and associated hardware systems; government-supported and commercial videodisc and CD ROM information projects; and speculations on…

  5. Information Providers and Videodisc/Optical Disk Technology.

    Science.gov (United States)

    Galloway, Emily; Paris, Judith

    1983-01-01

    Explores the possibilities of using videodisc and optical disk technology as publishing media, highlighting the videodisc as an educational tool and visual supplement to online databases, digital database publishing on videodisc, optical disks for electronic document and image delivery systems, and costs associated with videodisc design and…

  6. Preliminary evaluation of learning via the AI/LEARN/Rheumatology interactive videodisc system.

    Science.gov (United States)

    Mitchell, J A; Bridges, A J; Reid, J C; Cutts, J H; Hazelwood, S; Sharp, G C

    1992-01-01

    AI/LEARN/Rheumatology is a level three videodisc system to teach clinical observational skills in three important diseases: rheumatoid arthritis, osteoarthritis, and ankylosing spondylitis. The AI/LEARN software was developed on an independent authoring system called GALE designed for MS-DOS based computers. The purpose of this paper is to present preliminary data about the efficacy of teaching by the use of an interactive videodisc system as evaluated by examinations centered upon disease-oriented learning objectives and by attitude questionnaires. We tested the efficacy of the AI/LEARN/Rheumatology system using both medical students and residents taking the rheumatology elective. Data collected were on learning, attitudes, and ranking of curricular elements of the rotation. We kept records on the student time and search path through the interactive videodisc system. Control data were collected during 1990, before the AI/LEARN/Rheumatology program was available. Data for the treatment groups were collected during 1991 and 1992, while the trainees used the AI/LEARN/Rheumatology system. The basic difference between the control year and the treatment year curricula was the substitution of AI/LEARN/Rheumatology for three hours of lecture covering the three target diseases. AI/LEARN/Rheumatology was as effective as traditional methods of instruction as measured by scores on a multiple choice test. Student and resident learning was related to the time spent on the system. Students and residents ranked the AI/LEARN/Rheumatology system as the single most helpful learning tool in their 8 week rheumatology block, ranking it above the examination of patients.

  7. Instructional Systems Development Model for Interactive Videodisc Training Delivery Systems. Volume I. Hardware, Software, and Procedures

    Science.gov (United States)

    1980-06-01

    Only fragments of the scanned report are legible in this excerpt: lesson design elements (simulation specs, answer processing, and response-history branching schemes); title, description, excerpts desired, and location of film or videotape. Media production includes the narration and soundtrack for the videodisc. At present, audio is available on the videodisc only with motion. The remaining extracted text is an illegible OCR of a distribution list.

  8. Enhancing Comprehension with Videodiscs.

    Science.gov (United States)

    Howson, Betty Ann; Davis, Hilarie

    1992-01-01

    Discusses the use of videodiscs to increase students' comprehension. Benefits of adding visual images to learning activities are discussed, videodiscs as sources of data for students to analyze are considered, and an example is given of using videodiscs to illustrate concepts in a chemistry class. (LRW)

  9. Using Videodiscs in Instruction: Realizing Their Potential through Instructional Design.

    Science.gov (United States)

    Reigeluth, Charles M.; Garfield, Joanne M.

    1984-01-01

    Examines the state-of-the-art of intelligent videodisc systems and of the aspects of instructional theory that have implications for design of hardware, software, and courseware for such systems. Some problems inhibiting the introduction of videodisc systems into education are discussed along with solutions to these inhibiting factors. (MBR)

  10. Using Videodiscs in Instruction: Realizing Their Potential through Instructional Design. IDD&E Working Paper No. 4.

    Science.gov (United States)

    Reigeluth, Charles M.; Garfield, Joanne M.

    Arguing that the systematic application of knowledge about instruction to videodisc technology is essential if the full potential of this medium is to be realized, this paper begins by discussing the need for intelligent videodisc technology in our educational system. A brief review of the state of the art in intelligent videodisc systems, which…

  11. Basics of Videodisc and Optical Disk Technology.

    Science.gov (United States)

    Paris, Judith

    1983-01-01

    Outlines basic videodisc and optical disk technology describing both optical and capacitance videodisc technology. Optical disk technology is defined as a mass digital image and data storage device and briefly compared with other information storage media including magnetic tape and microforms. The future of videodisc and optical disk is…

  12. Videodisc and Optical Disk: Technology, Research, and Applications.

    Science.gov (United States)

    Lunin, Lois F.

    1983-01-01

    Introduction to videodisc and optical disk technology (information storage media which are able to handle word, data, image, and sound) cites articles written about videodisc and optical disk applications, instructional use, videodisc research, and information retrieval. A list of 30 suggested readings and additional information resources are…

  13. Consistent Design of Dependable Control Systems

    DEFF Research Database (Denmark)

    Blanke, M.

    1996-01-01

    Design of fault handling in control systems is discussed, and a method for consistent design is presented.

  14. Videodisc Training Delivery System Project.

    Science.gov (United States)

    1982-07-01

    Only fragments of a videodisc production sheet are legible in this excerpt: form fields for lesson segment, SMPTE time codes to the compositor, cartridge page, estimated time (seconds), and branching, with notes sections for graphics (prop, art, photo), video (still, motion, tape), compositor (external, text, animation), studio (split, quad, window, highlight), and programming (stop, calc).

  15. Using Interactive Videodiscs for Bilingual Education.

    Science.gov (United States)

    Copra, Edward R.

    1990-01-01

    This article describes "Hands On," a research project employing interactive computer/videodisc technology to teach English to deaf children with American Sign Language (ASL) skills. Elementary school students can read a story in printed English text, watch an ASL-signed version of the story, access a list of vocabulary words, or caption a story…

  16. Videodisc/Microcomputer Technology in Wildland Fire Behavior Training

    Science.gov (United States)

    M. J. Jenkins; K.Y. Matsumoto-Grah

    1987-01-01

    Interactive video is a powerful medium, bringing together the emotional impact of video and film and the interactive capabilities of the computer. Interactive videodisc instruction can be used as a tutorial, for drill and practice and in simulations, as well as for information storage. Videodisc technology is being used in industrial, military and medical applications...

  17. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...
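
    For context, a standard definition of a consistent price system from the transaction-costs literature is sketched below; the paper's exact formulation may differ in detail.

```latex
% Sketch of the standard definition: an epsilon-consistent price system for a
% price process S under proportional transaction costs \varepsilon > 0 is a
% pair (\tilde{S}, Q) such that
\[
Q \sim P, \qquad \tilde{S} \text{ is a } Q\text{-martingale}, \qquad
(1-\varepsilon)\, S_t \;\le\; \tilde{S}_t \;\le\; (1+\varepsilon)\, S_t
\quad \text{for all } t .
\]
```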

  18. GALE: a graphics assisted learning environment for computer-based interactive videodisc education.

    Science.gov (United States)

    Cutts, J H; Hazelwood, S E; Mitchell, J A; Bridges, A J; Reid, J C

    1992-08-01

    GALE, a Graphics Assisted Learning Environment, is a computer-based interactive videodisc authoring tool. GALE was created as the authoring package for AI/LEARN/Rheumatology, an independent study system for teaching rheumatology to medical trainees. GALE has potential widespread application beyond rheumatology. Interactive videodisc technology is a prime feature of GALE. Other highlights are: WordPerfect macros which simplify programming, graphics-based large text characters, tracking of user responses, hypertext-like definition capabilities, color coded screens to distinguish between hypertext branches and the mainstream of the course content and ability to overlay text on the video image. GALE runs on a PC-compatible computer with selected Pioneer LaserDisc players. GALE uses WordPerfect 5.1 for text editing and has been designed for use by non-programmers.

  19. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works[1-3] have indicated a lack of experimental data for pure components and also for their mixtures...... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging experimental databank of lipids systems data in order to improve...

  20. A Selected Interactive Videodisc Bibliography. TDC Research Report No. 2.

    Science.gov (United States)

    Montgomery, Rae; Sayre, Scott

    This bibliography lists 360 monographs, journal articles, research reports, and conference proceedings on interactive videodisc and educational applications of this technology. Materials through December 1988 are included. A sidebar provides background on interactive video technology. (MES)

  1. Montevidisco: An Anecdotal History of an Interactive Videodisc.

    Science.gov (United States)

    Gale, Larrie E.

    1983-01-01

    The development of an interactive videodisc-microcomputer simulation of a visit to a Mexican village for college-level Spanish instruction is described. Problems encountered, production considerations, computer program development, hardware, and classroom results are discussed. (MSE)

  2. Using Interactive Videodisc To Teach Psychomotor Skills to Nursing Students

    Science.gov (United States)

    Renshaw, Sharon M.; Beadenkopf, F. Scott; Murray, Rodney

    1989-01-01

    An interactive videodisc program on the process of administering medications to clients will be demonstrated. Discussion will center on the strengths and limitations of interactive video for teaching psychomotor skills to healthcare professionals as well as design modifications that will facilitate this process. Interactive videodisc technology provides an exciting new medium for teaching psychomotor clinical skills to health care professionals. It is a particularly valuable approach for complex skills which involve visualization of motor activities and extensive client assessments.

  3. Selected Conference Proceedings from the 1985 Videodisc, Optical Disk, and CD-ROM Conference and Exposition (Philadelphia, PA, December 10-12, 1985).

    Science.gov (United States)

    Cerva, John R.; And Others

    1986-01-01

    Eight papers cover: optical storage technology; cross-cultural videodisc design; optical disk technology use at the Library of Congress Research Service and National Library of Medicine; Internal Revenue Service image storage and retrieval system; solving business problems with CD-ROM; a laser disk operating system; and an optical disk for…

  4. Questioning Categories Used By Elementary Science Teachers during Moving and Still Frames of Videodisc Instruction.

    Science.gov (United States)

    Smith, Coralee S.; Barrow, Lloyd H.

    The purpose of the study reported in this paper was to examine the categories of teacher-asked questions while using moving and still frames of science videodisc instruction. Videotapes were made of 12 volunteer, Midwestern, urban, elementary teachers using videodisc instruction. Coding of the teacher-asked questioning categories was determined…

  5. The Power and Potential of Laser Videodisc Technology for Art Education in the 90's.

    Science.gov (United States)

    Schwartz, Bernard

    1991-01-01

    Explains that the laser videodisc is a versatile and cost-effective tool with enormous instructional potential for art education. Describes the origins, quality, and capability of videodiscs, and discusses the varieties of players and discs presently available. Maintains that this technology is especially relevant now that art education includes…

  6. Interactive Videodisc Design and Production, Workshop Guide. Volume 2

    Science.gov (United States)

    1983-12-01

    Frames, Animations: The bulk of text on the videodisc will come from a video character generator (CG). We use a Fernseh (Telemation) Compositor I. ... used the Compositor graphics digitizer for some of the graphics/animations. An example is a highly detailed animated bus on the ARI disc. Another ... considering in some applications. Art, photos, slides or video intended to be used with the compositor need to be considered carefully to be certain they work

  7. The Finitistic Consistency of Heck's Predicative Fregean System

    DEFF Research Database (Denmark)

    Cruz-Filipe, L.; Ferreira, Fernando

    2015-01-01

    Frege's theory is inconsistent (Russell's paradox). However, the predicative version of Frege's system is consistent. This was proved by Richard Heck in 1996 using a model theoretic argument. In this paper, we give a finitistic proof of this consistency result. As a consequence, Heck's predicative...
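
    For context, the inconsistency referred to above arises from Frege's Basic Law V; a sketch in modern notation (not taken from the paper) follows.

```latex
% Frege's Basic Law V: two concepts F and G have the same extension exactly
% when they are coextensive.
\[
\varepsilon F = \varepsilon G \;\longleftrightarrow\; \forall x\, (F x \leftrightarrow G x)
\]
% Combined with unrestricted (impredicative) comprehension this yields
% Russell's paradox; restricting comprehension to predicative instances, as in
% Heck's system, blocks the derivation.
```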

  8. Ensuring Data Consistency Over CMS Distributed Computing System

    CERN Document Server

    Rossman, Paul

    2009-01-01

    CMS utilizes a distributed infrastructure of computing centers to custodially store data, to provide organized processing resources, and to provide analysis computing resources for users. Integrated over the whole system, even in the first year of data taking, the available disk storage approaches 10 petabytes of space. Maintaining consistency between the data bookkeeping, the data transfer system, and physical storage is an interesting technical and operations challenge. In this paper we will discuss the CMS effort to ensure that data is consistently available at all computing centers. We will discuss the technical tools that monitor the consistency of the catalogs and the physical storage as well as the operations model used to find and solve inconsistencies.
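
    The catalogue-versus-storage comparison described above can be illustrated with a minimal sketch; the function and file names below are hypothetical placeholders, not the actual CMS tools.

```python
# Minimal sketch of a storage-consistency check: compare the set of files a
# bookkeeping catalogue believes a site hosts with the files actually found in
# the site's storage namespace. Inputs are plain sets of logical file names;
# in practice they would come from the experiment's catalogues and a storage
# dump (details omitted here).

def consistency_report(catalog: set[str], on_disk: set[str]) -> dict[str, set[str]]:
    """Classify discrepancies between the catalogue and physical storage."""
    return {
        "missing": catalog - on_disk,  # catalogued but absent on disk (lost/corrupted)
        "dark": on_disk - catalog,     # present on disk but unknown to the catalogue
    }

# Example with toy file names:
catalog = {"/store/data/run1/f1.root", "/store/data/run1/f2.root"}
on_disk = {"/store/data/run1/f2.root", "/store/data/run1/orphan.root"}
print(consistency_report(catalog, on_disk))
```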

  9. The Consistency Service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2010-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failure, is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  10. The consistency service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2011-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  11. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems.

    Science.gov (United States)

    Jenkinson, Garrett; Zhong, Xiaogang; Goutsias, John

    2010-11-05

    Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Our approach provides an attractive statistical methodology for estimating thermodynamically feasible values for the rate
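
    For a closed reaction system, the thermodynamic constraints referred to above are typically detailed-balance (Wegscheider) conditions; a generic statement, not taken from the paper, is sketched below.

```latex
% Detailed balance couples forward and reverse rate constants k_j^+ and k_j^-
% through equilibrium constants K_j, and every independent reaction cycle must
% satisfy a Wegscheider condition, so the rate constants cannot be estimated
% independently of one another:
\[
K_j = \frac{k_j^{+}}{k_j^{-}}, \qquad
\prod_{j} K_j^{\,\sigma_j} = 1
\quad \text{for every cycle with multipliers } \sigma_j .
\]
```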

  12. A Consistent Design Methodology for Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Sauzon G

    2005-01-01

    Complexity demand of modern communication systems, particularly in the wireless domain, grows at an astounding rate, a rate so high that the available complexity and even worse the design productivity required to convert algorithms into silicon are left far behind. This effect is commonly referred to as the design productivity crisis or simply the design gap. Since the design gap is predicted to widen every year, it is of utmost importance to look closer at the design flow of such communication systems in order to find improvements. While various ideas for speeding up designs have been proposed, very few have found their path into existing EDA products. This paper presents requirements for such tools and shows how an open design environment offers a solution to integrate existing EDA tools, allowing for a consistent design flow, considerably speeding up design times.

  13. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-11-01

    Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results: We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Conclusions: Our approach provides an attractive statistical methodology for

  14. Finite Fermi systems theory and self-consistency relations

    Energy Technology Data Exchange (ETDEWEB)

    Khodel, V.A.; Saperstein, E.E. (Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow. Inst. Atomnoj Ehnergii)

    1982-12-01

    The self-consistent theory of finite Fermi systems is outlined. This approach is based on the same Fermi liquid theory principles as the familiar theory of finite Fermi systems (FFS) by Migdal. We show that the basic Fermi system properties can be evaluated in terms of the quasiparticle Lagrangian L_q, which incorporates the energy dependence effects. This Lagrangian is defined so that the corresponding Lagrange equations coincide with the FFS theory equations of motion of the quasiparticles. The quasiparticle energy E_q, defined in terms of the quasiparticle Lagrangian L_q according to the usual canonical rules, is shown to be equal to the binding energy E_0 of the system. For a given Lagrangian L_q the particle densities in nuclei, the nuclear single-particle spectra, the low-lying collective states (LCS) properties, and the amplitude of the interquasiparticle interaction are also evaluated. The suggested approach is compared with the Hartree-Fock theory with effective forces.

  15. Measuring children's social skills using microcomputer-based videodisc assessment.

    Science.gov (United States)

    Irvin, L K; Walker, H M; Noell, J; Singer, G H; Irvine, A B; Marquez, K; Britz, B

    1992-10-01

    This article describes the development of a microcomputer-based videodisc assessment prototype for measuring children's social skills. The theoretical and empirical foundations for the content are described, and the contributions of interactive microcomputer-based video technology to assessment of children with handicaps are detailed. An application of Goldfried and D'Zurilla's "behavior-analytic" approach to development of the content of assessments is presented, and the related video and computer technology development is detailed. The article describes the conceptual foundations of the psychometrics of the assessment prototype as well as the psychometric methodology that was employed throughout the development process. Finally, a discussion of the potential applications and implications of the social skills assessment prototype is included.

  16. Student Consistency and Implications for Feedback in Online Assessment Systems

    Science.gov (United States)

    Madhyastha, Tara M.; Tanimoto, Steven

    2009-01-01

    Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…

  17. Understanding and Improving the Performance Consistency of Distributed Computing Systems

    NARCIS (Netherlands)

    Yigitbasi, M.N.

    2012-01-01

    With the increasing adoption of distributed systems in both academia and industry, and with the increasing computational and storage requirements of distributed applications, users inevitably demand more from these systems. Moreover, users also depend on these systems for latency and throughput

  18. An Evaluation of Information Consistency in Grid Information Systems

    CERN Document Server

    Field, Laurence

    2016-01-01

    A Grid information system resolves queries that may need to consider all information sources (Grid services), which are widely distributed geographically, in order to enable efficient Grid functions that may utilise multiple cooperating services. Fundamentally this can be achieved by either moving the query to the data (query shipping) or moving the data to the query (data shipping). Existing Grid information system implementations have adopted one of the two approaches. This paper explores the two approaches in further detail by evaluating them to the best possible extent with respect to Grid information system benchmarking metrics. A Grid information system that follows the data shipping approach based on the replication of information that aims to improve the currency for highly-mutable information is presented. An implementation of this, based on an Enterprise Messaging System, is evaluated using the benchmarking method and the consequence of the results for the design of Grid information systems is discu...

  19. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems

    OpenAIRE

    Goutsias John; Zhong Xiaogang; Jenkinson Garrett

    2010-01-01

    Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose...

  20. Using Interactive Videodiscs in Open University Courses. I.E.T. Papers on Broadcasting No. 218.

    Science.gov (United States)

    Fuller, Robert G., Ed.

    This nine-paper collection from a June 1983 Open University (OU) campus workshop in Milton Keynes, England, describes an interactive video project developed for an OU undergraduate course, T252, Introduction to Engineering Materials, and discusses varied aspects of interactive videodisc program development. The following papers are included:…

  1. Fidelity and Moral Authority: Ethical Issues in Videodisc Design for the Improvement of Teaching.

    Science.gov (United States)

    Campbell, Katy; And Others

    1995-01-01

    Discussion of the use of videodiscs to improve postsecondary teaching focuses on a project at the University of Alberta (Canada) that considered ethical standards in visual anthropology. Topics include identifying teaching examples, accountability, collaboration, integrity, intentionality, authenticity, and dignity and privacy. (LRW)

  2. Consistent Prediction of Properties of Systems with Lipids

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Lipids are found in almost all mixtures involving edible oils, fats and biodiesel. They are also being extracted for use in the pharma-industry. A database for pure components (lipids) present in these processes and mixtures properties has been developed and made available for different applications...... on the analysis, an uncertainty score is given, which is then used to define the weights of data-sets to be used for property model parameter regression. The performance of selected models for pure component properties as well as mixture properties when applied to systems containing lipids will be highlighted.

  3. An interactive histology image-barcode manual for a videodisc image library.

    Science.gov (United States)

    Ogilvie, R W

    1995-01-01

    Cell Biology and HISTOLOGY (alias Microanatomy, alias Microscopic Anatomy) is a required course for first-year medical and dental students in most health science centers. The traditional approach used in teaching this discipline is to present photomicrographic images of structures to students in lecture using 35 mm slides of fields seen through the microscope. The students then spend many hours viewing and studying specimens of tissues using a light microscope in a laboratory setting. Students in traditional courses of histology spend an inordinate amount of time learning the component structures by attempting to find and identify them in tissue sections using a microscope, where the structure being sought is surrounded by a multitude of other structures with which they are also not familiar. With the recent availability of videodisc stored image libraries of histological samples, it is now possible to study histological principles without the use of the microscope as the primary learning tool. A videodisc entitled " A Photographic Atlas" by S. Downing (published by Image Premastering Services Limited, Minneapolis, MN, 1991) has been incorporated into our histology course. Fifteen videodisc player stations are provided for 150 students. Images are retrieved by students using a bar code scanner attached to a videodisc player (Pioneer CLD-2400). Using this kind of image library, students can now learn basic histological structure, such as cell and tissue types, without the use of a microscope or as a tool for facilitating microscopy. The use of a videodisc library of randomly accessible images simplifies learning the basic components which all organs are composed of by presenting the learner with clear-cut examples to avoid confusion with other structures. However, videodisc players and TV monitors are still not appropriately priced for every student to own. This presents a problem in that the same images studied in class are not available to study and review outside

  4. The Development and Evaluation of a Computer-Based System for Managing the Design and Pilot-Testing of Interactive Videodisc Programs. Training and Development Research Center, Project Number Forty-Three.

    Science.gov (United States)

    Sayre, Scott Alan

    The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive video (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…

  5. The role of interactive control systems in obtaining internal consistency in the management control system package

    DEFF Research Database (Denmark)

    Toldbod, Thomas; Israelsen, Poul

    2014-01-01

    Companies rely on multiple Management Control Systems to obtain their short- and long-term objectives. When applying a multifaceted perspective on Management Control Systems, the concept of internal consistency has been found to be important in obtaining goal congruency in the company. However...... of MCSs when analyzing internal consistency in the MCS package and how managers obtain internal consistency in the new MCS package when an MCS change occurs. This study focuses specifically on changes to administrative controls that are not internally consistent with the current cybernetic controls. As top... management is aware of this shortcoming, they use the cybernetic controls more interactively to overcome it, whereby the cybernetic controls are also used as a learning platform and not just for performance control....

  6. A Hypertext Database for Accessing the International Veterinary Pathology Slide Bank Videodisc

    OpenAIRE

    Weeks, B.R.; Smith, R.; Snell, J.R.; Hall, S. Mark

    1990-01-01

    The International Veterinary Pathology Slide Bank Videodisc is an archival resource containing more than 12,000 color video images of interest to veterinary and comparative pathologists. To increase the utility of this database, we have developed a HyperCard-based database for Macintosh computers that allows rapid searches of the information associated with the images, and automatic display of specified images. Complex searches are handled using HyperKRS™, an indexing and search utility for H...

  7. Interactive Videodisc Technology: Applications to the Air Command and Staff College Curriculum.

    Science.gov (United States)

    1988-04-01

    Only fragments of the scanned report are legible in this excerpt: ... support interactive instruction. The videodisc can hold a large volume of pictures, both stills and movies, and a stereo sound ... The remaining fragments list curriculum topics: code of conduct seminar, ethics and leadership, professional ethics, Air Force professionals: the NCO, leadership American style, tips for ..., China, the Arab-Israeli conflict, introduction to the Islamic political world-view, East Asia and US security, contemporary Africa, and Latin America-United States.

  8. The Object Oriented Approach in Systems Analysis and Design Texts: Consistency within the IS Curriculum

    Science.gov (United States)

    Wood, David F.; Kohun, Frederick G.; Laverty, Joseph Packy

    2010-01-01

    This paper reports on a study of systems analysis textbooks in terms of topics covered and academic background of the authors. It addresses the consistency within IS curricula with respect to the content of a systems analysis and design course using the object-oriented approach. The research questions addressed were 1: Is there a consistency among…

  9. A Hierarchy of Discrete Integrable Coupling System with Self-Consistent Sources

    Directory of Open Access Journals (Sweden)

    Yuqing Li

    2014-01-01

    Integrable coupling system of a lattice soliton equation hierarchy is deduced. The Hamiltonian structure of the integrable coupling is constructed by using the discrete quadratic-form identity. The Liouville integrability of the integrable coupling is demonstrated. Finally, the discrete integrable coupling system with self-consistent sources is deduced.

  10. A consistent description of kinetics and hydrodynamics of quantum Bose-systems

    Directory of Open Access Journals (Sweden)

    P.A.Hlushak

    2004-01-01

    A consistent approach to the description of kinetics and hydrodynamics of many-Boson systems is proposed. The generalized transport equations for strongly and weakly nonequilibrium Bose systems are obtained. Here we use the method of nonequilibrium statistical operator by D.N. Zubarev. New equations for the time distribution function of the quantum Bose system with a separate contribution from both the kinetic and potential energies of particle interactions are obtained. The generalized transport coefficients are determined accounting for the consistent description of kinetic and hydrodynamic processes.

  11. A proposed grading system for standardizing tumor consistency of intracranial meningiomas.

    Science.gov (United States)

    Zada, Gabriel; Yashar, Parham; Robison, Aaron; Winer, Jesse; Khalessi, Alexander; Mack, William J; Giannotta, Steven L

    2013-12-01

    Tumor consistency plays an important and underrecognized role in the surgeon's ability to resect meningiomas, especially with evolving trends toward minimally invasive and keyhole surgical approaches. Aside from descriptors such as "hard" or "soft," no objective criteria exist for grading, studying, and conveying the consistency of meningiomas. The authors designed a practical 5-point scale for intraoperative grading of meningiomas based on the surgeon's ability to internally debulk the tumor and on the subsequent resistance to folding of the tumor capsule. Tumor consistency grades and features are as follows: 1) extremely soft tumor, internal debulking with suction only; 2) soft tumor, internal debulking mostly with suction, and remaining fibrous strands resected with easily folded capsule; 3) average consistency, tumor cannot be freely suctioned and requires mechanical debulking, and the capsule then folds with relative ease; 4) firm tumor, high degree of mechanical debulking required, and capsule remains difficult to fold; and 5) extremely firm, calcified tumor, approaches density of bone, and capsule does not fold. Additional grading categories included tumor heterogeneity (with minimum and maximum consistency scores) and a 3-point vascularity score. This grading system was prospectively assessed in 50 consecutive patients undergoing craniotomy for meningioma resection by 2 surgeons in an independent fashion. Grading scores were subjected to a linear weighted kappa analysis for interuser reliability. Fifty patients (100 scores) were included in the analysis. The mean maximal tumor diameter was 4.3 cm. The distribution of overall tumor consistency scores was as follows: Grade 1, 4%; Grade 2, 9%; Grade 3, 43%; Grade 4, 44%; and Grade 5, 0%. Regions of Grade 5 consistency were reported only focally in 14% of heterogeneous tumors. Tumors were designated as homogeneous in 68% and heterogeneous in 32% of grades. The kappa analysis score for overall tumor consistency
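
    The five grades described above can be restated as a simple lookup table (an illustrative restatement of the abstract, not an official encoding of the published scale):

```python
# Intraoperative meningioma consistency grades as described in the abstract
# (illustrative restatement only).
MENINGIOMA_CONSISTENCY_GRADES = {
    1: "Extremely soft: internal debulking with suction only",
    2: "Soft: mostly suction; remaining fibrous strands resected; capsule easily folded",
    3: "Average: mechanical debulking required; capsule then folds with relative ease",
    4: "Firm: high degree of mechanical debulking; capsule remains difficult to fold",
    5: "Extremely firm/calcified: approaches density of bone; capsule does not fold",
}
```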

  12. A Fully Consistent Hidden Semi-Markov Model-Based Speech Recognition System

    Science.gov (United States)

    Oura, Keiichiro; Zen, Heiga; Nankaku, Yoshihiko; Lee, Akinobu; Tokuda, Keiichi

    In a hidden Markov model (HMM), state duration probabilities decrease exponentially with time, which fails to adequately represent the temporal structure of speech. One of the solutions to this problem is integrating state duration probability distributions explicitly into the HMM. This form is known as a hidden semi-Markov model (HSMM). However, though a number of attempts to use HSMMs in speech recognition systems have been proposed, they are not consistent because various approximations were used in both training and decoding. By avoiding these approximations using a generalized forward-backward algorithm, a context-dependent duration modeling technique and weighted finite-state transducers (WFSTs), we construct a fully consistent HSMM-based speech recognition system. In a speaker-dependent continuous speech recognition experiment, our system achieved about 9.1% relative error reduction over the corresponding HMM-based system.
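
    The duration-modeling point above can be made explicit: in a standard HMM the implicit state-duration distribution is geometric, whereas an HSMM models it explicitly (a generic sketch, not the paper's notation).

```latex
% In an HMM with self-transition probability a_{ii}, the probability of staying
% in state i for exactly d frames is geometric and decays exponentially:
\[
P_i(d) = a_{ii}^{\,d-1} \, (1 - a_{ii}), \qquad d = 1, 2, \dots
\]
% An HSMM instead models the duration explicitly with a distribution of its own,
% e.g. a parametric p_i(d; \lambda_i), at the cost of a more expensive
% generalized forward-backward algorithm:
\[
P_i(d) = p_i(d;\, \lambda_i).
\]
```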

  13. Discretizing LTI Descriptor (Regular) Differential Input Systems with Consistent Initial Conditions

    Directory of Open Access Journals (Sweden)

    Athanasios D. Karageorgos

    2010-01-01

    A technique for efficiently discretizing the solution of a Linear descriptor (regular) differential input system with consistent initial conditions, and Time-Invariant coefficients (LTI), is introduced and fully discussed. Additionally, an upper bound for the error ‖x̄(kT) − x̄_k‖ that derives from the procedure of discretization is also provided. Practically speaking, we are interested in such kinds of systems, since they are inherent in many physical, economical and engineering phenomena.
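
    As background, the exact zero-order-hold discretization of an ordinary (non-descriptor) LTI system takes the familiar form below; the paper's descriptor-system treatment is more involved, so this is only an orienting sketch.

```latex
% Zero-order-hold discretization of \dot{x}(t) = A x(t) + B u(t) with sampling
% period T and u(t) held constant on [kT, (k+1)T):
\[
\bar{x}_{k+1} = e^{AT}\, \bar{x}_k
  + \left( \int_{0}^{T} e^{A\tau}\, d\tau \right) B\, u_k ,
\qquad \bar{x}_k \approx x(kT),
\]
% and the error bound discussed in the abstract controls the difference
% between x(kT) and the discretized state \bar{x}_k.
```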

  14. Consistency in Multi-Viewpoint Architectural Design of Enterprise Information Systems

    NARCIS (Netherlands)

    Dijkman, R.M.; Quartel, Dick; van Sinderen, Marten J.

    2006-01-01

    Different stakeholders in the design of an enterprise information system have their own view on that design. To help produce a coherent design this paper presents a framework that aids in specifying relations between such views. To help produce a consistent design the framework also aids in

  15. The potential for intelligent decision support systems to improve the quality and consistency of medication reviews.

    Science.gov (United States)

    Bindoff, I; Stafford, A; Peterson, G; Kang, B H; Tenni, P

    2012-08-01

    Drug-related problems (DRPs) are of serious concern worldwide, particularly for the elderly who often take many medications simultaneously. Medication reviews have been demonstrated to improve medication usage, leading to reductions in DRPs and potential savings in healthcare costs. However, medication reviews are not always of a consistently high standard, and there is often room for improvement in the quality of their findings. Our aim was to produce computerized intelligent decision support software that can improve the consistency and quality of medication review reports, by helping to ensure that DRPs relevant to a patient are overlooked less frequently. A system that largely achieved this goal was previously published, but refinements have been made. This paper examines the results of both the earlier and newer systems. Two prototype multiple-classification ripple-down rules medication review systems were built, the second being a refinement of the first. Each of the systems was trained incrementally using a human medication review expert. The resultant knowledge bases were analysed and compared, showing factors such as accuracy, time taken to train, and potential errors avoided. The two systems performed well, achieving accuracies of approximately 80% and 90%, after being trained on only a small number of cases (126 and 244 cases, respectively). Through analysis of the available data, it was estimated that without the system intervening, the expert training the first prototype would have missed approximately 36% of potentially relevant DRPs, and the second 43%. However, the system appeared to prevent the majority of these potential expert errors by correctly identifying the DRPs for them, leaving only an estimated 8% error rate for the first expert and 4% for the second. These intelligent decision support systems have shown a clear potential to substantially improve the quality and consistency of medication reviews, which should in turn translate into
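
    To make the knowledge-acquisition approach concrete, a heavily simplified single-classification ripple-down-rules sketch follows; the published system uses multiple-classification RDR and is more elaborate, and all names below are illustrative.

```python
# Heavily simplified single-classification ripple-down rules (RDR) sketch.
# Each rule has a condition, a conclusion, an "except" branch (refinements that
# apply when this rule fired but the expert later disagreed) and an "else"
# branch (alternatives tried when the condition does not hold).

from dataclasses import dataclass
from typing import Callable, Optional

Case = dict  # e.g. {"on_warfarin": True, "on_nsaid": True}

@dataclass
class Rule:
    condition: Callable[[Case], bool]
    conclusion: str
    except_branch: Optional["Rule"] = None
    else_branch: Optional["Rule"] = None

def infer(rule: Optional[Rule], case: Case, fallback: str = "no DRP identified") -> str:
    """Return the conclusion of the last rule that fires along the RDR path."""
    conclusion = fallback
    while rule is not None:
        if rule.condition(case):
            conclusion = rule.conclusion      # tentatively accept, then refine
            rule = rule.except_branch
        else:
            rule = rule.else_branch
    return conclusion

# Knowledge acquisition: when the expert disagrees with a conclusion, a new
# rule justified by the current case is attached as an exception (or an
# alternative), so behaviour on previously seen cases is preserved.
root = Rule(lambda c: c.get("on_warfarin") and c.get("on_nsaid"),
            "bleeding risk: warfarin + NSAID interaction")
root.except_branch = Rule(lambda c: c.get("gastroprotection"),
                          "interaction present but gastroprotection prescribed")

print(infer(root, {"on_warfarin": True, "on_nsaid": True}))
print(infer(root, {"on_warfarin": True, "on_nsaid": True, "gastroprotection": True}))
```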

  16. The Consistency of Performance Management System Based on Attributes of the Performance Indicator: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Jan Zavadsky

    2014-07-01

    Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational level. The effectiveness of the various management systems depends on many factors, one of which is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency in this case is based on the homogeneous definition of the attributes of the performance indicator as a basic element of the PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and group the various attributes of performance indicators. The main research results were obtained through an empirical study carried out in a sample of Slovak companies. The criterion for selection was the existence of management systems certified according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ2 test) due to the above standards. Findings: From the review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of a PMS is based not on a maximum or minimum number of attributes, but on the same type of attributes for each performance indicator used in the PMS at both the operational and strategic level. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies define various attributes of their performance indicators, but most of the performance indicators are defined differently; and we identified the attributes common to the whole sample of companies. Practical implications: The research results have implications for

  17. Metabolic Effects of Dietary Proteins, Amino Acids and The Other Amine Consisting Compounds on Cardiovascular System.

    Directory of Open Access Journals (Sweden)

    Elif Uğur

    2017-01-01

    Diet plays a vital role in the prevention and treatment of cardiovascular diseases, the leading cause of death worldwide. While nutrition programs for cardiovascular health generally focus on lipids and carbohydrates, the effects of proteins receive little attention. This review therefore examines the effects of proteins, amino acids, and other amine-containing compounds on the cardiovascular system. Because animal- and plant-derived proteins have different compositions in different foods, such as dairy products, egg, meat, chicken, fish, pulses and grains, their effects on blood pressure and on the regulation of the lipid profile differ. Likewise, the amino acids that make up proteins have different effects on the cardiovascular system. In particular, sulfur-containing amino acids, branched-chain amino acids, aromatic amino acids, arginine, ornithine, citrulline, glycine, and glutamine may affect the cardiovascular system through different metabolic pathways. In this context, one-carbon metabolism, hormone synthesis, stimulation of signaling pathways, and the effects of intermediate and final products of amino acid metabolism are considered. Besides proteins and amino acids, other amine-containing compounds in the diet include trimethylamine N-oxide, heterocyclic aromatic amines, polycyclic aromatic hydrocarbons and Maillard reaction products. These amine-containing compounds generally increase the risk of cardiovascular disease by promoting oxidative stress, inflammation, and the formation of atherosclerotic plaque.

  18. The self-consistent field model for Fermi systems with account of three-body interactions

    Directory of Open Access Journals (Sweden)

    Yu.M. Poluektov

    2015-12-01

    On the basis of a microscopic self-consistent field model, the thermodynamics of the many-particle Fermi system at finite temperatures with account of three-body interactions is constructed and the quasiparticle equations of motion are obtained. It is shown that a delta-like three-body interaction gives no contribution to the self-consistent field, and the description of three-body forces requires their nonlocality to be taken into account. The spatially uniform system is considered in detail, and on the basis of the developed microscopic approach general formulas are derived for the fermion effective mass and the system's equation of state with account of the contribution from three-body forces. The effective mass and pressure are numerically calculated for a "semi-transparent sphere" type potential at zero temperature. Expansions of the effective mass and pressure in powers of density are obtained. It is shown that, with account of only pair forces, a repulsive interaction reduces the quasiparticle effective mass relative to the mass of a free particle, while an attractive interaction raises the effective mass. The question of thermodynamic stability of the Fermi system is considered, and the three-body repulsive interaction is shown to extend the region of stability of a system with interparticle pair attraction. The quasiparticle energy spectrum is calculated with account of three-body forces.

  19. General approach to the testing of binary solubility systems for thermodynamic consistency. Consolidated Fuel Reprocessing Program

    Energy Technology Data Exchange (ETDEWEB)

    Hamm, L.L.; Van Brunt, V.

    1982-08-01

    A comparison of implicit Runge-Kutta and orthogonal collocation methods is made for the numerical solution to the ordinary differential equation which describes the high-pressure vapor-liquid equilibria of a binary system. The systems of interest are limited to binary solubility systems where one of the components is supercritical and exists as a noncondensable gas in the pure state. Of the two methods - implicit Runge-Kutta and orthogonal collocation - this paper attempts to present some preliminary but not necessarily conclusive results that the implicit Runge-Kutta method is superior for the solution to the ordinary differential equation utilized in the thermodynamic consistency testing of binary solubility systems. Due to the extreme nonlinearity of thermodynamic properties in the region near the critical locus, an extended cubic spline fitting technique is devised for correlating the P-x data. The least-squares criterion is employed in smoothing the experimental data. Even though the derivation is presented specifically for the correlation of P-x data, the technique could easily be applied to any thermodynamic data by changing the endpoint requirements. The volumetric behavior of the systems must be given or predicted in order to perform thermodynamic consistency tests. A general procedure is developed for predicting the volumetric behavior required and some indication as to the expected limit of accuracy is given.
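
    For orientation, thermodynamic consistency tests for isothermal binary P-x data ultimately rest on the Gibbs-Duhem equation; one common isothermal form is sketched below (a generic statement, not the high-pressure formulation developed in the report).

```latex
% Isothermal Gibbs-Duhem relation for a binary liquid phase (activity
% coefficients \gamma_i, excess volume V^E); consistency tests check that
% measured P-x data respect this constraint.
\[
x_1\, d\ln\gamma_1 + x_2\, d\ln\gamma_2 = \frac{V^{E}}{RT}\, dP
\qquad (\text{constant } T).
\]
```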

  20. Software-based microwave CT system consisting of antennas and vector network analyzer.

    Science.gov (United States)

    Ogawa, Takahiro; Miyakawa, Michio

    2011-01-01

    We have developed a software-based microwave CT (SMCT) that consists of antennas and a vector network analyzer. Regardless of the scanner type, SMCT collects the S-parameters at each measurement position in the frequency range of interest. After collecting all the S-parameters, it calculates the shortest path to obtain the projection data for CP-MCT. Because of the redundant data in SMCT, the calculation of the projection is easily optimized. Therefore, the system can improve the accuracy and stability of the measurement. Furthermore, the experimental system is constructed at a reasonable cost. Hence, SMCT is useful for imaging experiments for CP-MCT and particularly for basic studies. This paper describes the software-based microwave imaging system, and experimental results show the usefulness of the system.

  1. Incorporating rapid neocortical learning of new schema-consistent information into complementary learning systems theory.

    Science.gov (United States)

    McClelland, James L

    2013-11-01

    The complementary learning systems theory of the roles of hippocampus and neocortex (McClelland, McNaughton, & O'Reilly, 1995) holds that the rapid integration of arbitrary new information into neocortical structures is avoided to prevent catastrophic interference with structured knowledge representations stored in synaptic connections among neocortical neurons. Recent studies (Tse et al., 2007, 2011) showed that neocortical circuits can rapidly acquire new associations that are consistent with prior knowledge. The findings challenge the complementary learning systems theory as previously presented. However, new simulations extending those reported in McClelland et al. (1995) show that new information that is consistent with knowledge previously acquired by a putatively cortexlike artificial neural network can be learned rapidly and without interfering with existing knowledge; it is when inconsistent new knowledge is acquired quickly that catastrophic interference ensues. Several important features of the findings of Tse et al. (2007, 2011) are captured in these simulations, indicating that the neural network model used in McClelland et al. has characteristics in common with neocortical learning mechanisms. An additional simulation generalizes beyond the network model previously used, showing how the rate of change of cortical connections can depend on prior knowledge in an arguably more biologically plausible network architecture. In sum, the findings of Tse et al. are fully consistent with the idea that hippocampus and neocortex are complementary learning systems. Taken together, these findings and the simulations reported here advance our knowledge by bringing out the role of consistency of new experience with existing knowledge and demonstrating that the rate of change of connections in real and artificial neural networks can be strongly prior-knowledge dependent. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  2. Convective plasma stability consistent with MHD equilibrium in magnetic confinement systems with a decreasing field

    Science.gov (United States)

    Tsventoukh, M. M.

    2010-10-01

    A study is made of the convective (interchange, or flute) plasma stability consistent with equilibrium in magnetic confinement systems with a magnetic field decreasing outward and large curvature of magnetic field lines. Algorithms are developed which calculate convective plasma stability from the Kruskal-Oberman kinetic criterion and in which the convective stability is iteratively consistent with MHD equilibrium for a given pressure and a given type of anisotropy in actual magnetic geometry. Vacuum and equilibrium convectively stable configurations in systems with a decreasing, highly curved magnetic field are calculated. It is shown that, in convectively stable equilibrium, the possibility of achieving high plasma pressures in the central region is restricted either by the expansion of the separatrix (when there are large regions of a weak magnetic field) or by the filamentation of the gradient plasma current (when there are small regions of a weak magnetic field, in which case the pressure drops mainly near the separatrix). It is found that, from the standpoint of equilibrium and of the onset of nonpotential ballooning modes, a kinetic description of convective stability yields better plasma confinement parameters in systems with a decreasing, highly curved magnetic field than a simpler MHD model and makes it possible to substantially improve the confinement parameters for a given type of anisotropy. For the Magnetor experimental compact device, the maximum central pressure consistent with equilibrium and stability is calculated to be as high as β ˜ 30%. It is shown that, for the anisotropy of the distribution function that is typical of a background ECR plasma, the limiting pressure gradient is about two times steeper than that for an isotropic plasma. From a practical point of view, the possibility is demonstrated of achieving better confinement parameters of a hot collisionless plasma in systems with a decreasing, highly curved magnetic field than those

  3. Self-consistent field theory based molecular dynamics with linear system-size scaling

    Energy Technology Data Exchange (ETDEWEB)

    Richters, Dorothee [Institute of Mathematics and Center for Computational Sciences, Johannes Gutenberg University Mainz, Staudinger Weg 9, D-55128 Mainz (Germany); Kühne, Thomas D., E-mail: kuehne@uni-mainz.de [Institute of Physical Chemistry and Center for Computational Sciences, Johannes Gutenberg University Mainz, Staudinger Weg 7, D-55128 Mainz (Germany); Technical and Macromolecular Chemistry, University of Paderborn, Warburger Str. 100, D-33098 Paderborn (Germany)

    2014-04-07

    We present an improved field-theoretic approach to the grand-canonical potential suitable for linear scaling molecular dynamics simulations using forces from self-consistent electronic structure calculations. It is based on an exact decomposition of the grand-canonical potential for independent fermions and relies neither on the ability to localize the orbitals nor on the Hamilton operator being well-conditioned. Hence, this scheme enables highly accurate all-electron linear scaling calculations even for metallic systems. The inherent energy drift of Born-Oppenheimer molecular dynamics simulations, arising from an incomplete convergence of the self-consistent field cycle, is circumvented by means of a properly modified Langevin equation. The predictive power of the present approach is illustrated using the example of liquid methane under extreme conditions.
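    The compensation idea above, absorbing the systematic energy drift of incompletely converged SCF forces in a Langevin-type equation of motion, can be illustrated with a toy integrator. The sketch below is not the authors' scheme: it is a minimal one-particle Python example with placeholder parameter values, in which an extra friction coefficient gamma_D is added to the ordinary thermostat friction gamma_L so that the spurious heating from a noisy force is, on average, dissipated.

```python
import numpy as np

def langevin_bomd_step(x, v, force, dt, mass, kT, gamma_L, gamma_D, rng):
    """One Langevin step with an extra friction term gamma_D intended to absorb
    the systematic heating caused by approximate (noisy) forces.
    Toy sketch only; gamma_D would have to be calibrated against the observed drift."""
    gamma = gamma_L + gamma_D                      # total friction
    sigma = np.sqrt(2.0 * gamma * mass * kT / dt)  # fluctuation-dissipation noise strength
    f = force(x) + sigma * rng.standard_normal()   # deterministic plus random force
    v = v + dt * (f - gamma * mass * v) / mass     # simple velocity update
    x = x + dt * v                                 # position update
    return x, v

# Toy example: a harmonic force evaluated with artificial "SCF" noise.
rng = np.random.default_rng(0)
noisy_force = lambda x: -1.0 * x + 1e-2 * rng.standard_normal()

x, v = 1.0, 0.0
for _ in range(1000):
    x, v = langevin_bomd_step(x, v, noisy_force, dt=0.01, mass=1.0,
                              kT=0.1, gamma_L=0.5, gamma_D=0.05, rng=rng)
```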

  4. Coordinability and consistency in accident causation and prevention: formal system theoretic concepts for safety in multilevel systems.

    Science.gov (United States)

    Cowlagi, Raghvendra V; Saleh, Joseph H

    2013-03-01

    Although a "system approach" to accidents in sociotechnical systems has been frequently advocated, formal system theoretic concepts remain absent in the literature on accident analysis and system safety. To address this gap, we introduce the notions of coordinability and consistency from the hierarchical and multilevel systems theory literature. We then investigate the applicability and the importance of these concepts to accident causation and safety. Using illustrative examples, including the worst disaster in aviation history, and recent incidents in the United States of aircraft clipping each other on the tarmac, we propose that the lack of coordinability is a fundamental failure mechanism causing or contributing to accidents in multilevel systems. We make a similar case for the lack of consistency. Coordinability and consistency become ingredients for accident prevention, and their absence fundamental failure mechanisms that can lead to system accidents. Finally, using the concepts introduced in this work, we identify several venues for further research, including the development of a theory of coordination in multilevel systems, the investigation of potential synergies between coordinability, consistency, and the high reliability organizations paradigm, and the possibility of reframing the view that "sloppy management is the root cause of many industrial accidents" as one of lack of coordinability and/or consistency between management and operations. By introducing and expanding on the concepts of coordinability and consistency, we hope to contribute to the thinking about, and the to language of, accident causation, and prevention and to add to the intellectual toolkit of safety professionals and academics. © 2012 Society for Risk Analysis.

  5. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    Science.gov (United States)

    Astakhov, Vadim

    2009-01-01

    Interest in simulation of large-scale metabolic networks, species development, and genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as transition of the system functionality due to modification in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of dynamic core model provides a mathematical formalism to analyze migration of specific functions in biosystems which undergo structure transition induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poetic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight on processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement estimated constraints, will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  6. Consistent evaluation of an ultrasound-guided surgical navigation system by utilizing an active validation platform

    Science.gov (United States)

    Kim, Younsu; Kim, Sungmin; Boctor, Emad M.

    2017-03-01

    Ultrasound image-guided needle tracking systems have been widely used due to their cost-effectiveness and nonionizing radiation properties. Various surgical navigation systems have been developed by utilizing state-of-the-art sensor technologies. However, ultrasound transmission beam thickness causes unfair initial evaluation conditions due to inconsistent placement of the target with respect to the ultrasound probe. This inconsistency also brings high uncertainty and results in large standard deviations for each measurement when we compare accuracy with and without the guidance. To resolve this problem, we designed a complete evaluation platform by utilizing our mid-plane detection and time-of-flight measurement systems. The evaluating system uses a PZT element target and an ultrasound transmitting needle. In this paper, we evaluated an optical tracker-based surgical ultrasound-guided navigation system whereby the optical tracker tracks marker frames attached to the ultrasound probe and the needle. We performed ten needle-guidance trials with a mid-plane adjustment algorithm and with a B-mode segmentation method. With the mid-plane adjustment, the result showed a mean error of 1.62 ± 0.72 mm. The mean error increased to 3.58 ± 2.07 mm without the mid-plane adjustment. Our evaluation system can reduce the effect of the beam-thickness problem, and measure ultrasound image-guided technologies consistently with a minimal standard deviation. Using our novel evaluation system, ultrasound image-guided technologies can be compared under equal initial conditions. Therefore, the error can be evaluated more accurately, and the system provides a better analysis of error sources such as ultrasound beam thickness.

  7. Self-consistent gyrokinetic Vlasov-Maxwell system for nonlinear processes in plasmas

    Science.gov (United States)

    Liu, Pengfei; Zhang, Wenlu; Dong, Chao; Lin, Jingbo; Lin, Zhihong

    2017-10-01

    A self-consistent gyrokinetic Vlasov-Maxwell system capable of studying phenomena related to the ponderomotive force is developed under the long-wavelength approximation with a background Maxwellian distribution in the presence of electromagnetic fluctuations. According to the ordering analysis, the introduction of the quadratic Hamiltonian raises the order of the Vlasov-Maxwell system. Therefore, the guiding-center transformation is carried out up to order ε_B², the gyrocenter transformation is carried out up to order ε_δ², and higher-order terms of the first-order gyrocenter Hamiltonian H₁ and gauge field S₁ are retained. In this way, the effects of the inhomogeneities of the equilibrium profile, as well as of the curvature of the equilibrium magnetic field, on the moments of the distribution are also presented.

  8. Consistency from the perspective of an experimental systems approach to the sciences and their epistemic objects

    Directory of Open Access Journals (Sweden)

    Hans-Jörg Rheinberger

    2011-06-01

    Full Text Available It is generally accepted that the development of the modern sciences is rooted in experiment. Yet for a long time, experimentation did not occupy a prominent role, neither in philosophy nor in history of science. With the 'practical turn' in studying the sciences and their history, this has begun to change. This paper is concerned with systems and cultures of experimentation and the consistencies that are generated within such systems and cultures. The first part of the paper exposes the forms of historical and structural coherence that characterize the experimental exploration of epistemic objects. In the second part, a particular experimental culture in the life sciences is briefly described as an example. A survey will be given of what it means and what it takes to analyze biological functions in the test tube.

  9. Self-consistent theory of finite Fermi systems and Skyrme–Hartree–Fock method

    Energy Technology Data Exchange (ETDEWEB)

    Saperstein, E. E., E-mail: saper@mbslab.kiae.ru; Tolokonnikov, S. V. [National Research Center Kurchatov Institute (Russian Federation)

    2016-11-15

    Recent results obtained on the basis of the self-consistent theory of finite Fermi systems by employing the energy density functional proposed by Fayans and his coauthors are surveyed. These results are compared with the predictions of Skyrme–Hartree–Fock theory involving several popular versions of the Skyrme energy density functional. Spherical nuclei are predominantly considered. The charge radii of even and odd nuclei and features of low-lying 2⁺ excitations in semimagic nuclei are discussed briefly. The single-particle energies of magic nuclei are examined in more detail with allowance for corrections to mean-field theory that are induced by particle coupling to low-lying collective surface excitations (phonons). The importance of taking into account, in this problem, nonpole (tadpole) diagrams, which are usually disregarded, is emphasized. The spectroscopic factors of magic and semimagic nuclei are also considered. In this problem, only the surface term stemming from the energy dependence induced in the mass operator by the exchange of surface phonons is usually taken into account. The volume contribution associated with the energy dependence initially present in the mass operator within the self-consistent theory of finite Fermi systems because of the exchange of high-lying particle–hole excitations is also included in the spectroscopic factor. The results of the first studies that employed the Fayans energy density functional for deformed nuclei are also presented.

  10. Application of the correlation consistent composite approach to biological systems and noncovalent interactions

    Science.gov (United States)

    Riojas, Amanda G.

    Advances in computing capabilities have facilitated the application of quantum mechanical methods to increasingly larger and more complex chemical systems, including weakly interacting and biologically relevant species. One such ab initio-based composite methodology, the correlation consistent composite approach (ccCA), has been shown to be reliable for the prediction of enthalpies of formation and reaction energies of main group species in the gas phase to within 1 kcal mol-1, on average, of well-established experiment, without dependence on experimental parameterization or empirical corrections. In this collection of work, ccCA has been utilized to determine the proton affinities of deoxyribonucleosides within an ONIOM framework (ONIOM-ccCA) and to predict accurate enthalpies of formation for organophosphorus compounds. Despite the complexity of these systems, ccCA is shown to result in enthalpies of formation to within ~2 kcal mol-1 of experiment and predict reliable reaction energies for systems with little to no experimental data. New applications for the ccCA method have also been introduced, expanding the utility of ccCA to solvated systems and complexes with significant noncovalent interactions. By incorporating the SMD solvation model into the ccCA formulation, the Solv-ccCA method is able to predict the pKa values of nitrogen systems to within 0.7 pKa unit (less than 1.0 kcal mol-1), overall. A hydrogen bonding constant has also been developed for use with weakly interacting dimers and small cluster compounds, resulting in ccCA interaction energies for water clusters and dimers of the S66 set to within 1.0 kcal mol-1 of well-established theoretical values.

  11. Consistent Probabilistic Description of the Neutral Kaon System: Novel Observable Effects

    CERN Document Server

    Bernabeu, J.; Villanueva-Perez, P.

    2013-01-01

    The neutral Kaon system has both CP violation in the mass matrix and a non-vanishing lifetime difference in the width matrix. This leads to an effective Hamiltonian which is not a normal operator, with incompatible (non-commuting) masses and widths. In the Weisskopf-Wigner Approach (WWA), by diagonalizing the entire Hamiltonian, the unphysical non-orthogonal "stationary" states $K_{L,S}$ are obtained. These states have complex eigenvalues whose real (imaginary) part does not coincide with the eigenvalues of the mass (width) matrix. In this work we describe the system as an open Lindblad-type quantum mechanical system due to Kaon decays. This approach, in terms of density matrices for initial and final states, provides a consistent probabilistic description, avoiding the standard problems because the width matrix becomes a composite operator not included in the Hamiltonian. We consider the dominant-decay channel to two pions, so that one of the Kaon states with definite lifetime becomes stable. This new approa...

  12. Self-consistent second-order Green's function perturbation theory for periodic systems.

    Science.gov (United States)

    Rusakov, Alexander A; Zgid, Dominika

    2016-02-07

    Despite recent advances, systematic quantitative treatment of the electron correlation problem in extended systems remains a formidable task. Systematically improvable Green's function methods capable of quantitatively describing weak and at least qualitatively strong correlations appear as promising candidates for computational treatment of periodic systems. We present a periodic implementation of temperature-dependent self-consistent 2nd-order Green's function (GF2) method, where the self-energy is evaluated in the basis of atomic orbitals. Evaluating the real-space self-energy in atomic orbitals and solving the Dyson equation in k-space are the key components of a computationally feasible algorithm. We apply this technique to the one-dimensional hydrogen lattice--a prototypical crystalline system with a realistic Hamiltonian. By analyzing the behavior of the spectral functions, natural occupations, and self-energies, we claim that GF2 is able to recover metallic, band insulating, and at least qualitatively Mott regimes. We observe that the iterative nature of GF2 is essential to the emergence of the metallic and Mott phases.
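    The central numerical step described above, a self-energy built in the atomic-orbital basis combined with a Dyson equation solved in k-space, can be written schematically as G(k, iωₙ) = [(iωₙ + μ)S(k) − F(k) − Σ(k, iωₙ)]⁻¹. The snippet below is only an illustrative skeleton with invented array shapes and data, not the authors' implementation.

```python
import numpy as np

def dyson_k(iw, mu, S_k, F_k, Sigma_k):
    """Solve G(k, iw) = [(iw + mu) S(k) - F(k) - Sigma(k, iw)]^-1 for every
    k-point and Matsubara frequency.
    Assumed shapes: S_k, F_k -> (nk, nao, nao); Sigma_k -> (nk, nw, nao, nao)."""
    nk, nw = Sigma_k.shape[0], Sigma_k.shape[1]
    nao = S_k.shape[-1]
    G = np.empty((nk, nw, nao, nao), dtype=complex)
    for k in range(nk):
        for n in range(nw):
            A = (iw[n] + mu) * S_k[k] - F_k[k] - Sigma_k[k, n]
            G[k, n] = np.linalg.inv(A)
    return G

# Minimal fake data just to show the call signature.
nk, nw, nao = 4, 8, 2
beta = 10.0
iw = 1j * np.pi / beta * (2 * np.arange(nw) + 1)   # fermionic Matsubara frequencies
S_k = np.tile(np.eye(nao), (nk, 1, 1))             # overlap matrices
F_k = np.tile(np.diag([-0.5, 0.5]), (nk, 1, 1))    # Fock matrices
Sigma_k = np.zeros((nk, nw, nao, nao), dtype=complex)
G = dyson_k(iw, mu=0.0, S_k=S_k, F_k=F_k, Sigma_k=Sigma_k)
```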

  13. Wire position system to consistently measure and record the location change of girders following ground changes

    Science.gov (United States)

    Choi, H. J.; Lee, S. B.; Lee, H. G.; Y Back, S.; Kim, S. H.; Kang, H. S.

    2017-07-01

    Several parts that comprise the large scientific device should be installed and operated at the accurate three-dimensional location coordinates (X, Y, and Z) where they should be subjected to survey and alignment. The location of the aligned parts should not be changed in order to ensure that the electron beam parameters (Energy 10 GeV, Charge 200 pC, and Bunch Length 60 fs, Emittance X/Y 0.481 μm/0.256 μm) of PAL-XFEL (X-ray Free Electron Laser of the Pohang Accelerator Laboratory) remain stable and can be operated without any problems. As time goes by, however, the ground goes through uplift and subsidence, which consequently deforms building floors. The deformation of the ground and buildings changes the location of several devices including magnets and RF accelerator tubes, which eventually leads to alignment errors (∆X, ∆Y, and ∆Z). Once alignment errors occur with regard to these parts, the electron beam deviates from its course and beam parameters change accordingly. PAL-XFEL has installed the Hydrostatic Leveling System (HLS) to measure and record the vertical change of buildings and ground consistently and systematically and the Wire Position System (WPS) to measure the two dimensional changes of girders. This paper is designed to introduce the operating principle and design concept of WPS and discuss the current situation regarding installation and operation.

  14. Steps towards a consistent Climate Forecast System Reanalysis wave hindcast (1979-2016)

    Science.gov (United States)

    Stopa, Justin E.; Ardhuin, Fabrice; Huchet, Marion; Accensi, Mickael

    2017-04-01

    Surface gravity waves are being increasingly recognized as playing an important role within the climate system. Wave hindcasts and reanalysis products of long time series (>30 years) have been instrumental in understanding and describing the wave climate for the past several decades and have allowed a better understanding of extreme waves and inter-annual variability. Wave hindcasts have the advantage of covering the oceans in higher space-time resolution than possible with conventional observations from satellites and buoys. Wave reanalysis systems like ECWMF's ERA-Interim directly included a wave model that is coupled to the ocean and atmosphere, otherwise reanalysis wind fields are used to drive a wave model to reproduce the wave field in long time series. The ERA Interim dataset is consistent in time, but cannot adequately resolve extreme waves. On the other hand, the NCEP Climate Forecast System (CFSR) wind field better resolves the extreme wind speeds, but suffers from discontinuous features in time which are due to the quantity and quality of the remote sensing data incorporated into the product. Therefore, a consistent hindcast that resolves the extreme waves still alludes us limiting our understanding of the wave climate. In this study, we systematically correct the CFSR wind field to reproduce a homogeneous wave field in time. To verify the homogeneity of our hindcast we compute error metrics on a monthly basis using the observations from a merged altimeter wave database which has been calibrated and quality controlled from 1985-2016. Before 1985 only few wave observations exist and are limited to a select number of wave buoys mostly in the North Hemisphere. Therefore we supplement our wave observations with seismic data which responds to nonlinear wave interactions created by opposing waves with nearly equal wavenumbers. Within the CFSR wave hindcast, we find both spatial and temporal discontinuities in the error metrics. The Southern Hemisphere often

  15. Dental students' consistency in applying the ICDAS system within paediatric dentistry.

    Science.gov (United States)

    Foley, J I

    2012-12-01

    To examine dental students' consistency in utilising the International Caries Detection and Assessment System (ICDAS) one and three months after training. A prospective study. All clinical dental students (Year Two: BDS2; Year Three: BDS3; Year Four: BDS4) as part of their education in Paediatric Dentistry at Aberdeen Dental School (n = 56) received baseline training by two "gold-standard" examiners and were advised to complete the 90-minute ICDAS e-learning program. Study One: One month later, the occlusal surfaces of 40 extracted primary and permanent molar teeth were examined and assigned both a caries code (0-6 scale) and a restorative code (0-9 scale). Study Two: The same teeth were examined three months later. Kappa statistics were used to determine inter- and intra-examiner reliability at baseline and after three months. In total, 31 students (BDS2: n = 9; BDS3: n = 8; BDS4: n = 14) completed both examinations. The inter-examiner reliability kappa scores for restoration codes for Study One and Study Two were: BDS2: 0.47 and 0.38; BDS3: 0.61 and 0.52 and BDS4: 0.56 and 0.52. The caries scores for the two studies were: BDS2: 0.31 and 0.20; BDS3: 0.45 and 0.32 and BDS4: 0.35 and 0.34. The intra-examiner reliability ranges for restoration codes were: BDS2: 0.20 to 0.55; BDS3: 0.34 to 0.72 and BDS4: 0.28 to 0.80. The intra-examiner reliability ranges for caries codes were: BDS2: 0.35 to 0.62; BDS3: 0.22 to 0.53 and BDS4: 0.22 to 0.65. The consistency of ICDAS codes varied between students and also between year groups. In general, consistency was greater for restoration codes.
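    Cohen's kappa, the agreement statistic used in this study, can be computed from two raters' codes in a few lines. The example codes below are invented for illustration; only the formula (observed minus chance agreement, normalized by one minus chance agreement) is standard.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    expected = sum(count_a[c] * count_b[c] for c in categories) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Hypothetical ICDAS caries codes (0-6) given by a student and a gold-standard examiner.
student = [0, 2, 2, 3, 1, 0, 4, 2, 5, 1]
gold    = [0, 2, 3, 3, 1, 0, 4, 1, 5, 2]
print(round(cohens_kappa(student, gold), 2))
```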

  16. Consistency analysis for the performance of planar detector systems used in advanced radiotherapy

    Directory of Open Access Journals (Sweden)

    Kanan Jassal

    2015-03-01

    Full Text Available Purpose: To evaluate the performance linked to the consistency of a-Si EPID and ion-chamber array detectors for dose verification in advanced radiotherapy. Methods: Planar measurements were made for 250 patients using an array of ion chambers and an a-Si EPID. For pre-treatment verification, the plans were generated on the phantom for re-calculation of doses. The γ-evaluation method with the criteria dose-difference (DD) ≤ 3% and distance-to-agreement (DTA) ≤ 3 mm was used for the comparison of measurements. Also, the central axis (CAX) doses were measured using a 0.125 cc ion chamber and were compared with the central chamber of the array and the central-pixel correlated dose value from the EPID image. Two types of statistical approaches were applied for the analysis. Conventional statistics used analysis of variance (ANOVA) and the unpaired t-test to evaluate the performance of the detectors, and statistical process control (SPC) was utilized to study the statistical variation of the measured data. Control charts (CC) based on the average (x̄), the standard deviation (σ), and exponentially weighted moving averages (EWMA) were prepared. The capability index (Cpm) was determined as an indicator of the performance consistency of the two systems. Results: Array and EPID measurements had average gamma pass rates of 99.9% ± 0.15% and 98.9% ± 1.06%, respectively. For the point doses, the 0.125 cc chamber results were within 2.1% ± 0.5% of the central chamber of the array. Similarly, CAX doses from the EPID and the chamber matched within 1.5% ± 0.3%. The control charts showed that both detectors were performing optimally and all the data points were within ±5%. EWMA charts revealed that both detectors had a slow drift along the mean of the processes, but it was found to be well within ±3%. Further, higher Cpm values for the EPID demonstrate its higher efficiency for radiotherapy techniques. Conclusion: The performances of both detectors were seen to be of high quality irrespective of the
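    The γ-evaluation combines the dose-difference (DD) and distance-to-agreement (DTA) criteria into a single index, with γ ≤ 1 counted as a pass. A deliberately simplified one-dimensional, globally normalized sketch is given below to show the principle; clinical implementations are 2-D/3-D, interpolate between points, and may use local normalization, and the dose profiles here are invented.

```python
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
    """Simplified 1-D global gamma index.
    dd: dose-difference criterion as a fraction of the reference maximum.
    dta: distance-to-agreement criterion in the same units as x (e.g. mm)."""
    d_norm = dd * dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        # Gamma is the minimum combined dose/distance metric over all evaluated points.
        gamma[i] = np.min(np.sqrt(((x - xi) / dta) ** 2 +
                                  ((dose_eval - di) / d_norm) ** 2))
    return gamma

# Invented profiles: a measured profile slightly shifted and scaled w.r.t. the plan.
x = np.arange(0.0, 100.0, 1.0)                         # positions in mm
dose_ref = np.exp(-((x - 50.0) / 15.0) ** 2)           # planned dose
dose_eval = 1.01 * np.exp(-((x - 50.5) / 15.0) ** 2)   # measured dose
g = gamma_1d(x, dose_ref, dose_eval)
print(f"pass rate: {100.0 * np.mean(g <= 1.0):.1f}%")
```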

  17. Consistency in mixed demand systems: Contingent valuation and travel cost data

    NARCIS (Netherlands)

    Cunha-E-Sá, Maria A.; Ducla-Soares, Maria M.; Nunes, Luis C.; Polome, Philippe

    2004-01-01

    In contrast to previous literature, we propose a consistency test that does not impose any particular common functional form for the preference structure underlying the travel cost (TC) and contingent valuation (CV) models. We derive testable consistency conditions between TC and CV data in the

  18. Assessment of the Degree of Consistency of the System of Fuzzy Rules

    Directory of Open Access Journals (Sweden)

    Pospelova Lyudmila Yakovlevna

    2013-12-01

    Full Text Available The article analyses recent achievements and publications and shows that difficulties in explaining the nature of fuzziness and equivocation arise in socio-economic models that use the traditional paradigm of classical rationalism (computational, agent-based, and econometric models). The accumulated collective experience in developing optimal models confirms the promise of the fuzzy-set approach to modelling society. The article justifies the need to study the nature of inconsistency in fuzzy knowledge bases both at the generalised ontology level and at the pragmatic functional level of logical inference. It offers a method for finding logical and conceptual contradictions in the form of a combination of abduction and modus ponens. The key issue of the proposed method is discussed: what properties the membership function of the secondary fuzzy set should have, where this set describes, in fuzzy inference models, a resulting state of the object of management that combines empirically incompatible properties with high probability. The degree of membership of the object of management in several incompatible classes with respect to the fuzzy output variable is the degree of fuzziness of the statement "the intersection of all results of the fuzzy inference of the set of rules applied at some input is an empty set". The article describes an algorithm for assessing the degree of consistency and provides an example of the step-by-step detection of contradictions in statistical fuzzy knowledge bases at the pragmatic functional level of logical inference. The obtained test results, in the form of sets of incompatible facts, output chains, sets of non-intersecting intervals, and computed degrees of inconsistency, allow experts to eliminate inadmissible contradictions in a timely manner and, at the same time, to improve the quality of recommendations and the assessment of fuzzy expert systems.
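    The consistency measure described above can be pictured as the height of the pointwise intersection of the fuzzy outputs produced by the rules that fire for a given input: if that intersection is (nearly) empty, the rules assert empirically incompatible conclusions. The following sketch, with invented triangular membership functions, is only a schematic illustration of this idea, not the authors' algorithm.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def consistency_degree(domain_values, output_memberships):
    """Height of the intersection (pointwise minimum) of all inferred output fuzzy sets.
    A value near 0 means the fired rules yield an (almost) empty intersection,
    i.e. they are mutually inconsistent for this input."""
    intersection = np.min(np.vstack(output_memberships), axis=0)
    return float(intersection.max())

x = np.linspace(0.0, 10.0, 1001)
rule_low  = triangular(x, 0.0, 2.0, 4.0)   # one rule concludes "output is low"
rule_high = triangular(x, 6.0, 8.0, 10.0)  # another rule concludes "output is high"
print(consistency_degree(x, [rule_low, rule_high]))                       # ~0.0 -> contradictory
print(consistency_degree(x, [rule_low, triangular(x, 1.0, 3.0, 5.0)]))    # > 0  -> compatible
```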

  19. Performance of hybrid quad generation system consisting of solid oxide fuel cell system and absorption heat pump

    DEFF Research Database (Denmark)

    Cachorro, Irene Albacete; Daraban, Iulia Maria; Lainé, Guillaume

    2013-01-01

    In this paper a system consisting of an SOFC system for cogeneration of heat and power and vapour absorption heat pump for cooling and freezing is assessed and performance is evaluated. Food industry where demand includes four forms of energy simultaneously is a relevant application such a system...... with natural gas. The natural gas is first converted to a mixture of H2 and CO which feed the anode after a preheating step. The cathode is supplied with preheated air and gives, as output, electrical energy. The anode output is the exhaust gas which represents the thermal energy reservoir for heating...... in order to meet the bought cooling and freezing demands. This is an innovative configuration for absorption heat pumps because the cascade is implemented only in vapour compression heat pumps. A smaller ratio of the exhausted gases supplies the energy demand for space heating. The SOFC is fuelled...

  20. Process consistency in models : The importance of system signatures, expert knowledge, and process complexity

    NARCIS (Netherlands)

    Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H.H.G.; Gascuel-Odoux, C.

    2014-01-01

    Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased

  1. Adjoint-consistent formulations of slip models for coupled electroosmotic flow systems

    KAUST Repository

    Garg, Vikram V

    2014-09-27

    Background Models based on the Helmholtz 'slip' approximation are often used for the simulation of electroosmotic flows. The objectives of this paper are to construct adjoint-consistent formulations of such models, and to develop adjoint-based numerical tools for adaptive mesh refinement and parameter sensitivity analysis. Methods We show that the direct formulation of the 'slip' model is adjoint inconsistent, and leads to an ill-posed adjoint problem. We propose a modified formulation of the coupled 'slip' model, which is shown to be well-posed, and therefore automatically adjoint-consistent. Results Numerical examples are presented to illustrate the computation and use of the adjoint solution in two-dimensional microfluidics problems. Conclusions An adjoint-consistent formulation for Helmholtz 'slip' models of electroosmotic flows has been proposed. This formulation provides adjoint solutions that can be reliably used for mesh refinement and sensitivity analysis.

  2. STUDY OF TRANSIENT AND STATIONARY OPERATION MODES OF SYNCHRONOUS SYSTEM CONSISTING IN TWO MACHINES

    Directory of Open Access Journals (Sweden)

    V. S. Safaryan

    2017-01-01

    Full Text Available The solution of the problem of reliable functioning of an electric power system (EPS in steady-state and transient regimes, prevention of EPS transition into asynchronous regime, maintenance and restoration of stability of post-emergency processes is based on formation and realization of mathematical models of an EPS processes. During the functioning of electric power system in asynchronous regime, besides the main frequencies, the currents and voltages include harmonic components, the frequencies of which are multiple of the difference of main frequencies. At the two-frequency asynchronous regime the electric power system is being made equivalent in a form of a two-machine system, functioning for a generalized load. In the article mathematical models of transient process of a two-machine system in natural form and in d–q coordinate system are presented. The mathematical model of two-machine system is considered in case of two windings of excitement at the rotors. Also, in the article varieties of mathematical models of EPS transient regimes (trivial, simple, complete are presented. Transient process of a synchronous two-machine system is described by the complete model. The quality of transient processes of a synchronous machine depends on the number of rotor excitation windings. When there are two excitation windings on the rotor (dual system of excitation, the mathematical model of electromagnetic transient processes of a synchronous machine is represented in a complex form, i.e. in coordinate system d, q, the current of rotor being represented by a generalized vector. In asynchronous operation of a synchronous two-machine system with two excitation windings on the rotor the current and voltage systems include only harmonics of two frequencies. The mathematical model of synchronous steady-state process of a two-machine system is also provided, and the steady-state regimes with different structures of initial information are considered.

  3. Direct measurement of weakly nonequilibrium system entropy is consistent with Gibbs–Shannon form

    Science.gov (United States)

    Gavrilov, Momčilo; Chétrite, Raphaël; Bechhoefer, John

    2017-10-01

    Stochastic thermodynamics extends classical thermodynamics to small systems in contact with one or more heat baths. It can account for the effects of thermal fluctuations and describe systems far from thermodynamic equilibrium. A basic assumption is that the expression for Shannon entropy is the appropriate description for the entropy of a nonequilibrium system in such a setting. Here we measure experimentally this function in a system that is in local but not global equilibrium. Our system is a micron-scale colloidal particle in water, in a virtual double-well potential created by a feedback trap. We measure the work to erase a fraction of a bit of information and show that it is bounded by the Shannon entropy for a two-state system. Further, by measuring directly the reversibility of slow protocols, we can distinguish unambiguously between protocols that can and cannot reach the expected thermodynamic bounds.
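    The bound tested in this experiment, that the average work to erase a fraction of a bit is at least k_B T times the reduction in Shannon entropy, takes only a few lines to write down. The numbers below are illustrative, not the measured values.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(p):
    """Shannon entropy (in nats) of a two-state system with P(state 0) = p."""
    q = 1.0 - p
    return -sum(pi * np.log(pi) for pi in (p, q) if pi > 0.0)

def erasure_work_bound(p_initial, p_final, T):
    """Landauer-type lower bound on the average work (in joules) to take a two-state
    memory from occupation probability p_initial to p_final at temperature T."""
    return k_B * T * (shannon_entropy(p_initial) - shannon_entropy(p_final))

# Illustrative: partial erasure of a symmetric bit (p = 0.5) to p = 0.9 at room temperature.
W_min = erasure_work_bound(0.5, 0.9, T=300.0)
print(f"{W_min:.2e} J  (full erasure would cost at least kT ln 2 = "
      f"{k_B * 300.0 * np.log(2):.2e} J)")
```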

  4. Properties of ground states of atomic nuclei in self-consistent theory of finite fermi-system

    Energy Technology Data Exchange (ETDEWEB)

    Sapershtejn, Eh.E.; Khodel' , V.A. (Gosudarstvennyj Komitet po Ispol' zovaniyu Atomnoj Ehnergii SSSR, Moscow. Inst. Atomnoj Ehnergii)

    1983-05-01

    Ground states of atomic nuclei are described within the framework of the self-consistent theory of finite Fermi systems. The developed approach is compared with the Hartree-Fock method with effective forces.

  5. Determining the Consistency of Information between Multiple Systems Used in Maritime Domain Awareness

    Science.gov (United States)

    2010-07-01

    a seven-month contract awarded to OODA Technologies in support of applied research project 11HL: Technologies ensuring the reliability of awareness...architectural modification would have to be made to enable the tracking of source consistency per ship. Loose coupling allows the possibility of reuse and...trieve the information. In the case of a modification in the website's search page output, the extraction would be compromised. As mentioned in Section

  6. Construction of an integrated enzyme system consisting azoreductase and glucose 1-dehydrogenase for dye removal.

    Science.gov (United States)

    Yang, Yuyi; Wei, Buqing; Zhao, Yuhua; Wang, Jun

    2013-02-01

    Azo dyes are toxic and carcinogenic and are often present in industrial effluents. In this research, azoreductase and glucose 1-dehydrogenase were coupled for both continuous generation of the cofactor NADH and azo dye removal. The results show that 85% maximum relative activity of azoreductase in an integrated enzyme system was obtained at the conditions: 1U azoreductase:10U glucose 1-dehydrogenase, 250mM glucose, 1.0mM NAD(+) and 150μM methyl red. Sensitivity analysis of the factors in the enzyme system affecting dye removal examined by an artificial neural network model shows that the relative importance of enzyme ratio between azoreductase and glucose 1-dehydrogenase was 22%, followed by dye concentration (27%), NAD(+) concentration (23%) and glucose concentration (22%), indicating none of the variables could be ignored in the enzyme system. Batch results show that the enzyme system has application potential for dye removal. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. How to consistently make your product, technology or system more environmentally-sustainable?

    DEFF Research Database (Denmark)

    Laurent, Alexis; Cosme, Nuno Miguel Dias; Molin, Christine

    the assessment of the product/technology in a life cycle perspective, from the extraction of raw materials through production and use/operation of the product up to its final disposal. Fully embracing these 2 features enables to minimize the risk of burden-shifting, e.g. if impacts on climate change are being......-hand with low environmental impacts, low-carbon emissions, low environmental footprints or more sustainability as a whole. To enable a scientifically-sound and consistent documentation of such sustainable development, quantitative assessments of all environmental impacts are needed. Life cycle assessment (LCA......-impact materials, identifying environmental hotspots parts of the life cycle with largest environmental impacts), making prospective simulations through scenario analyses, comparing and selecting most environmentally-friendly product/technology alternatives, reporting on the environmental performances...

  8. Fundamental measure theory for the inhomogeneous hard-sphere system based on Santos' consistent free energy.

    Science.gov (United States)

    Hansen-Goos, Hendrik; Mortazavifar, Mostafa; Oettel, Martin; Roth, Roland

    2015-05-01

    Based on Santos' general solution for the scaled-particle differential equation [Phys. Rev. E 86, 040102(R) (2012)], we construct a free-energy functional for the hard-sphere system. The functional is obtained by a suitable generalization and extension of the set of scaled-particle variables using the weighted densities from Rosenfeld's fundamental measure theory for the hard-sphere mixture [Phys. Rev. Lett. 63, 980 (1989)]. While our general result applies to the hard-sphere mixture, we specify remaining degrees of freedom by requiring the functional to comply with known properties of the pure hard-sphere system. Both for mixtures and pure systems, the functional can be systematically extended following the lines of our derivation. We test the resulting functionals regarding their behavior upon dimensional reduction of the fluid as well as their ability to accurately describe the hard-sphere crystal and the liquid-solid transition.

  9. Cosmological evolution and Solar System consistency of massive scalar-tensor gravity

    Science.gov (United States)

    de Pirey Saint Alby, Thibaut Arnoulx; Yunes, Nicolás

    2017-09-01

    The scalar-tensor theory of Damour and Esposito-Farèse recently gained some renewed interest because of its ability to suppress modifications to general relativity in the weak field, while introducing large corrections in the strong field of compact objects through a process called scalarization. A large sector of this theory that allows for scalarization, however, has been shown to be in conflict with Solar System observations when accounting for the cosmological evolution of the scalar field. We here study an extension of this theory by endowing the scalar field with a mass to determine whether this allows the theory to pass Solar System constraints upon cosmological evolution for a larger sector of coupling parameter space. We show that the cosmological scalar field goes first through a quiescent phase, similar to the behavior of a massless field, but then it enters an oscillatory phase, with an amplitude (and frequency) that decays (and grows) exponentially. We further show that after the field enters the oscillatory phase, its effective energy density and pressure are approximately those of dust, as expected from previous cosmological studies. Due to these oscillations, we show that the scalar field cannot be treated as static today on astrophysical scales, and so we use time-dependent perturbation theory to compute the scalar-field-induced modifications to Solar System observables. We find that these modifications are suppressed when the mass of the scalar field and the coupling parameter of the theory are in a wide range, allowing the theory to pass Solar System constraints, while in principle possibly still allowing for scalarization.

  10. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very

  11. Generation-based memory synchronization in a multiprocessor system with weakly consistent memory accesses

    Energy Technology Data Exchange (ETDEWEB)

    Ohmacht, Martin

    2017-08-15

    In a multiprocessor system, a central memory synchronization module coordinates memory synchronization requests responsive to memory access requests in flight, a generation counter, and a reclaim pointer. The central module communicates via point-to-point communication. The module includes a global OR reduce tree for each memory access requesting device, for detecting memory access requests in flight. An interface unit is implemented associated with each processor requesting synchronization. The interface unit includes multiple generation completion detectors. The generation count and reclaim pointer do not pass one another.

  12. Generation-based memory synchronization in a multiprocessor system with weakly consistent memory accesses

    Science.gov (United States)

    Ohmacht, Martin

    2014-09-09

    In a multiprocessor system, a central memory synchronization module coordinates memory synchronization requests responsive to memory access requests in flight, a generation counter, and a reclaim pointer. The central module communicates via point-to-point communication. The module includes a global OR reduce tree for each memory access requesting device, for detecting memory access requests in flight. An interface unit is implemented associated with each processor requesting synchronization. The interface unit includes multiple generation completion detectors. The generation count and reclaim pointer do not pass one another.

  13. Reducing Friction and Wear of Tribological Systems through Hybrid Tribofilm Consisting of Coating and Lubricants

    Directory of Open Access Journals (Sweden)

    Shuichiro Yazawa

    2014-06-01

    Full Text Available The role of surface protective additives becomes vital when operating conditions become severe and moving components operate in a boundary lubrication regime. After protecting film is slowly removed by rubbing, it can regenerate through the tribochemical reaction of the additives at the contact. However, there are limitations about the regeneration of the protecting film when additives are totally consumed. On the other hand, there are a lot of hard coatings to protect the steel surface from wear. These can enable the functioning of tribological systems, even in adverse lubrication conditions. However, hard coatings usually make the friction coefficient higher, because of their high interfacial shear strength. Amongst hard coatings, diamond-like carbon (DLC is widely used, because of its relatively low friction and superior wear resistance. In practice, conventional lubricants that are essentially formulated for a steel/steel surface are still used for lubricating machine component surfaces provided with protective coatings, such as DLCs, despite the fact that the surface properties of coatings are quite different from those of steel. It is therefore important that the design of additive molecules and their interaction with coatings should be re-considered. The main aim of this paper is to discuss the DLC and the additive combination that enable tribofilm formation and effective lubrication of tribological systems.

  14. Self-consistent calculations within the extended theory of finite Fermi systems

    Science.gov (United States)

    Avdeenkov, A.; Grümmer, F.; Kamerdzhiev, S.; Krewald, S.; Lyutorovich, N.; Speth, J.

    2007-09-01

    The Extended Theory of Finite Fermi Systems (ETFFS) describes nuclear excitations considering phonons and pairing degrees of freedom, using the effective Landau-Migdal interaction and nuclear mean fields obtained from experimental data. Here we employ the nuclear mean field derived from Skyrme interactions and the corresponding particle-hole interaction. This allows to extend the range of applicability of the ETFFS to experimentally not yet investigated short-lived isotopes. We find that Skyrme interactions which reproduce at the mean field level both ground state properties and nuclear excitations are able to describe the spreading widths of the giant resonances in the new approach, but produce shifts of the centroid energies. A renormalization of the Skyrme interactions is required for approaches going beyond the mean field level.

  15. Self-consistent calculations within the extended theory of finite Fermi systems

    Energy Technology Data Exchange (ETDEWEB)

    Avdeenkov, A. [Institut fuer Kernphysik, Forschungszentrum Juelich, 52425 Juelich (Germany); Institute of Physics and Power Engineering, 249020 Obninsk (Russian Federation); Gruemmer, F. [Institut fuer Kernphysik, Forschungszentrum Juelich, 52425 Juelich (Germany); Kamerdzhiev, S. [Institute of Physics and Power Engineering, 249020 Obninsk (Russian Federation); Krewald, S. [Institut fuer Kernphysik, Forschungszentrum Juelich, 52425 Juelich (Germany); Lyutorovich, N. [Institut fuer Kernphysik, Forschungszentrum Juelich, 52425 Juelich (Germany); Institute of Physics, St. Petersburg University (Russian Federation); Speth, J. [Institut fuer Kernphysik, Forschungszentrum Juelich, 52425 Juelich (Germany); Institute of Nuclear Physics, PAN, PL-31-342 Cracow (Poland)], E-mail: j.speth@tz-juelich.de

    2007-09-20

    The Extended Theory of Finite Fermi Systems (ETFFS) describes nuclear excitations considering phonons and pairing degrees of freedom, using the effective Landau-Migdal interaction and nuclear mean fields obtained from experimental data. Here we employ the nuclear mean field derived from Skyrme interactions and the corresponding particle-hole interaction. This allows to extend the range of applicability of the ETFFS to experimentally not yet investigated short-lived isotopes. We find that Skyrme interactions which reproduce at the mean field level both ground state properties and nuclear excitations are able to describe the spreading widths of the giant resonances in the new approach, but produce shifts of the centroid energies. A renormalization of the Skyrme interactions is required for approaches going beyond the mean field level.

  16. Neonatal protection by an innate immune system of human milk consisting of oligosaccharides and glycans.

    Science.gov (United States)

    Newburg, D S

    2009-04-01

    This review discusses the role of human milk glycans in protecting infants, but the conclusion that the human milk glycans constitute an innate immune system whereby the mother protects her offspring may have general applicability in all mammals, including species of commercial importance. Infants that are not breastfed have a greater incidence of severe diarrhea and respiratory diseases than those who are breastfed. In the past, this had been attributed primarily to human milk secretory antibodies. However, the oligosaccharides are major components of human milk, and milk is also rich in other glycans, including glycoproteins, mucins, glycosaminoglycans, and glycolipids. These milk glycans, especially the oligosaccharides, are composed of thousands of components. The milk factor that promotes gut colonization by Bifidobacterium bifidum was found to be a glycan, and such prebiotic characteristics may contribute to protection against infectious agents. However, the ability of human milk glycans to protect the neonate seems primarily to be due to their inhibition of pathogen binding to their host cell target ligands. Many such examples include specific fucosylated oligosaccharides and glycans that inhibit specific pathogens. Most human milk oligosaccharides are fucosylated, and their production depends on fucosyltransferase enzymes; mutations in these fucosyltransferase genes are common and underlie the various Lewis blood types in humans. Variable expression of specific fucosylated oligosaccharides in milk, also a function of these genes (and maternal Lewis blood type), is significantly associated with the risk of infectious disease in breastfed infants. Human milk also contains major quantities and large numbers of sialylated oligosaccharides, many of which are also present in bovine colostrum. These could similarly inhibit several common viral pathogens. Moreover, human milk oligosaccharides strongly attenuate inflammatory processes in the intestinal mucosa. These

  17. Final Scientific/Technical Report "Arc Tube Coating System for Color Consistency"

    Energy Technology Data Exchange (ETDEWEB)

    Buelow, Roger [Energy Focus, Inc., Solon, OH (United States); Jenson, Chris [Energy Focus, Inc., Solon, OH (United States); Kazenski, Keith [Energy Focus, Inc., Solon, OH (United States)

    2013-03-21

    DOE has enabled the use of coating materials using low cost application methods on light sources to positively affect the output of those sources. The coatings and light source combinations have shown increased lumen output of LED fixtures (1.5%-2.0%), LED arrays (1.4%) and LED powered remote phosphor systems Philips L-Prize lamp (0.9%). We have also demonstrated lifetime enhancements (3000 hrs vs 8000 hrs) and shifting to higher CRI (51 to 65) in metal halide high intensity discharge lamps with metal oxide coatings. The coatings on LEDs and LED products are significant as the market is moving increasingly more towards LED technology. Enhancements in LED performance are demonstrated in this work through the use of available materials and low cost application processes. EFOI used low refractive index fluoropolymers and low cost dipping processes for application of the material to surfaces related to light transmission of LEDs and LED products. Materials included Teflon AF, an amorphous fluorinated polymer and fluorinated acrylic monomers. The DOE SSL Roadmap sets goals for LED performance moving into the future. EFOI's coating technology is a means to shift the performance curve for LEDs. This is not limited to one type of LED, but is relevant across LED technologies. The metal halide work included the use of sol-gel solutions resulting in silicon dioxide and titanium dioxide coatings on the quartz substrates of the metal halide arc tubes. The coatings were applied using low cost dipping processes.

  18. Linearized cloudpoint curve correlation for ternary systems consisting of one polymer, one solvent and one non-solvent

    NARCIS (Netherlands)

    Boom, R.M.; Boom, R.M.; van den Boomgaard, Anthonie; van den Berg, J.W.A.; Smolders, C.A.; Smolders, C.A.

    1993-01-01

    A linear correlation function is found for cloudpoint composition curves of ternary systems consisting of one polymer, one solvent and one non-solvent. The conditions for validity of this correlation function appear to be that the polymer is strongly incompatible with the non-solvent, and that only

  19. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Science.gov (United States)

    Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman

    2012-01-01

    Lidar height data collected by the Geosciences Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...

  20. Model-based optimal control of a hybrid power generation system consisting of photovoltaic arrays and fuel cells

    Science.gov (United States)

    Zervas, P. L.; Sarimveis, H.; Palyvos, J. A.; Markatos, N. C. G.

    Hybrid renewable energy systems are expected to become competitive to conventional power generation systems in the near future and, thus, optimization of their operation is of particular interest. In this work, a hybrid power generation system is studied consisting of the following main components: photovoltaic array (PV), electrolyser, metal hydride tanks, and proton exchange membrane fuel cells (PEMFC). The key advantage of the hybrid system compared to stand-alone photovoltaic systems is that it can store efficiently solar energy by transforming it to hydrogen, which is the fuel supplied to the fuel cell. However, decision making regarding the operation of this system is a rather complicated task. A complete framework is proposed for managing such systems that is based on a rolling time horizon philosophy.
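    The rolling-time-horizon idea mentioned above re-solves a finite-horizon scheduling problem at every time step and applies only the first decision before rolling forward. The skeleton below shows that control loop with a naive stand-in optimizer; the forecasts, efficiencies, and decision rules are placeholders, not the model of this paper.

```python
def optimize_horizon(soc, solar_forecast, demand_forecast):
    """Stand-in for the finite-horizon optimization: decide, for each step of the
    horizon, how much PV surplus to send to the electrolyser (hydrogen storage)
    versus how much of the deficit to cover from the fuel cell. Here: a naive rule."""
    plan = []
    for solar, demand in zip(solar_forecast, demand_forecast):
        to_electrolyser = max(solar - demand, 0.0)   # store surplus as hydrogen
        from_fuel_cell = max(demand - solar, 0.0)    # cover deficit from storage
        plan.append((to_electrolyser, from_fuel_cell))
    return plan

def rolling_horizon(soc, solar, demand, horizon=6):
    """Apply only the first decision of each horizon, then roll forward one step."""
    applied = []
    for t in range(len(solar) - horizon):
        plan = optimize_horizon(soc, solar[t:t + horizon], demand[t:t + horizon])
        charge, discharge = plan[0]
        soc = min(max(soc + 0.7 * charge - discharge / 0.5, 0.0), 100.0)  # toy efficiencies
        applied.append((charge, discharge, soc))
    return applied

solar = [0, 0, 2, 5, 8, 9, 7, 4, 1, 0, 0, 0]    # invented kW profile
demand = [3, 3, 3, 4, 4, 5, 5, 5, 4, 4, 3, 3]   # invented kW profile
for step in rolling_horizon(soc=50.0, solar=solar, demand=demand):
    print(step)
```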

  1. Stability of Steady-State Motion of an Isolated System Consisting of a Rotating Body and Two Pendulums

    Science.gov (United States)

    Filimonikhin, G. B.; Filimonikhina, I. I.; Pirogov, V. V.

    2014-07-01

    An isolated mechanical system consisting of a rotating body and two pendulums fit on its longitudinal axis is studied. This system models how pendulum, ball, or fluid (ring) dampers decrease or increase the nutation angle of a spin-stabilized artificial satellite. The conditions of origin, existence, and cessation of the steady-state motion of the system, depending on its parameters, and the stability conditions for the primary motion (the body rotates about the longitudinal axis and the pendulums lie on the same line) and secondary motions (the body does not rotate around the longitudinal axis) are established. The residual nutation angle is estimated

  2. Air Force System Safety Handbook, Designing the Safest Possible Systems Consistent with Mission Requirements and Cost Effectiveness

    Science.gov (United States)

    2000-07-01

    is a discipline employed from the initial design steps through system demilitarization or disposal (a.k.a. "cradle to grave" or "womb to tomb"). 1.2...clear weather versus cold weather and limited visibility; or smooth, level desert versus mountainous terrain. (2) Artificial or induced environment...aircraft or buildings. (3) In compiling a preliminary hazard list (Figure A-5), the analyst should identify the natural and artificial environmental

  3. The self-consistent model of the anomalously slow relaxation of the systems nonwetting liquid-nanoporous medium

    Science.gov (United States)

    Borman, Vladimir Dmitrievich; Borodulya, Nikolay Andreevich; Belogorlov, Anton Anatolevich; Tronin, Vladimir Nikolaevich

    2017-11-01

    This paper presents a self-consistent model of the anomalously slow relaxation of nonwetting-liquid-nanoporous-medium systems with a random pore-size distribution, in which the interaction between local liquid cluster configurations changes during liquid outflow from the porous medium. A self-consistent equation is deduced whose solution determines the degree of filling of the porous medium as a function of time, θ(t). It is shown that the anomalously slow relaxation proceeds as the decay of interacting local metastable configurations initiated by thermal fluctuations. As time increases, the relaxation accelerates, leading to an avalanche-like outflow of fluid from the porous medium, which is connected with the weakening of the interaction between local configurations. The fraction of the liquid volume remaining in the porous medium follows the power law θ(t) ~ t^(-α(T,t)). It is shown that for the water-L23 system at the initial stage, in the time range 10 s < t < 10³ s, the exponent assumes a constant value, α ≈ const(T), while at the following stage an acceleration of the relaxation and an increase of the parameter α(T,t) are observed.
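    The power-law relaxation θ(t) ~ t^(-α) can be checked, and the exponent estimated over a chosen time window, by a linear fit in log-log coordinates. The sketch below uses synthetic data; the analysis in the paper is more involved because α itself drifts with time and temperature.

```python
import numpy as np

def fit_power_law_exponent(t, theta):
    """Estimate alpha in theta(t) ~ t**(-alpha) by least squares in log-log space."""
    slope, _ = np.polyfit(np.log(t), np.log(theta), 1)
    return -slope

# Synthetic relaxation data over the window 10 s < t < 1e3 s with alpha = 0.08.
t = np.logspace(1, 3, 50)
theta = 0.6 * t ** -0.08 * (1.0 + 0.01 * np.random.default_rng(1).standard_normal(50))
print(round(fit_power_law_exponent(t, theta), 3))
```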

  4. A SELF-CONSISTENT MODEL OF THE CIRCUMSTELLAR DEBRIS CREATED BY A GIANT HYPERVELOCITY IMPACT IN THE HD 172555 SYSTEM

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, B. C.; Melosh, H. J. [Department of Physics, Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States); Lisse, C. M. [JHU-APL, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Chen, C. H. [STScI, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Wyatt, M. C. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Thebault, P. [LESIA, Observatoire de Paris, F-92195 Meudon Principal Cedex (France); Henning, W. G. [NASA Goddard Space Flight Center, 8800 Greenbelt Road, Greenbelt, MD 20771 (United States); Gaidos, E. [Department of Geology and Geophysics, University of Hawaii at Manoa, Honolulu, HI 96822 (United States); Elkins-Tanton, L. T. [Department of Terrestrial Magnetism, Carnegie Institution for Science, Washington, DC 20015 (United States); Bridges, J. C. [Department of Physics and Astronomy, University of Leicester, Leicester LE1 7RH (United Kingdom); Morlok, A., E-mail: johns477@purdue.edu [Department of Physical Sciences, Open University, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)

    2012-12-10

    Spectral modeling of the large infrared excess in the Spitzer IRS spectra of HD 172555 suggests that there is more than 10¹⁹ kg of submicron dust in the system. Using physical arguments and constraints from observations, we rule out the possibility of the infrared excess being created by a magma ocean planet or a circumplanetary disk or torus. We show that the infrared excess is consistent with a circumstellar debris disk or torus, located at ~6 AU, that was created by a planetary scale hypervelocity impact. We find that radiation pressure should remove submicron dust from the debris disk in less than one year. However, the system's mid-infrared photometric flux, dominated by submicron grains, has been stable within 4% over the last 27 years, from the Infrared Astronomical Satellite (1983) to WISE (2010). Our new spectral modeling work and calculations of the radiation pressure on fine dust in HD 172555 provide a self-consistent explanation for this apparent contradiction. We also explore the unconfirmed claim that ~10⁴⁷ molecules of SiO vapor are needed to explain an emission feature at ~8 μm in the Spitzer IRS spectrum of HD 172555. We find that unless there are ~10⁴⁸ atoms or 0.05 M⊕ of atomic Si and O vapor in the system, SiO vapor should be destroyed by photo-dissociation in less than 0.2 years. We argue that a second plausible explanation for the ~8 μm feature can be emission from solid SiO, which naturally occurs in submicron silicate "smokes" created by quickly condensing vaporized silicate.
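    The statement that radiation pressure removes submicron grains within a year can be illustrated with the textbook ratio β of radiation pressure to gravity for a spherical grain, β = 3LQ_pr/(16πGMcρs), with grains above β ≈ 0.5 being unbound. The snippet below evaluates this criterion with rough assumed stellar parameters for HD 172555 and an assumed silicate grain density; it is an order-of-magnitude illustration, not the authors' detailed calculation.

```python
import numpy as np

# Physical constants (SI)
G = 6.674e-11
c = 2.998e8
L_sun = 3.828e26
M_sun = 1.989e30

def beta_radiation_pressure(s, rho=3300.0, L=9.5 * L_sun, M=1.8 * M_sun, Q_pr=1.0):
    """Ratio of radiation pressure to gravity for a spherical grain of radius s (m).
    beta = 3 L Q_pr / (16 pi G M c rho s); grains with beta >~ 0.5 are blown out.
    Stellar luminosity/mass and grain density are rough assumed values."""
    return 3.0 * L * Q_pr / (16.0 * np.pi * G * M * c * rho * s)

for radius_um in (0.1, 0.5, 1.0, 10.0):
    s = radius_um * 1e-6
    print(f"s = {radius_um:5.1f} um  ->  beta = {beta_radiation_pressure(s):6.2f}")
```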

  5. Systems biology definition of the core proteome of metabolism and expression is consistent with high-throughput data.

    Science.gov (United States)

    Yang, Laurence; Tan, Justin; O'Brien, Edward J; Monk, Jonathan M; Kim, Donghyuk; Li, Howard J; Charusanti, Pep; Ebrahim, Ali; Lloyd, Colton J; Yurkovich, James T; Du, Bin; Dräger, Andreas; Thomas, Alex; Sun, Yuekai; Saunders, Michael A; Palsson, Bernhard O

    2015-08-25

    Finding the minimal set of gene functions needed to sustain life is of both fundamental and practical importance. Minimal gene lists have been proposed by using comparative genomics-based core proteome definitions. A definition of a core proteome that is supported by empirical data, is understood at the systems-level, and provides a basis for computing essential cell functions is lacking. Here, we use a systems biology-based genome-scale model of metabolism and expression to define a functional core proteome consisting of 356 gene products, accounting for 44% of the Escherichia coli proteome by mass based on proteomics data. This systems biology core proteome includes 212 genes not found in previous comparative genomics-based core proteome definitions, accounts for 65% of known essential genes in E. coli, and has 78% gene function overlap with minimal genomes (Buchnera aphidicola and Mycoplasma genitalium). Based on transcriptomics data across environmental and genetic backgrounds, the systems biology core proteome is significantly enriched in nondifferentially expressed genes and depleted in differentially expressed genes. Compared with the noncore, core gene expression levels are also similar across genetic backgrounds (two times higher Spearman rank correlation) and exhibit significantly more complex transcriptional and posttranscriptional regulatory features (40% more transcription start sites per gene, 22% longer 5'UTR). Thus, genome-scale systems biology approaches rigorously identify a functional core proteome needed to support growth. This framework, validated by using high-throughput datasets, facilitates a mechanistic understanding of systems-level core proteome function through in silico models; it de facto defines a paleome.
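
    The reported enrichment of the core proteome in nondifferentially expressed genes is the kind of result that can be tested with a one-sided hypergeometric test, sketched below. The counts are placeholders, not the numbers behind the study's result.

        from scipy.stats import hypergeom

        def enrichment_p(n_genome, n_category, n_core, n_overlap):
            """P(overlap >= observed) when drawing n_core genes at random
            from a genome containing n_category genes in the category."""
            return hypergeom.sf(n_overlap - 1, n_genome, n_category, n_core)

        # Placeholder counts: 4000 genes, 2500 nondifferentially expressed,
        # 356 core genes of which 300 are nondifferentially expressed.
        p = enrichment_p(n_genome=4000, n_category=2500, n_core=356, n_overlap=300)
        print(f"one-sided hypergeometric p = {p:.2e}")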

  6. The development and validation of an assessment of safety awareness of science teachers using interactive videodisc technology

    Science.gov (United States)

    Lomask, Michal S.; Jacobson, Larry; Hafner, Laurin P.

    A new assessment of science teachers' knowledge of school lab safety management was developed and tested. The assessment, called the Safety Simulator, is based on Interactive Videodisc (IVD) technology and was developed as part of the Beginning Educator Support and Training program for beginning teachers in Connecticut. The Safety Simulator contains two phases: (1) a walk through the lab room, designed to assess teacher knowledge of current regulations on safety equipment and the storage of chemicals; and (2) a general science lab activity, performed by four middle school students, designed to assess teacher knowledge of school safety management and the ability to monitor students' work. Reliability of scores for the four categories of school lab safety (physical facilities, chemicals, lab techniques, and students' behavior), examined by calculating the mean correlation coefficients among three different scorers, was found to be moderate to high. Evidence for content and construct validity was gathered by examining job relatedness and safety expert judgment, and by comparing the performance of known groups.

  7. Stochastic Self-Consistent Second-Order Green's Function Method for Correlation Energies of Large Electronic Systems.

    Science.gov (United States)

    Neuhauser, Daniel; Baer, Roi; Zgid, Dominika

    2017-11-14

    The second-order Matsubara Green's function method (GF2) is a robust temperature-dependent quantum chemistry approach, extending beyond the random-phase approximation. However, until now the scope of GF2 applications was quite limited as they require computer resources that rise steeply with system size. Each step of the self-consistent GF2 calculation has two parts: estimating the self-energy from the previous step's Green's function, and updating the Green's function from the self-energy. The first part formally scales as the fifth power of the system size, while the second has a much gentler cubic scaling. Here, we develop a stochastic approach to GF2 (sGF2), which reduces the fifth power scaling of the first step to merely quadratic, leaving the overall sGF2 scaling as cubic. We apply the method to linear hydrogen chains with up to 1000 electrons, showing that the approach is numerically stable, efficient, and accurate. The stochastic errors are very small, on the order of 0.1% or less of the correlation energy for large systems, with only a moderate computational effort. The first iteration of GF2 is an MP2 calculation that is done in linear scaling; hence we obtain an extremely fast stochastic MP2 (sMP2) method as a byproduct. While here we consider finite systems with large band gaps, where finite-temperature effects are negligible at low temperatures, the sGF2 formalism is temperature dependent and general and can be applied to finite or periodic systems with small gaps at finite temperatures.
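
    The scaling reduction rests on replacing an exact contraction over many orbital indices with an average over random vectors. The sketch below shows the generic idea with a Hutchinson-style stochastic trace estimator; it illustrates the stochastic-sampling principle only and is not the sGF2 self-energy algorithm itself.

        import numpy as np

        def stochastic_trace(A, n_samples=200, rng=None):
            """Hutchinson estimator: tr(A) ~ mean over random +/-1 vectors of x @ A @ x.

            Each sample costs one matrix-vector product instead of a full
            contraction, trading a power of the system size for a controllable
            statistical error."""
            rng = rng or np.random.default_rng(0)
            n = A.shape[0]
            estimates = []
            for _ in range(n_samples):
                x = rng.choice([-1.0, 1.0], size=n)
                estimates.append(x @ A @ x)
            return np.mean(estimates), np.std(estimates) / np.sqrt(n_samples)

        A = np.random.default_rng(1).standard_normal((300, 300))
        A = A + A.T                                   # symmetric test matrix
        est, err = stochastic_trace(A)
        print(f"exact {np.trace(A):.2f}  stochastic {est:.2f} +/- {err:.2f}")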

  8. Separation of polysaccharides from rice husk and wheat bran using solvent system consisting of BMIMOAc and DMI.

    Science.gov (United States)

    Hou, Qidong; Li, Weizun; Ju, Meiting; Liu, Le; Chen, Yu; Yang, Qian; Wang, Jingyu

    2015-11-20

    A solvent system consisting of 1,3-dimethyl-2-imidazolidinone (DMI) and the ionic liquid 1-butyl-3-methylimidazolium acetate (BMIMOAc) was used to separate polysaccharides from rice husk and wheat bran. The effects of the DMI/BMIMOAc ratio, temperature, and time on the dissolution of rice husk and wheat bran were investigated, and the influence of anti-solvents on the regeneration of polysaccharide-rich material was evaluated. We found that the solvent system dissolves rice husk and wheat bran more effectively than pure BMIMOAc, and that polysaccharide-rich material can be effectively separated from the biomass solution. The polysaccharide content of the material regenerated from wheat bran reached as high as 94.4% when ethanol was used as the anti-solvent. Under optimized conditions, the extraction rate of polysaccharides from wheat bran reached as high as 71.8% at merely 50°C. The recycled solvent system showed a consistent ability to separate polysaccharides from rice husk and wheat bran. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Evaluation of the HFACS-ADF safety classification system: inter-coder consensus and intra-coder consistency.

    Science.gov (United States)

    Olsen, Nikki S; Shorrock, Steven T

    2010-03-01

    This article evaluates an adaptation of the human factors analysis and classification system (HFACS) adopted by the Australian Defence Force (ADF) to classify factors that contribute to incidents. Three field studies were undertaken to assess the reliability of HFACS-ADF in the context of a particular ADF air traffic control (ATC) unit. Study one was designed to assess inter-coder consensus between many coders for two incident reports. Study two was designed to assess inter-coder consensus between one participant and the previous original analysts for a large set of incident reports. Study three was designed to test intra-coder consistency for four participants over many months. For all studies, agreement was low at the level of both fine-level HFACS-ADF descriptors and high-level HFACS-type categories. A survey of participants suggested that they were not confident that HFACS-ADF could be used consistently. The three field studies reported suggest that the ADF adaptation of HFACS is unreliable for incident analysis at the ATC unit level, and may therefore be invalid in this context. Several reasons for the results are proposed, associated with the underlying HFACS model and categories, the HFACS-ADF adaptations, the context of use, and the conduct of the studies. Copyright 2009 Elsevier Ltd. All rights reserved.
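
    Inter-coder consensus for a categorical taxonomy such as HFACS is typically quantified by raw percentage agreement and a chance-corrected index such as Cohen's kappa. The sketch below computes both for two coders; the category labels and codings are placeholders, not data from the studies described.

        from collections import Counter

        def agreement_and_kappa(codes_a, codes_b):
            """Raw agreement and Cohen's kappa for two coders' categorical codes."""
            n = len(codes_a)
            p_obs = sum(a == b for a, b in zip(codes_a, codes_b)) / n
            freq_a, freq_b = Counter(codes_a), Counter(codes_b)
            p_exp = sum((freq_a[c] / n) * (freq_b[c] / n)
                        for c in set(freq_a) | set(freq_b))
            kappa = (p_obs - p_exp) / (1.0 - p_exp) if p_exp < 1.0 else 1.0
            return p_obs, kappa

        # Placeholder codings of ten contributing factors by two coders.
        coder1 = ["skill", "decision", "violation", "skill", "environment",
                  "skill", "decision", "decision", "violation", "skill"]
        coder2 = ["skill", "skill", "violation", "skill", "decision",
                  "skill", "decision", "violation", "violation", "environment"]
        print("agreement = %.2f, kappa = %.2f" % agreement_and_kappa(coder1, coder2))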

  10. An eddy-permitting, dynamically consistent adjoint-based assimilation system for the tropical Pacific: Hindcast experiments in 2000

    KAUST Repository

    Hoteit, Ibrahim

    2010-03-02

    An eddy-permitting adjoint-based assimilation system has been implemented to estimate the state of the tropical Pacific Ocean. The system uses the Massachusetts Institute of Technology's general circulation model and its adjoint. The adjoint method is used to adjust the model to observations by controlling the initial temperature and salinity; temperature, salinity, and horizontal velocities at the open boundaries; and surface fluxes of momentum, heat, and freshwater. The model is constrained with most of the available data sets in the tropical Pacific, including Tropical Atmosphere and Ocean, ARGO, expendable bathythermograph, and satellite SST and sea surface height data, and climatologies. Results of hindcast experiments in 2000 suggest that the iterated adjoint-based descent is able to significantly improve the model consistency with the multivariate data sets, providing a dynamically consistent realization of the tropical Pacific circulation that generally matches the observations to within specified errors. The estimated model state is evaluated both by comparisons with observations and by checking the controls, the momentum balances, and the representation of small-scale features that were not well sampled by the observations used in the assimilation. As part of these checks, the estimated controls are smoothed and applied in independent model runs to check that small changes in the controls do not greatly change the model hindcast. This is a simple ensemble-based uncertainty analysis. In addition, the original and smoothed controls are applied to a version of the model with doubled horizontal resolution resulting in a broadly similar “downscaled” hindcast, showing that the adjustments are not tuned to a single configuration (meaning resolution, topography, and parameter settings). The time-evolving model state and the adjusted controls should be useful for analysis or to supply the forcing, initial, and boundary conditions for runs of other models.
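
    The adjoint method in miniature: for a scalar linear model x_{k+1} = a·x_k with a quadratic misfit to observations, one backward sweep of the adjoint variable yields the exact gradient of the cost with respect to the initial condition, which a descent loop then uses to adjust that control. The sketch below is a toy illustration of this adjust-the-controls idea, not the MITgcm adjoint system used in the study.

        import numpy as np

        a, n_steps = 0.95, 40
        y_obs = 2.0 * a ** np.arange(n_steps + 1)        # synthetic observations (true x0 = 2)

        def cost_and_gradient(x0):
            """Forward run, then one backward (adjoint) sweep for dJ/dx0."""
            x = x0 * a ** np.arange(n_steps + 1)         # forward model x_{k+1} = a * x_k
            misfit = x - y_obs
            J = 0.5 * np.sum(misfit ** 2)
            lam = 0.0                                    # adjoint variable
            for k in range(n_steps, -1, -1):             # backward sweep
                lam = a * lam + misfit[k]
            return J, lam                                # lam equals dJ/dx0

        x0, lr = 0.0, 0.05                               # first guess and step size
        for _ in range(200):
            J, g = cost_and_gradient(x0)
            x0 -= lr * g
        print(f"recovered x0 = {x0:.4f}, final cost = {J:.2e}")   # x0 -> 2.0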

  11. An Overview on the Project to Develop Consistent Earth System Data Records for the Global Terrestrial Water Cycle

    Science.gov (United States)

    Sahoo, A. K.; Pan, M.; Gao, H.; Wood, E. F.; Houser, P. R.; Lettenmaier, D. P.; Pinker, R.; Kummerow, C. D.

    2008-12-01

    We aim to develop consistent, long-term Earth System Data Records (ESDRs) for the major components (storages and fluxes) of the terrestrial water cycle at a spatial resolution of 0.5 degrees (latitude-longitude) and for the period 1950 to near-present. The resulting ESDRs are intended to provide a consistent basis for estimating the mean state and variability of the land surface water cycle at the spatial scale of the major global river basins. The ESDRs to be produced include a) surface meteorology (precipitation, air temperature, humidity and wind), b) surface downward radiation (solar and longwave) and c) derived and/or assimilated fluxes and storages such as surface soil moisture storage, total basin water storage, snow water equivalent, storage in large lakes, reservoirs, and wetlands, evapotranspiration, and surface runoff. We construct data records for all variables back to 1950, recognizing that the post-satellite data will be of higher quality than pre-satellite (a reasonable compromise given the need for long-term records to define interannual and interdecadal variability of key water cycle variables). A distinguishing feature will be the inclusion of two variables that reflect the massive effects of anthropogenic manipulation of the terrestrial water cycle, specifically reservoir storage and irrigation water use. The overall goal of the project is to develop long-term, consistent ESDRs for terrestrial water cycle states and variables by updating and extending previously funded Pathfinder data set activities of the investigators, and by making the data set available to the scientific community and data users via a state-of-the-art internet web-portal. The ESDRs will utilize algorithms and methods that are well documented in the peer-reviewed literature. The ESDRs will merge satellite-derived products with predictions of the same variables by LSMs driven by merged satellite and in situ forcing data sets (most notably precipitation), with the constraint that the

  12. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of their argument to more complex systems.

  13. Consistency between kinetics and thermodynamics: general scaling conditions for reaction rates of nonlinear chemical systems without constraints far from equilibrium.

    Science.gov (United States)

    Vlad, Marcel O; Popa, Vlad T; Ross, John

    2011-02-03

    We examine the problem of consistency between the kinetic and thermodynamic descriptions of reaction networks. We focus on reaction networks with linearly dependent (but generally kinetically independent) reactions for which only some of the stoichiometric vectors attached to the different reactions are linearly independent. We show that for elementary reactions without constraints preventing the system from approaching equilibrium there are general scaling relations for nonequilibrium rates, one for each linearly dependent reaction. These scaling relations express the ratios of the forward and backward rates of the linearly dependent reactions in terms of products of the ratios of the forward and backward rates of the linearly independent reactions raised to different scaling powers; the scaling powers are elements of the transformation matrix, which relates the linearly dependent stoichiometric vectors to the linearly independent stoichiometric vectors. These relations are valid for any network of elementary reactions without constraints, linear or nonlinear kinetics, far from equilibrium or close to equilibrium. We show that similar scaling relations for the reaction routes exist for networks of nonelementary reactions described by the Horiuti-Temkin theory of reaction routes where the linear dependence of the mechanistic (elementary) reactions is transferred to the overall (route) reactions. However, in this case, the scaling conditions are valid only at the steady state. General relationships between reaction rates of the two levels of description are presented. These relationships are illustrated for a specific complex reaction: radical chlorination of ethylene.
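
    In compact form, the scaling condition described above, for a linearly dependent reaction d whose stoichiometric vector is a linear combination of the independent vectors, can be written as follows; the notation is chosen to match the abstract's wording rather than the paper's own symbols.

        \frac{r_d^{+}}{r_d^{-}}
          \;=\;
          \prod_{i}\left(\frac{r_i^{+}}{r_i^{-}}\right)^{\lambda_{di}},
        \qquad\text{where }
        \boldsymbol{\nu}_d=\sum_{i}\lambda_{di}\,\boldsymbol{\nu}_i ,

    with r^{+}/r^{-} the forward/backward rates, ν the stoichiometric vectors, and λ the transformation matrix relating the dependent to the independent stoichiometric vectors.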

  14. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS

    Directory of Open Access Journals (Sweden)

    Healey Sean P

    2012-10-01

    Full Text Available Abstract Background Lidar height data collected by the Geoscience Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform “shots,” which have been shown to be strongly correlated with aboveground forest biomass. Relationships observed at spatially coincident field plots may be used to model biomass at all GLAS shots, and well-established methods of model-based inference may then be used to estimate biomass and variance for specific spatial domains. However, the spatial pattern of GLAS acquisition is neither random across the surface of the earth nor is it identifiable with any particular systematic design. Undefined sample properties therefore hinder the use of GLAS in global forest sampling. Results We propose a method of identifying a subset of the GLAS data which can justifiably be treated as a simple random sample in model-based biomass estimation. The relatively uniform spatial distribution and locally arbitrary positioning of the resulting sample is similar to the design used by the US national forest inventory (NFI). We demonstrated model-based estimation using a sample of GLAS data in the US state of California, where our estimate of biomass (211 Mg/hectare) was within the 1.4% standard error of the design-based estimate supplied by the US NFI. The standard error of the GLAS-based estimate was significantly higher than the NFI estimate, although the cost of the GLAS estimate (excluding costs for the satellite itself) was almost nothing, compared to at least US$ 10.5 million for the NFI estimate. Conclusions Global application of model-based estimation using GLAS, while demanding significant consolidation of training data, would improve inter-comparability of international biomass estimates by imposing consistent methods and a globally coherent sample frame. The
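
    Model-based estimation of the kind described can be sketched as: fit a model linking a lidar height metric to field-plot biomass, predict at every shot in the sample, and report the mean prediction with a variance driven by model-parameter uncertainty. The code below is a minimal illustration with synthetic data and a simple linear model; it is not the estimator or the California data used in the study.

        import numpy as np

        rng = np.random.default_rng(42)

        # Synthetic "field plots": lidar height metric x and measured biomass y.
        x_plots = rng.uniform(5, 40, 120)                          # canopy height (m)
        y_plots = 8.0 * x_plots + rng.normal(0, 30, x_plots.size)  # biomass (Mg/ha)

        # Fit a linear model y = b0 + b1 * x at the plots.
        X = np.column_stack([np.ones_like(x_plots), x_plots])
        beta, *_ = np.linalg.lstsq(X, y_plots, rcond=None)
        resid = y_plots - X @ beta
        sigma2 = resid @ resid / (X.shape[0] - 2)
        cov_beta = sigma2 * np.linalg.inv(X.T @ X)

        # Predict at the lidar "shots" treated as a simple random sample.
        x_shots = rng.uniform(5, 40, 2000)
        Xs = np.column_stack([np.ones_like(x_shots), x_shots])
        mean_biomass = (Xs @ beta).mean()

        # Model-based variance of the mean from parameter uncertainty.
        xbar = Xs.mean(axis=0)
        se = np.sqrt(xbar @ cov_beta @ xbar)
        print(f"mean biomass = {mean_biomass:.1f} Mg/ha, SE = {se:.2f}")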

  15. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Geoffrey, E-mail: geoffrey.guest@ntnu.no; Bright, Ryan M., E-mail: ryan.m.bright@ntnu.no; Cherubini, Francesco, E-mail: francesco.cherubini@ntnu.no; Strømman, Anders H., E-mail: anders.hammer.stromman@ntnu.no

    2013-11-15

    Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean life time of 43 years, a climate change impact of 0.08 kg CO2eq per kg CO2 stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 CO2eq per kg CO2 stored. As an example, when biogenic CO2 from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO2eq per kg CO2 stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of

  16. Low Leachable Container System Consisting of a Polymer-Based Syringe with Chlorinated Isoprene Isobutene Rubber Plunger Stopper.

    Science.gov (United States)

    Kiminami, Hideaki; Takeuchi, Katsuyuki; Nakamura, Koji; Abe, Yoshihiko; Lauwers, Philippe; Dierick, William; Yoshino, Keisuke; Suzuki, Shigeru

    2015-01-01

    A 36 month leachable study on water for injection in direct contact with a polymer-based prefillable syringe consisting of a cyclo olefin polymer barrel, a chlorinated isoprene isobutene rubber plunger stopper, a polymer label attached to the barrel, and a secondary packaging was conducted at 25 ± 2 °C and 60 ± 5% relative humidity. Throughout the various comparison studies, no difference in the leachable amounts was observed by 36 months between this polymer-based prefilled syringe and a glass bottle used as a blank reference. No influence on the leachables study outcome was noted from the printed label and/or label adhesive or from the secondary packaging. In an additional study, no leachable acrylic acid, used as the label adhesive, was detected after extended storage for 45 months at 25 ± 2 °C and 60 ± 5% relative humidity as a worst case. To obtain more detail, a comparative extractable study was conducted between a cyclo olefin polymer barrel and a glass barrel. In addition, chlorinated isoprene isobutene rubber and bromo isoprene isobutene rubber were compared. As a result, no remarkable difference was found in the organic extractables for the syringe barrels. On the other hand, in the elemental extractable analysis, the values for the cyclo olefin polymer barrel were lower than those for the glass barrel. For the plunger stoppers, the chlorinated isoprene isobutene rubber applied in this study showed a lower extractable profile than the bromo isoprene isobutene rubber, for both organic and elemental extractables. In conclusion, the proposed polymer-based prefillable syringe system has great potential and represents a novel alternative that can achieve very low extractable profiles and can bring additional value to the highly sensitive biotech drug market. A 36 month leachable study on water for injection in direct contact within a cyclo olefin polymer barrel and chlorinated isoprene isobutene rubber plunger stopper that has a

  17. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with large-scale data management systems is to ensure consistency between the global file catalog and what is physically present on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka Dark Data). The system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain its internals and give some results.
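
    At its core, the consistency check compares the set of replicas the catalog believes exist on a storage element against the set of files actually found there: files on storage but not in the catalog are dark data, and catalogued files missing from storage are lost. The sketch below shows that comparison with plain Python sets; it is a toy illustration, not the Rucio API, and the file names are placeholders.

        def check_consistency(catalog_files, storage_files):
            """Compare catalog and storage listings for one storage element."""
            catalog, storage = set(catalog_files), set(storage_files)
            dark = storage - catalog    # on disk, unknown to the catalog -> deletion candidates
            lost = catalog - storage    # in the catalog, missing on disk -> recovery candidates
            return dark, lost

        catalog = ["data18/file_a.root", "data18/file_b.root", "mc16/file_c.root"]
        storage = ["data18/file_a.root", "mc16/file_c.root", "scratch/orphan.root"]

        dark, lost = check_consistency(catalog, storage)
        print("dark data:", sorted(dark))   # ['scratch/orphan.root']
        print("lost files:", sorted(lost))  # ['data18/file_b.root']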

  18. Unifying inflation with {lambda}CDM epoch in modified f(R) gravity consistent with Solar System tests

    Energy Technology Data Exchange (ETDEWEB)

    Nojiri, Shin' ichi [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan)], E-mail: nojiri@phys.nagoya-u.ac.jp; Odintsov, Sergei D. [Institucio Catalana de Recerca i Estudis Avancats (ICREA) and Institut de Ciencies de l' Espai (IEEC-CSIC), Campus UAB, Facultat de Ciencies, Torre C5-Par-2a pl, E-08193 Bellaterra, Barcelona (Spain)], E-mail: odintsov@ieec.uab.es

    2007-12-06

    We suggest two realistic f(R) and one F(G) modified gravities which are consistent with local tests and cosmological bounds. The typical property of such theories is the presence of effective cosmological constant epochs, such that early-time inflation and late-time cosmic acceleration are naturally unified within a single model. It is shown that classical instability does not appear here and that Newton's law is respected. The possible appearance of an anti-gravity regime and the related modification of the theory are also discussed.

  19. Interacting multiple zero mode formulation and its application to a system consisting of a dark soliton in a condensate

    Science.gov (United States)

    Takahashi, J.; Nakamura, Y.; Yamanaka, Y.

    2015-08-01

    To formulate the zero modes in a finite-size system with spontaneous breakdown of symmetries in quantum field theory is not trivial, for in the naive Bogoliubov theory, one encounters difficulties such as phase diffusion, the absence of a definite criterion for determining the ground state, and infrared divergences. An interacting zero mode formulation that has been proposed for systems with a single zero mode to avoid these difficulties is extended to general systems with multiple zero modes. It naturally and definitely gives the interactions among the quantized zero modes, the consequences of which can be observed experimentally. In this paper, as a typical example, we consider an atomic Bose-Einstein condensed system with a dark soliton that contains two zero modes corresponding to the spontaneous breakdown of the U(1) gauge and translational symmetries. Then we evaluate the standard deviations of the zero mode operators and see how the mutual interaction between the two zero modes affects them.

  20. Monitoring and control of a hydrogen production and storage system consisting of water electrolysis and metal hydrides

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Herranz, V.; Perez-Page, M. [Departamento de Ingenieria Quimica y Nuclear. Universidad Politecnica de Valencia. Camino de Vera S/N, 46022 Valencia (Spain); Beneito, R. [Area de Energia. Departamento de Gestion e Innovacion. Instituto Tecnologico del Juguete (AIJU). Avda. Industria 23, 03440 Ibi, Alicante (Spain)

    2010-02-15

    Renewable energy sources such as wind turbines and solar photovoltaics are energy sources that cannot generate continuous electric power. The seasonal storage of solar or wind energy in the form of hydrogen can provide the basis for a completely renewable energy system. In this way, water electrolysis is a convenient method for converting electrical energy into a chemical form. The power required for hydrogen generation can be supplied through a photovoltaic array. Hydrogen can be stored as metal hydrides and can be converted back into electricity using a fuel cell. The elements of these systems, i.e. the photovoltaic array, electrolyzer, fuel cell and hydrogen storage system in the form of metal hydrides, need a control and monitoring system for optimal operation. This work has been performed within a Research and Development contract on Hydrogen Production granted by Solar Iniciativas Tecnologicas, S.L. (SITEC), to the Polytechnic University of Valencia and to the AIJU, and deals with the development of a system to control and monitor the operating parameters of an electrolyzer and a metal hydride storage system, allowing continuous production of hydrogen. (author)

  1. Evaluating the systemic right ventricle by CMR: the importance of consistent and reproducible delineation of the cavity

    Directory of Open Access Journals (Sweden)

    van Dijk Arie PJ

    2008-08-01

    Full Text Available Abstract Background The method used to delineate the boundary of the right ventricle (RV) relative to the trabeculations and papillary muscles in cardiovascular magnetic resonance (CMR) ventricular volume analysis may matter more when these structures are hypertrophied than in individuals with normal cardiovascular anatomy. This study aimed to compare two methods of cavity delineation in patients with a systemic RV. Methods Twenty-nine patients (mean age 34.7 ± 12.4 years) with a systemic RV (12 with congenitally corrected transposition of the great arteries (ccTGA) and 17 with atrially switched TGA) underwent CMR. We compared measurements of systemic RV volumes and function using two analysis protocols. The RV trabeculations and papillary muscles were either included in the calculated blood volume, with the boundary drawn immediately within the apparently compacted myocardial layer, or they were manually outlined and excluded. RV stroke volume (SV) calculated using each method was compared with the corresponding left ventricular (LV) SV. Additionally, we compared the differences in analysis time, and in intra- and inter-observer variability, between the two methods. A paired samples t-test was used to test for differences in volumes, function and analysis time between the two methods. Differences in intra- and inter-observer reproducibility were tested using an extension of the Bland-Altman method. Results The inclusion of trabeculations and papillary muscles in the ventricular volume resulted in higher values for systemic RV end diastolic volume (mean difference 28.7 ± 10.6 ml, p ...). Conclusion The choice of method for systemic RV cavity delineation significantly affected volume measurements, given the CMR acquisition and analysis systems used. We recommend delineation outside the trabeculations for routine clinical measurements of systemic RV volumes as this approach took less time and gave more reproducible measurements.
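
    Agreement between two delineation methods is commonly summarized with Bland-Altman statistics: the mean of the paired differences (bias) and the 95% limits of agreement. The sketch below computes these for two hypothetical sets of volume measurements; the numbers are illustrative, not the study data.

        import numpy as np

        def bland_altman(method_a, method_b):
            """Bias and 95% limits of agreement between paired measurements."""
            a, b = np.asarray(method_a, float), np.asarray(method_b, float)
            diffs = a - b
            bias = diffs.mean()
            sd = diffs.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Hypothetical end-diastolic volumes (ml) measured with the two protocols.
        incl_trabec = [210, 185, 240, 199, 225, 260, 178, 215]
        excl_trabec = [182, 160, 205, 171, 196, 228, 152, 188]

        bias, (lo, hi) = bland_altman(incl_trabec, excl_trabec)
        print(f"bias = {bias:.1f} ml, 95% limits of agreement = ({lo:.1f}, {hi:.1f}) ml")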

  2. An historically consistent and broadly applicable MRV system based on LiDAR sampling and Landsat time-series

    Science.gov (United States)

    W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang

    2014-01-01

    The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...

  3. LITERATURE REVIEW: HEAT TRANSFER THROUGH TWO-PHASE INSULATION SYSTEMS CONSISTING OF POWDERS IN A CONTINUOUS GAS PHASE

    Science.gov (United States)

    The report, a review of the literature on heat flow through powders, was motivated by the use of fine powder systems to produce high thermal resistivities (thermal resistance per unit thickness). The term "superinsulations" has been used to describe this type of material, which ha...

  4. Gel/Space Ratio Evolution in Ternary Composite System Consisting of Portland Cement, Silica Fume, and Fly Ash.

    Science.gov (United States)

    Wu, Mengxue; Li, Chen; Yao, Wu

    2017-01-11

    In cement-based pastes, the relationship between the complex phase assemblage and mechanical properties is usually described by the "gel/space ratio" descriptor. The gel/space ratio is defined as the volume ratio of the gel to the available space in the composite system, and it has been widely studied in the cement unary system. This work determines the gel/space ratio in the cement-silica fume-fly ash ternary system (C-SF-FA system) by measuring the reaction degrees of the cement, SF, and FA. The effects that the supplementary cementitious material (SCM) replacements exert on the evolution of the gel/space ratio are discussed both theoretically and practically. The relationship between the gel/space ratio and compressive strength is then explored, and the relationship disparities for different mix proportions are analyzed in detail. The results demonstrate that the SCM replacements promote the gel/space ratio evolution only when the SCM reaction degree is higher than a certain value, which is calculated and defined as the critical reaction degree (CRD). The effects of the SCM replacements can be predicted based on the CRD, and the theoretical predictions agree with the test results quite well. At low gel/space ratios, disparities in the relationship between the gel/space ratio and the compressive strength are caused by porosity, which has also been studied in cement unary systems. The ratio of cement-produced gel to SCM-produced gel (G_C to G_SCM ratio) is introduced for use in analyzing high gel/space ratios, in which it plays a major role in creating relationship disparities.
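
    For the plain-cement (unary) case mentioned in the abstract, the classical Powers expression gives the gel/space ratio from the degree of hydration α and the water-to-cement ratio w/c, and strength is often modelled as a power of that ratio. The sketch below uses that classical unary formula purely as an illustration; it is not the ternary C-SF-FA formulation developed in the paper, and the constants are textbook-style assumptions.

        def gel_space_ratio(alpha, w_c):
            """Powers' gel/space ratio for a plain Portland cement paste.

            alpha : degree of hydration of the cement (0..1)
            w_c   : water-to-cement mass ratio
            """
            return 0.68 * alpha / (0.32 * alpha + w_c)

        def strength_mpa(x, a=234.0, n=3.0):
            """Classic power-law strength model sigma = a * x**n (illustrative constants)."""
            return a * x ** n

        for alpha in (0.4, 0.6, 0.8, 1.0):
            x = gel_space_ratio(alpha, w_c=0.4)
            print(f"alpha = {alpha:.1f}: gel/space = {x:.2f}, strength ~ {strength_mpa(x):.0f} MPa")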

  5. Gel/Space Ratio Evolution in Ternary Composite System Consisting of Portland Cement, Silica Fume, and Fly Ash

    Directory of Open Access Journals (Sweden)

    Mengxue Wu

    2017-01-01

    Full Text Available In cement-based pastes, the relationship between the complex phase assemblage and mechanical properties is usually described by the “gel/space ratio” descriptor. The gel/space ratio is defined as the volume ratio of the gel to the available space in the composite system, and it has been widely studied in the cement unary system. This work determines the gel/space ratio in the cement-silica fume-fly ash ternary system (C-SF-FA system) by measuring the reaction degrees of the cement, SF, and FA. The effects that the supplementary cementitious material (SCM) replacements exert on the evolution of the gel/space ratio are discussed both theoretically and practically. The relationship between the gel/space ratio and compressive strength is then explored, and the relationship disparities for different mix proportions are analyzed in detail. The results demonstrate that the SCM replacements promote the gel/space ratio evolution only when the SCM reaction degree is higher than a certain value, which is calculated and defined as the critical reaction degree (CRD). The effects of the SCM replacements can be predicted based on the CRD, and the theoretical predictions agree with the test results quite well. At low gel/space ratios, disparities in the relationship between the gel/space ratio and the compressive strength are caused by porosity, which has also been studied in cement unary systems. The ratio of cement-produced gel to SCM-produced gel (G_C to G_SCM ratio) is introduced for use in analyzing high gel/space ratios, in which it plays a major role in creating relationship disparities.

  6. Control System of Training Ship Keeping the Desired Path Consisting of Straight-lines and Circular Arcs

    Directory of Open Access Journals (Sweden)

    Krzysztof Kula

    2017-12-01

    Full Text Available This paper presents a new, expanded approach to setting a planned route within restricted sea areas containing permanent obstacles or other vessels, and to tracking that route with the ship. In this project the desired path is represented as a combination of straight-line segments and circular arcs. For this purpose a cascade control system for the ship's motion has been designed. The outer-loop task of preparing the reference signal is executed by a reference signal generator. Angular velocity is controlled in an inner loop using an internal model control (IMC) approach. Numerical simulation studies of the control algorithms developed for lake trials of the training ship are also carried out to demonstrate the effectiveness of the proposed tracking control system.

  7. Systems biology definition of the core proteome of metabolism and expression is consistent with high-throughput data

    DEFF Research Database (Denmark)

    Yang, Laurence; Tan, Justin; O'Brien, Edward J.

    2015-01-01

    Finding the minimal set of gene functions needed to sustain life is of both fundamental and practical importance. Minimal gene lists have been proposed by using comparative genomics-based core proteome definitions. A definition of a core proteome that is supported by empirical data, is understood...... based on proteomics data. This systems biology core proteome includes 212 genes not found in previous comparative genomics-based core proteome definitions, accounts for 65% of known essential genes in E. coli, and has 78% gene function overlap with minimal genomes (Buchnera aphidicola and Mycoplasma...... genitalium). Based on transcriptomics data across environmental and genetic backgrounds, the systems biology core proteome is significantly enriched in nondifferentially expressed genes and depleted in differentially expressed genes. Compared with the noncore, core gene expression levels are also similar...

  8. A Novel Degradation Estimation Method for a Hybrid Energy Storage System Consisting of Battery and Double-Layer Capacitor

    Directory of Open Access Journals (Sweden)

    Yuanbin Yu

    2016-01-01

    Full Text Available This paper presents a new method for battery degradation estimation using a power-energy (PE) function in a battery/ultracapacitor hybrid energy storage system (HESS), together with an integrated optimization that covers both parameter matching and control of the HESS. A semiactive topology of the HESS, with the electric double-layer capacitor (EDLC) coupled directly to the DC-link, is adopted for a hybrid electric city bus (HECB). For the purpose of presenting the quantitative relationship between system parameters and battery service life, data from a 37-minute driving cycle were collected and decomposed into discharging/charging fragments, and then an optimal control strategy, intended to make maximal use of the available EDLC energy, is presented to split the power demand between the battery and the EDLC. Furthermore, based on a battery degradation model, the conversion of the power demand by the PE function and PE matrix is applied to evaluate the relationship between the available energy stored in the HESS and the service life of the battery pack. Thus, using this approach, which decouples parameter matching from optimal control of the HESS, the process of battery degradation and its service-life estimation for the HESS is summarized.

  9. Testing Postural Stability: Are the Star Excursion Balance Test and Biodex Balance System Limits of Stability Tests Consistent?

    Science.gov (United States)

    Glave, A Page; Didier, Jennifer J; Weatherwax, Jacqueline; Browning, Sarah J; Fiaud, Vanessa

    2016-01-01

    There are a variety of options for testing postural stability; however, many physical tests lack validity information. Two tests of postural stability - the Star Excursion Balance Test (SEBT) and the Biodex Balance System Limits of Stability Test (LOS) - were examined to determine whether they measure similar components of balance. Healthy adults (n=31) completed the LOS (levels 6 and 12) and the SEBT (both legs). SEBT directions were offset by 180° to approximate LOS directions. Correlations and partial correlations controlling for height were analyzed. Correlations were significant for SEBT 45° and LOS back-left (6: r=-0.41; 12: r=-0.42; p ...). The tests seem to assess different components of balance. Research is needed to determine and define what specific components of balance are being assessed. Care must be taken when choosing balance tests to best match the test to the purpose of testing (fall risk, athletic performance, etc.). Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Analysis and comparison of NoSQL databases with an introduction to consistent references in big data storage systems

    Science.gov (United States)

    Dziedzic, Adam; Mulawka, Jan

    2014-11-01

    NoSQL is a new approach to data storage and manipulation. The aim of this paper is to gain more insight into NoSQL databases, as we are still in the early stages of understanding when to use them and how to use them in an appropriate way. In this submission descriptions of selected NoSQL databases are presented. Each of the databases is analysed with primary focus on its data model, data access, architecture and practical usage in real applications. Furthermore, the NoSQL databases are compared with respect to data references. The relational databases offer foreign keys, whereas NoSQL databases provide us with limited references. An intermediate model between graph theory and relational algebra which can address the problem should be created. Finally, the proposal of a new approach to the problem of inconsistent references in Big Data storage systems is introduced.
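
    The reference problem can be illustrated in a few lines: a document store typically holds a manual reference (an id stored in another document) that nothing enforces, so dangling references must be detected by the application. The sketch below models two "collections" as plain Python dicts; it is a toy illustration of such a consistency check, not any particular database's API.

        # Toy "collections": documents keyed by id, with manual references between them.
        users = {
            "u1": {"name": "Alice"},
            "u2": {"name": "Bob"},
        }
        orders = {
            "o1": {"user_id": "u1", "total": 30.0},
            "o2": {"user_id": "u2", "total": 12.5},
            "o3": {"user_id": "u9", "total": 99.9},   # dangling reference: no user "u9"
        }

        def dangling_references(child_docs, parent_docs, fk_field):
            """Return ids of child documents whose reference points to a missing parent."""
            return [doc_id for doc_id, doc in child_docs.items()
                    if doc[fk_field] not in parent_docs]

        print(dangling_references(orders, users, "user_id"))   # ['o3']
        # In a relational database a FOREIGN KEY constraint would reject "o3" at
        # write time; in most NoSQL stores this check is the application's job.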

  11. Tree species composition in areas of Atlantic Forest in southeastern Brazil is consistent with a new system for classifying the vegetation of South America

    Directory of Open Access Journals (Sweden)

    Pedro Vasconcellos Eisenlohr

    2014-06-01

    Full Text Available Rigorous and well-defined criteria for the classification of vegetation constitute a prerequisite for effective biodiversity conservation strategies. In 2009, a new classification system was proposed for vegetation types in extra-Andean tropical and subtropical South America. The new system expanded upon the criteria established in the existing Brazilian Institute of Geography and Statistics classification system. Here, we attempted to determine whether the tree species composition of the formations within the Atlantic Forest Biome of Brazil is consistent with this new classification system. We compiled floristic surveys of 394 sites in southeastern Brazil (between 15° and 25°S, and between the Atlantic coast and 55°W). To assess the floristic consistency of the vegetation types, we performed non-metric multidimensional scaling (NMDS) ordination analysis, followed by multifactorial ANOVA. The vegetation types, especially in terms of their thermal regimes, elevational belts and top-tier vegetation categories, were consistently discriminated along the first NMDS axis, and all assessed attributes showed at least one significant difference along the second axis. As was expected on the basis of the theoretical background, we found that tree species composition in the areas of Atlantic Forest studied was highly consistent with the new classification system. Our findings not only help solidify the position of this new classification system but also contribute to expanding the knowledge of the patterns and underlying driving forces of the distribution of vegetation in the region.

  12. Consistent pattern of local adaptation during an experimental heat wave in a pipefish-trematode host-parasite system.

    Directory of Open Access Journals (Sweden)

    Susanne H Landis

    Full Text Available Extreme climate events such as heat waves are expected to increase in frequency under global change. As one indirect effect, they can alter the magnitude and direction of species interactions, for example those between hosts and parasites. We simulated a summer heat wave to investigate how a changing environment affects the interaction between the broad-nosed pipefish (Syngnathus typhle) as a host and its digenean trematode parasite (Cryptocotyle lingua). In a fully reciprocal laboratory infection experiment, pipefish from three different coastal locations were exposed to sympatric and allopatric trematode cercariae. In order to examine whether an extreme climatic event disrupts patterns of locally adapted host-parasite combinations, we measured the parasite's transmission success as well as the host's adaptive and innate immune defence under control and heat wave conditions. Independent of temperature, sympatric cercariae were always more successful than allopatric ones, indicating that parasites are locally adapted to their hosts. Hosts suffered from heat stress, as suggested by fewer cells of the adaptive immune system (lymphocytes) compared to the same groups that were kept at 18°C. However, the proportion of innate immune cells (monocytes) was higher in the 18°C water. Contrary to our expectations, no interaction between host immune defence, parasite infectivity and temperature stress was found, nor did the pattern of local adaptation change due to increased water temperature. Thus, in this host-parasite interaction, the sympatric parasite keeps ahead of the coevolutionary dynamics across sites, even under increasing temperatures as expected under marine global warming.

  13. Hypergeometric resummation of self-consistent sunset diagrams for steady-state electron-boson quantum many-body systems out of equilibrium

    Science.gov (United States)

    Mera, Héctor; Pedersen, Thomas G.; Nikolić, Branislav K.

    2016-10-01

    A newly developed hypergeometric resummation technique [H. Mera et al., Phys. Rev. Lett. 115, 143001 (2015), 10.1103/PhysRevLett.115.143001] provides an easy-to-use recipe to obtain conserving approximations within the self-consistent nonequilibrium many-body perturbation theory. We demonstrate the usefulness of this technique by calculating the phonon-limited electronic current in a model of a single-molecule junction within the self-consistent Born approximation for the electron-phonon interacting system, where the perturbation expansion for the nonequilibrium Green's function in powers of the free bosonic propagator typically consists of a series of noncrossing sunset diagrams. Hypergeometric resummation preserves conservation laws and it is shown to provide substantial convergence acceleration relative to more standard approaches to self-consistency. This result strongly suggests that the convergence of the self-consistent sunset series is limited by a branch-cut singularity, which is accurately described by Gauss hypergeometric functions. Our results showcase an alternative approach to conservation laws and self-consistency where expectation values obtained from conserving divergent perturbation expansions are summed to their self-consistent value by analytic continuation functions able to mimic the convergence-limiting singularity structure.

  14. A preliminary analysis of the receipt of mental health services consistent with national standards among children in the child welfare system.

    Science.gov (United States)

    Raghavan, Ramesh; Inoue, Megumi; Ettner, Susan L; Hamilton, Barton H; Landsverk, John

    2010-04-01

    We sought to examine the extent to which children in the child welfare system receive mental health care consistent with national standards. We used data from 4 waves (3 years of follow-up) of the National Survey of Child and Adolescent Well-Being, the nation's first longitudinal study of children in the child welfare system, and the Area Resource File to examine rates of screening, assessment, and referral to mental health services among 3802 youths presenting to child welfare agencies. Weighted population-averaged logistic regression models were used to identify variables associated with standards-consistent care. Only half of all children in the sample received care consistent with any 1 national standard, and less than one tenth received care consistent with all of them. Older children, those exhibiting externalizing behaviors, and those placed in foster care had, on average, higher odds of receiving care consistent with national standards. Adverse consequences of childhood disadvantage cannot be reduced unless greater collaboration occurs between child welfare and mental health agencies. Current changes to Medicaid regulations that weaken entitlements to screening and assessment may also worsen mental health disparities among these vulnerable children.

  15. Microwave field controlled slow and fast light with a coupled system consisting of a nanomechanical resonator and a Cooper-pair box.

    Science.gov (United States)

    Ma, Peng-Cheng; Xiao, Yin; Yu, Ya-Fei; Zhang, Zhi-Ming

    2014-02-10

    We theoretically demonstrate an efficient method to control slow and fast light in microwave regime with a coupled system consisting of a nanomechanical resonator (NR) and a superconducting Cooper-pair box (CPB). Using the pump-probe technique, we find that both slow and fast light effects of the probe field can appear in this coupled system. Furthermore, we show that a tunable switch from slow light to fast light can be achieved by only adjusting the pump-CPB detuning from the NR frequency to zero. Our coupled system may have potential applications, for example, in optical communication, microwave photonics, and nonlinear optics.

  16. Development of a Scale-Consistent Soil-Vegetation-Atmosphere Modeling System Using COSMO, Community Land Model and ParFlow

    Science.gov (United States)

    Shrestha, P.; Sulis, M.; Masbou, M.; Kollet, S. J.; Simmer, C.

    2012-12-01

    Here, we present the development and application of a modular scale-consistent coupled soil vegetation atmosphere (SVA) modeling system. The SVA modeling system developed at the Transregional Collaborative Research Centre 32 (TR32, 2nd phase), consists of the Deutscher Wetterdienst (DWD, German Weather Service) regional climate and weather forecast model COSMO (Consortium for Small Scale Modeling), the National Centre for Atmospheric Research (NCAR) Community Land Model (CLM) and the 3D variably saturated groundwater hydrology model ParFlow ( originally developed at the Lawrence Livermore National Laboratory). The external coupler Ocean Atmosphere Sea Ice Soil (OASIS) developed at European Centre for Research and Advanced Training (CERFACS, Toulouse, France) is used to drive the SVA system and control the exchange of fluxes defined over different grids of the components of the modeling system. Idealized and realistic test case simulations are presented to show the capability of this SVA modeling system to simulate the land-atmosphere interactions including ground water dynamics. The development of this SVA modeling system will also allow the integration of various model advancement efforts (e.g. implementation of carbon cycle, crop dynamics, downscaling and upscaling algorithm, development of adjoint models) and ensures the availability of these developments to the scientific community.

  17. LETTER TO THE EDITOR: Thermally activated processes in magnetic systems consisting of rigid dipoles: equivalence of the Ito and Stratonovich stochastic calculus

    Science.gov (United States)

    Berkov, D. V.; Gorn, N. L.

    2002-04-01

    We demonstrate that the Ito and the Stratonovich stochastic calculus lead to identical results when applied to the stochastic dynamics of magnetic systems consisting of dipoles with constant magnitude, despite the multiplicative noise appearing in the corresponding Langevin equations. The immediate consequence of this statement is that any numerical method used for the solution of these equations will lead to the physically correct results.
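
    For a generic multiplicative-noise SDE the Ito and Stratonovich interpretations give different results, which is what makes the equivalence claimed for constant-magnitude dipoles nontrivial. The toy comparison below integrates geometric Brownian motion with the Euler-Maruyama scheme (Ito limit) and the stochastic Heun scheme (Stratonovich limit) to show the generic difference; it is an illustration of the two calculi, not the magnetic Langevin dynamics studied in the letter.

        import numpy as np

        rng = np.random.default_rng(3)
        mu, sigma, x0, T, n_steps, n_paths = 0.1, 0.5, 1.0, 1.0, 400, 20000
        dt = T / n_steps

        def drift(x):     return mu * x
        def diffusion(x): return sigma * x     # multiplicative noise

        x_ito = np.full(n_paths, x0)
        x_str = np.full(n_paths, x0)
        for _ in range(n_steps):
            dw = rng.normal(0.0, np.sqrt(dt), n_paths)
            # Euler-Maruyama: converges to the Ito solution.
            x_ito = x_ito + drift(x_ito) * dt + diffusion(x_ito) * dw
            # Stochastic Heun (predictor-corrector): converges to the Stratonovich solution.
            pred = x_str + drift(x_str) * dt + diffusion(x_str) * dw
            x_str = x_str + 0.5 * (drift(x_str) + drift(pred)) * dt \
                          + 0.5 * (diffusion(x_str) + diffusion(pred)) * dw

        print(f"Ito mean          {x_ito.mean():.3f}  (theory {np.exp(mu*T):.3f})")
        print(f"Stratonovich mean {x_str.mean():.3f}  (theory {np.exp((mu + 0.5*sigma**2)*T):.3f})")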

  18. The Consistent Support System in The Society for Lifelong Sports : From a View Point of Self-Organization of Sports Club and Support

    OpenAIRE

    長岡, 雅美; 赤松, 喜久; Masami, Nagaoka; Yoshihisa, Akamatsu

    2008-01-01

    The purpose of this study is to clarify the concept of guidance and support in community sports and to specify the direction of organization and support needed to achieve a society of lifelong sports. The authors stress that, to achieve such a society, it is necessary to cooperate with other groups and to construct a consistent support system. This study also explores the conditions of community sports club management through an analysis of the Japan Juni...

  19. RPP-PRT-58489, Revision 1, One Systems Consistent Safety Analysis Methodologies Report. 24590-WTP-RPT-MGT-15-014

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, Mukesh [URS Professional Solutions LLC, Aiken, SC (United States); Niemi, Belinda [Washington River Protection Solutions, LLC, Richland, WA (United States); Paik, Ingle [Washington River Protection Solutions, LLC, Richland, WA (United States)

    2015-09-02

    In 2012, One System Nuclear Safety performed a comparison of the safety bases for the Tank Farms Operations Contractor (TOC) and Hanford Tank Waste Treatment and Immobilization Plant (WTP) (RPP-RPT-53222 / 24590-WTP-RPT-MGT-12-018, “One System Report of Comparative Evaluation of Safety Bases for Hanford Waste Treatment and Immobilization Plant Project and Tank Operations Contract”), and identified 25 recommendations that required further evaluation for consensus disposition. This report documents ten NSSC approved consistent methodologies and guides and the results of the additional evaluation process using a new set of evaluation criteria developed for the evaluation of the new methodologies.

  20. Instruction via an Intelligent Videodisc System versus Classroom Instruction for Beginning College French Students. A Comparative Experiment.

    Science.gov (United States)

    1984-01-01

    McLuhan (1964), who believed that the wheel had "amputated" our feet, pondered what the computer, as "an extension of our brain," might be doing to... Computerized instruction in second-language acquisition. Studies in Language Learning, 1975, 1(1), 145-150. McLuhan, M. Understanding media: The extensions of man.

  1. Equivalence between fractional exclusion statistics and self-consistent mean-field theory in interacting-particle systems in any number of dimensions.

    Science.gov (United States)

    Anghel, D V; Nemnes, G A; Gulminelli, F

    2013-10-01

    We describe a mean field interacting particle system in any number of dimensions and in a generic external potential as an ideal gas with fractional exclusion statistics (FES). We define the FES quasiparticle energies, we calculate the FES parameters of the system and we deduce the equations for the equilibrium particle populations. The FES gas is "ideal," in the sense that the quasiparticle energies do not depend on the other quasiparticle levels' populations and the sum of the quasiparticle energies is equal to the total energy of the system. We prove that the FES formalism is equivalent to the semiclassical or Thomas Fermi limit of the self-consistent mean-field theory and the FES quasiparticle populations may be calculated from the Landau quasiparticle populations by making the correspondence between the FES and the Landau quasiparticle energies. The FES provides a natural semiclassical ideal gas description of the interacting particle gas.

  2. Transmission and time delay properties of an integrated system consisting of atomic vapor cladding on top of a micro ring resonator.

    Science.gov (United States)

    Stern, Liron; Levy, Uriel

    2012-12-17

    In this paper we analyze the transmission and time delay properties of light propagating through a microring resonator (MRR) consisting of a solid core waveguide surrounded by an atomic vapor cladding. Using the atomic effective susceptibility of Rubidium we derive the complex transmission spectrum of the integrated system. We show, that when the system is under-coupled, the transmission can exceed the standalone MRR's background transmission and is accompanied by enhanced positive time delay. It is shown that in this case the contrast of the atomic lines is greatly enhanced. This allows achieving high optical densities at short propagation length. Furthermore, owing to its features such as small footprint, high tunability, and high delay-transmission product, this system may become an attractive choice for chip scale manipulations of light.

  3. Reading Development in two Alphabetic Systems Differing in Orthographic Consistency: A longitudinal study of French-speaking children enrolled in a Dutch immersion program

    Directory of Open Access Journals (Sweden)

    Katia Lecocq

    2009-06-01

    Full Text Available Studies examining reading development in bilinguals have led to conflicting conclusions regarding the language in which reading development should take place first. Whereas some studies suggest that reading instruction should take place in the most proficient language first, other studies suggest that reading acquisition should take place in the most consistent orthographic system first. The present study examined two research questions: (1) the relative impact of oral proficiency and orthographic transparency on second-language reading acquisition, and (2) the influence of reading acquisition in one language on the development of reading skills in the other language. To examine these questions, we compared reading development in French-native children attending a Dutch immersion program and learning to read either in Dutch first (the most consistent orthography) or in French first (the least consistent orthography, but their native language). Following a longitudinal design, the data were gathered over different sessions spanning from Grade 1 to Grade 3. The children in immersion were presented with a series of experimental and standardised tasks examining their levels of oral proficiency as well as their reading abilities in their first and, subsequently, in their second languages of reading instruction. Their performances were compared to those of French and Dutch monolinguals. The results showed that by the end of Grade 2, the children instructed to read in Dutch first read in both languages as well as their monolingual peers. In contrast, the children instructed to read in French first lagged behind the other Dutch-speaking groups in Dutch reading tasks. These findings extend the notion that differences across languages in terms of orthographic transparency impact on reading development to the French-Dutch pair, and strongly support the view that there are potentially significant benefits to learning to read in the most consistent orthographic system first.

  4. Comparison of the Mapleson C system and adult and paediatric self-inflating bags for delivering guideline-consistent ventilation during simulated adult cardiopulmonary resuscitation.

    Science.gov (United States)

    Sherren, P B; Lewinsohn, A; Jovaisa, T; Wijayatilake, D S

    2011-07-01

    There is a discrepancy between resuscitation teaching and witnessed clinical practice. Furthermore, deleterious outcomes are associated with hyperventilation. We therefore conducted a manikin-based study of a simulated cardiac arrest to evaluate the ability of three ventilating devices to provide guideline-consistent ventilation. Mean (SD) minute ventilation was reduced with the paediatric self-inflating bag (7.0 (3.2) l.min⁻¹) compared with the Mapleson C system (9.8 (3.5) l.min⁻¹) and adult self-inflating bag (9.7 (4.2) l.min⁻¹; p = 0.003). Tidal volume was also lower with the paediatric self-inflating bag (391 (52) ml) compared with the others (582 (87) ml and 625 (103) ml, respectively; p < 0.001), as was peak airway pressure (14.5 (5.2) cmH₂O vs 20.7 (9.0) cmH₂O and 30.3 (11.4) cmH₂O, respectively; p < 0.001). Participants hyperventilated patients' lungs in simulated cardiac arrest with all three devices. The paediatric self-inflating bag delivered the most guideline-consistent ventilation. Its use in adult cardiopulmonary resuscitation may ensure delivery of more guideline-consistent ventilation in patients with tracheal intubation. © 2011 The Authors. Anaesthesia © 2011 The Association of Anaesthetists of Great Britain and Ireland.

  5. Assessing the Internal Consistency of the Marine Carbon Dioxide System at High Latitudes: The Labrador Sea AR7W Line Study Case

    Science.gov (United States)

    Raimondi, L.; Azetsu-Scott, K.; Wallace, D.

    2016-02-01

    This work assesses the internal consistency of the ocean carbon dioxide system through the comparison of discrete measurements and calculated values of four analytical parameters of the inorganic carbon system: Total Alkalinity (TA), Dissolved Inorganic Carbon (DIC), pH and Partial Pressure of CO2 (pCO2). The study is based on 486 seawater samples analyzed for TA, DIC and pH and 86 samples analyzed for pCO2, collected during the 2014 cruise along the AR7W line in the Labrador Sea. The internal consistency was assessed using all combinations of input parameters and eight sets of thermodynamic constants (K1, K2) in calculating each parameter with the CO2SYS software. Residuals of each parameter were calculated as the differences between measured and calculated values (reported as ΔTA, ΔDIC, ΔpH and ΔpCO2). Although differences between the selected sets of constants were observed, the largest residuals were obtained using different pairs of input parameters. As expected, the pH-pCO2 pair produced the poorest results, suggesting that measurements of either TA or DIC are needed to define the carbonate system accurately and precisely. To identify a signature of organic alkalinity we isolated the residuals in the bloom area; therefore only ΔTA values from surface waters (0-30 m) along the Greenland side of the basin were selected. The residuals showed that no measured value was higher than the calculated one, and therefore we could not detect the presence of organic bases in the shallower water column. The internal consistency in the characteristic water masses of the Labrador Sea (Denmark Strait Overflow Water, North East Atlantic Deep Water, newly-ventilated Labrador Sea Water, Greenland and Labrador Shelf waters) will also be discussed.
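
    The residual computation described above (measured minus calculated for each parameter, for every input pair) can be sketched as follows; `solve_carbonate_system` is a hypothetical stand-in for a CO2SYS-style solver and is not part of the cited work.

```python
# Sketch of the internal-consistency residuals described in the record above.
# Assumption: solve_carbonate_system(inputs, constants) is a hypothetical CO2SYS-like
# routine returning all four parameters {"TA", "DIC", "pH", "pCO2"} from any input pair.
from itertools import combinations

PARAMS = ("TA", "DIC", "pH", "pCO2")

def residuals(sample, solve_carbonate_system, constants="set1"):
    """Measured-minus-calculated residuals for every input pair and each remaining parameter."""
    out = {}
    for pair in combinations(PARAMS, 2):
        calc = solve_carbonate_system({p: sample[p] for p in pair}, constants)
        for p in PARAMS:
            if p not in pair:
                out[(pair, p)] = sample[p] - calc[p]  # e.g. delta-TA, delta-DIC, delta-pH, delta-pCO2
    return out
```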

  6. No consistent bimetric gravity?

    OpenAIRE

    Deser, S.; Sandora, M.; Waldron, A.

    2013-01-01

    We discuss the prospects for a consistent, nonlinear, partially massless (PM), gauge symmetry of bimetric gravity (BMG). Just as for single metric massive gravity, we show that consistency of BMG relies on it having a PM extension; we then argue that it cannot.

  7. Evaluating statistical consistency in the ocean model component of the Community Earth System Model (pyCECT v2.0)

    Science.gov (United States)

    Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen

    2016-07-01

    The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterization, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of ensemble-based statistical consistency testing is to use a qualitative measurement of the variability of the ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in terms of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. Then the percentage of points that have scores greater than a specified threshold indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important.
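
    A minimal sketch of the grid-point standard-score test described above follows; the array shapes and threshold are assumptions for illustration, not values from the cited work.

```python
# Ensemble-based consistency check: standard score per grid point, then the
# fraction of points exceeding a threshold (assumed shapes: ensemble (m, n), new_run (n,)).
import numpy as np

def fraction_exceeding(ensemble, new_run, threshold=3.0):
    mean = ensemble.mean(axis=0)
    std = ensemble.std(axis=0, ddof=1)
    std = np.where(std > 0, std, np.inf)   # guard against zero variance at constant points
    z = np.abs(new_run - mean) / std       # standard score at each grid point
    return float(np.mean(z > threshold))   # fraction of grid points failing the test
```

    A new simulation would then be flagged as statistically distinguishable when this fraction exceeds a chosen cutoff.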

  8. Self-consistent random phase approximation - application to systems of strongly correlated fermions; Approximation des phases aleatoires self-consistante - applications a des systemes de fermions fortement correles

    Energy Technology Data Exchange (ETDEWEB)

    Jemai, M

    2004-07-01

    In the present thesis we have applied the self-consistent random phase approximation (SCRPA) to the Hubbard model with a small number of sites (a chain of 2, 4, 6,... sites). Earlier, SCRPA had produced very good results in other models, such as the Richardson pairing model. It was therefore interesting to see what kind of results the method is able to produce in the case of a more complex model like the Hubbard model. To our great satisfaction, the case of two sites with two electrons (half-filling) is solved exactly by the SCRPA. This may seem a little trivial, but the fact is that other respectable approximations like 'GW' or the approach with the Gutzwiller wave function yield results still far from exact. With this promising starting point, the case of 6 sites at half-filling was considered next. For that case, evidently, SCRPA no longer gives exact results. However, they are still excellent for a wide range of values of the coupling constant U, covering for instance the phase-transition region towards a state with non-zero magnetisation. We consider this a good success of the theory. Nonetheless, the case of 4 sites (a plaquette), as indeed all cases with 4n sites at half-filling, turned out to have a problem because of degeneracies at the Hartree-Fock level. A generalisation of the present method, including, in addition to the pairs, quadruples of fermion operators (called second RPA), is proposed to also include the plaquette case exactly in our approach. This is therefore a very interesting perspective of the present work. (author)

  9. Tumor-directed gene therapy in mice using a composite nonviral gene delivery system consisting of the piggyBac transposon and polyethylenimine

    Directory of Open Access Journals (Sweden)

    Wu Chaoqun

    2009-04-01

    Full Text Available Abstract Background Compared with viral vectors, nonviral vectors are less immunogenic, more stable, safer and easier to replicate for application in cancer gene therapy. However, nonviral gene delivery systems have not been extensively used because of their low transfection efficiency and short transgene expression, especially in vivo. It is desirable to develop a nonviral gene delivery system that can support stable genomic integration and persistent gene expression in vivo. Here, we used a composite nonviral gene delivery system consisting of the piggyBac (PB) transposon and polyethylenimine (PEI) for long-term transgene expression in mouse ovarian tumors. Methods A recombinant plasmid PB [Act-RFP, HSV-tk] encoding both the herpes simplex thymidine kinase (HSV-tk) and the monomeric red fluorescent protein (mRFP1) under PB transposon elements was constructed. This plasmid and the PBase plasmid were injected into ovarian cancer tumor xenografts in mice by the in vivo PEI system. The antitumor effects of the HSV-tk/ganciclovir (GCV) system were observed after intraperitoneal injection of GCV. Histological analysis and TUNEL assay were performed on cryostat sections of the tumor tissue. Results Plasmid construction was confirmed by PCR analysis combined with restriction enzyme digestion. mRFP1 expression could be visualized three weeks after the last transfection of pPB/TK under fluorescence microscopy. After GCV administration, the tumor volume of the PB/TK group was significantly reduced and the tumor inhibitory rate was 81.96%, compared with 43.07% in the TK group. Histological analysis showed extensive necrosis and lymphocyte infiltration in the tumor tissue of the PB/TK group but only limited changes in the tissue of the control group. TUNEL assays suggested that the transfected cells were undergoing apoptosis after GCV administration in vivo. Conclusion Our results show that the nonviral gene delivery system coupling the PB transposon with PEI can be used for tumor-directed gene therapy in mice.

  10. Prizes for consistency

    Energy Technology Data Exchange (ETDEWEB)

    Hiscock, S.

    1986-07-01

    Consistency in coal quality has become increasingly significant recently, with the current trend towards using coal from a range of sources. A significant development has been the shift in responsibilities for coal quality. The increasing demand for consistency in quality has led to a re-examination of where in the trade and transport chain quality should be assessed and where further upgrading of inspection and preparation facilities is required. Changes are in progress throughout the whole coal transport chain which will improve the consistency of delivered coal quality. These include the installation of beneficiation plants at coal mines, at export terminals, and on the premises of end users. It is suggested that one of the keys to success for the coal industry will be the ability to provide coal of consistent quality.

  11. Consistent Probabilistic Social Choice

    OpenAIRE

    Brandl, Florian; Brandt, Felix; Seedig, Hans Georg

    2015-01-01

    Two fundamental axioms in social choice theory are consistency with respect to a variable electorate and consistency with respect to components of similar alternatives. In the context of traditional non-probabilistic social choice, these axioms are incompatible with each other. We show that in the context of probabilistic social choice, these axioms uniquely characterize a function proposed by Fishburn (Rev. Econ. Stud., 51(4), 683--692, 1984). Fishburn's function returns so-called maximal lotteries.

  12. Internal Consistency, Test–Retest Reliability and Measurement Error of the Self-Report Version of the Social Skills Rating System in a Sample of Australian Adolescents

    Science.gov (United States)

    Vaz, Sharmila; Parsons, Richard; Passmore, Anne Elizabeth; Andreou, Pantelis; Falkmer, Torbjörn

    2013-01-01

    The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States samples (US) are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187), from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test–retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study findings support the idea of using multiple informants (e.g. teacher and parent reports), not just the student, as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID). PMID:24040116
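
    As a hedged illustration of the internal-consistency estimates reported above, Cronbach's alpha for a single subscale can be computed as below; the data layout is an assumption made for the sketch, not taken from the cited study.

```python
# Cronbach's alpha: a common internal-consistency coefficient.
# Assumed layout: `scores` is an (n_respondents, n_items) array for one subscale.
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```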

  13. Internal consistency, test-retest reliability and measurement error of the self-report version of the social skills rating system in a sample of Australian adolescents.

    Directory of Open Access Journals (Sweden)

    Sharmila Vaz

    Full Text Available The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States samples (US) are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187), from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study findings support the idea of using multiple informants (e.g. teacher and parent reports), not just the student, as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID).

  14. Thermal stability and microstructure of GMR-systems consisting of thin metallic films; Thermische Stabilitaet und Mikrostruktur von GMR-Systemen aus duennen metallischen Filmen

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Joerg

    2007-08-31

    In this work the short-term and long-term stability of nanoscale metallic multilayers at elevated temperatures is studied. Reasons and mechanisms for breakdown of the GMR effect have been analyzed by different physical methods. The multilayered samples investigated in this work exhibit a GMR effect of GMR(alloy) = 20.7%, which is significantly smaller than the effect of the standard system with pure Cu interlayers (GMR(Cu) = 25.2%). For protection against oxidation during use, a passivation coating consisting of SiO₂ and Si₃N₄ has been deposited by means of plasma CVD. Typical parameters for this process are times of t_short-term = 1 h in the temperature range of 200 °C; the GMR systems will not break down under these conditions. By applying an Arrhenius relation for the breakdown at the working point WP = 12 mT for test sensors with non-alloyed Cu interlayers, it was possible to predict a lifetime of t_failure = 56000 h for T_use = 150 °C. The maximum lifetime for the alloyed systems was determined to be t_failure = 15000 h. The effective activation energy EA for the breakdown of the tested sensors in the long-term study was in the range 1.4 eV ≤ EA ≤ 1.6 eV, indicating grain-boundary diffusion as the dominant mechanism for the structural changes of the multilayered samples. These changes have been verified by applying different techniques, such as X-ray reflectivity, transmission electron microscopy, and Moessbauer spectroscopy, to three different combinations of materials: the CoFe/Cu basic system, the CoFe/CuAgAu alloyed system, and the model system consisting of Fe/Cu multilayers. The breakdown of the multilayered systems passes through three main stages: samples examined directly after preparation show mixing of different atomic species at the interfaces; this intermixing goes along with lattice disturbances, and together they cause an increased basic

  15. Immunomodulatory activity of enzymatically synthesized glycogen and its digested metabolite in a co-culture system consisting of differentiated Caco-2 cells and RAW264.7 macrophages.

    Science.gov (United States)

    Yasuda, Michiko; Furuyashiki, Takashi; Nakamura, Toshiyuki; Kakutani, Ryo; Takata, Hiroki; Ashida, Hitoshi

    2013-09-01

    Previously, we developed enzymatically synthesized glycogen (ESG) from starch, and showed its immunomodulatory and dietary fiber-like activities. In this study, we investigated the metabolism of ESG and its immunomodulatory activity using differentiated Caco-2 cells as a model of the intestinal barrier. In a co-culture system consisting of differentiated Caco-2 cells and RAW264.7 macrophages, mRNA expression of IL-6, IL-8, IL-1β and BAFF cytokines was up-regulated in Caco-2 cells and IL-8 production in the basolateral medium was induced after 24 h of apical treatment with 5 mg ml⁻¹ of ESG. The mRNA level of iNOS was also up-regulated in RAW264.7 macrophages. After characterization of the binding of anti-glycogen monoclonal antibodies (IV58B6 and ESG1A9) to ESG and its digested metabolite, resistant glycogen (RG), an enzyme-linked immunosorbent assay (ELISA) system was developed to quantify ESG and RG. Using this system, we investigated the metabolism of ESG in differentiated Caco-2 cells. When ESG (7000 kDa, 5 mg ml⁻¹) was added to the apical side of Caco-2 monolayers, ESG disappeared and RG (about 3000 kDa, 3.5 mg ml⁻¹) appeared in the apical solution during a 24 h incubation. Neither ESG nor RG was detected in the basolateral solution. In addition, both ESG and RG were bound to TLR2 in Caco-2 cells. In conclusion, we suggest that ESG is metabolized to an RG-like structure in the intestine, and this metabolite activates the immune system via stimulation of the intestinal epithelium, although neither ESG nor its metabolite could permeate the intestinal cells under our experimental conditions. These results provide evidence for the beneficial function of ESG as a food ingredient.

  16. Consistent data recording across a health system and web-enablement allow service quality comparisons: online data for commissioning dermatology services.

    Science.gov (United States)

    Dmitrieva, Olga; Michalakidis, Georgios; Mason, Aaron; Jones, Simon; Chan, Tom; de Lusignan, Simon

    2012-01-01

    A new distributed model of health care management is being introduced in England. Family practitioners have new responsibilities for the management of health care budgets and the commissioning of services. There are national datasets available about health care providers and the geographical areas they serve. These data could be better used to assist the family practitioners turned health service commissioners. Unfortunately these data are not in a form that is readily usable by these fledgling family commissioning groups. We therefore web-enabled all the national hospital dermatology treatment data in England, combining it with locality data to provide a smart commissioning tool for local communities. We used open-source software including the Ruby on Rails web framework and MySQL. The system has a web front-end, which uses hypertext markup language, cascading style sheets (HTML/CSS) and JavaScript to deliver and present data provided by the database. A combination of advanced caching and schema structures allows faster data retrieval on every execution. The system provides an intuitive environment for data analysis and processing across a large health-system dataset. Web-enablement has allowed data about inpatients, day cases and outpatients to be readily grouped, viewed, and linked to other data. The combination of web-enablement, consistent data collection from all providers, readily available locality data, and a registration-based primary care system enables the creation of data which can be used to commission dermatology services in small areas. Standardized datasets collected across large health enterprises, when web-enabled, can readily benchmark local services and inform commissioning decisions.

  17. Self-consistent phonons: An accurate and practical method to account for anharmonic effects in equilibrium properties of general classical or quantum many-body systems

    Science.gov (United States)

    Brown, Sandra E.; Mandelshtam, Vladimir A.

    2016-12-01

    The self-consistent phonons (SCP) method is a practical approach for computing structural and dynamical properties of a general quantum or classical many-body system while incorporating anharmonic effects. However, a convincing demonstration of the accuracy of SCP and its advantages over the standard harmonic approximation is still lacking. Here we apply SCP to classical Lennard-Jones (LJ) clusters and compare with numerically exact results. The close agreement between the two reveals that SCP accurately describes structural properties of the classical LJ clusters from zero temperature (where the method is exact) up to the temperatures at which the chosen cluster conformation becomes unstable. Given the similarities between thermal and quantum fluctuations, both physically and within the SCP ansatz, the accuracy of classical SCP over a range of temperatures suggests that quantum SCP is also accurate over a range of the quantum de Boer parameter Λ = ℏ/(σ√(mε)), which describes the degree of quantum character of the system.

  18. Cache Consistency by Design

    NARCIS (Netherlands)

    Brinksma, Hendrik

    In this paper we present a proof of the sequential consistency of the lazy caching protocol of Afek, Brown, and Merritt. The proof will follow a strategy of stepwise refinement, developing the distributed caching memory in five transformation steps from a specification of the serial memory, whilst

  19. Exploring connections between statistical mechanics and Green's functions for realistic systems: Temperature dependent electronic entropy and internal energy from a self-consistent second-order Green's function

    Science.gov (United States)

    Welden, Alicia Rae; Rusakov, Alexander A.; Zgid, Dominika

    2016-11-01

    Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.

  20. Exploring connections between statistical mechanics and Green's functions for realistic systems: Temperature dependent electronic entropy and internal energy from a self-consistent second-order Green's function.

    Science.gov (United States)

    Welden, Alicia Rae; Rusakov, Alexander A; Zgid, Dominika

    2016-11-28

    Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.

  1. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders on the other.

  2. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  3. Description of nuclear systems with a self-consistent configuration-mixing approach. II. Application to structure and reactions in even-even sd-shell nuclei

    Science.gov (United States)

    Robin, C.; Pillet, N.; Dupuis, M.; Le Bloas, J.; Peña Arteaga, D.; Berger, J.-F.

    2017-04-01

    Background: The variational multiparticle-multihole configuration mixing approach to nuclei was proposed about a decade ago. While the first applications followed rapidly, the implementation of the full formalism of this method has only recently been completed and applied, in C. Robin, N. Pillet, D. Peña Arteaga, and J.-F. Berger, Phys. Rev. C 93, 024302 (2016), 10.1103/PhysRevC.93.024302, to 12C as a test case. Purpose: The main objective of the present paper is to carry on the study that was initiated in that reference, in order to put the variational multiparticle-multihole configuration mixing method to more stringent tests. To that aim we perform a systematic study of even-even sd-shell nuclei. Method: The wave function of these nuclei is taken as a configuration mixing built on orbitals of the sd-shell, and both the mixing coefficients of the nuclear state and the single-particle wave functions are determined consistently from the same variational principle. As in the previous works, the calculations are done using the D1S Gogny force. Results: Various ground-state properties are analyzed. In particular, the correlation content and composition of the wave function as well as the single-particle orbitals and energies are examined. Binding energies and charge radii are also calculated and compared to experiment. The description of the first excited state is also examined, and the corresponding transition densities are used as input for the calculation of reaction processes such as inelastic electron and proton scattering. Special attention is paid to the effect of optimizing the single-particle states consistently with the correlations of the system. Conclusions: The variational multiparticle-multihole configuration mixing approach is systematically applied to the description of even-even sd-shell nuclei. Globally, the results are satisfying and encouraging. In particular, charge radii and excitation energies are nicely reproduced. However

  4. Theoretical modeling of large molecular systems. Advances in the local self consistent field method for mixed quantum mechanics/molecular mechanics calculations.

    Science.gov (United States)

    Monari, Antonio; Rivail, Jean-Louis; Assfeld, Xavier

    2013-02-19

    Molecular mechanics methods can efficiently compute the macroscopic properties of a large molecular system but cannot represent the electronic changes that occur during a chemical reaction or an electronic transition. Quantum mechanical methods can accurately simulate these processes, but they require considerably greater computational resources. Because electronic changes typically occur in a limited part of the system, such as the solute in a molecular solution or the substrate within the active site of enzymatic reactions, researchers can limit the quantum computation to this part of the system. Researchers take into account the influence of the surroundings by embedding this quantum computation into a calculation of the whole system described at the molecular mechanical level, a strategy known as the mixed quantum mechanics/molecular mechanics (QM/MM) approach. The accuracy of this embedding varies according to the types of interactions included, whether they are purely mechanical or classically electrostatic. This embedding can also introduce the induced polarization of the surroundings. The difficulty in QM/MM calculations comes from the splitting of the system into two parts, which requires severing the chemical bonds that link the quantum mechanical subsystem to the classical subsystem. Typically, researchers replace the quantoclassical atoms, those at the boundary between the subsystems, with a monovalent link atom. For example, researchers might add a hydrogen atom when a C-C bond is cut. This Account describes another approach, the Local Self Consistent Field (LSCF), which was developed in our laboratory. LSCF links the quantum mechanical portion of the molecule to the classical portion using a strictly localized bond orbital extracted from a small model molecule for each bond. In this scenario, the quantoclassical atom has an apparent nuclear charge of +1. To achieve correct bond lengths and force constants, we must take into account the inner shell of

  5. Generating Consistent Program Tutorials

    DEFF Research Database (Denmark)

    Vestdam, Thomas

    2002-01-01

    In this paper we present a tool that supports construction of program tutorials. A program tutorial provides the reader with an understanding of an example program by interleaving fragments of source code and explaining text. An example program can, for example, illustrate how to use a library or a framework. We present a means for specifying the fragments of a program that are to be in-lined in the tutorial text. These in-line fragments are defined by addressing named syntactical elements, such as classes and methods, but it is also possible to address individual code lines by labeling them with source markers. The tool helps ensure consistency between the program tutorial and the example programs by extracting fragments of source code based on the fragment specifications and by detecting when a program tutorial addresses program fragments that do not exist. The program tutorials are presented...

  6. Evaluation of the accuracy, consistency, and stability of measurements of the Planck constant used in the redefinition of the international system of units

    Science.gov (United States)

    Possolo, Antonio; Schlamminger, Stephan; Stoudt, Sara; Pratt, Jon R.; Williams, Carl J.

    2018-02-01

    The Consultative Committee for Mass and Related Quantities (CCM) of the International Committee for Weights and Measures (CIPM) has recently declared the readiness of the community to support the redefinition of the International System of Units (SI) at the next meeting of the General Conference on Weights and Measures (CGPM) scheduled for November 2018. Such a redefinition will replace the International Prototype of the Kilogram (IPK), as the definition and sole primary realization of the unit of mass, with a definition involving the Planck constant, h. This redefinition in terms of a fundamental constant of nature will enable widespread primary realizations not only of the kilogram but also of its multiples and sub-multiples, best addressing the full range of practical needs in the measurement of mass. We review and discuss the statistical models and statistical data reductions, uncertainty evaluations, and substantive arguments that support the verification of several technical preconditions for the redefinition that the CCM has established, and whose verification the CCM has affirmed. These conditions relate to the accuracy and mutual consistency of qualifying measurement results. We also review an issue that has surfaced only recently, concerning the convergence toward a stable value of the historical values that the Task Group on Fundamental Constants of the Committee on Data for Science and Technology (CODATA TGFC) has recommended for h over the years, even though the CCM has not deemed this issue to be relevant. We conclude that no statistically significant trend can be substantiated for these recommended values, but note that cumulative consensus values that may be derived from the historical measurement results for h seem to have converged while continuing to exhibit fluctuations that are typical of a process in statistical control. Finally, we argue that the most recent consensus value derived from the best measurements available for h, obtained using
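
    The consensus values and mutual-consistency checks discussed above can be illustrated with a standard inverse-variance weighted mean and a Birge ratio; the sketch below is generic, and the numbers it would be fed are placeholders, not the CODATA data.

```python
# Weighted consensus value and Birge ratio for a set of measurement results of h.
# A Birge ratio near 1 indicates that the stated uncertainties are mutually consistent.
import numpy as np

def consensus_and_birge(values, uncertainties):
    values = np.asarray(values, dtype=float)
    u = np.asarray(uncertainties, dtype=float)
    w = 1.0 / u**2
    mean = np.sum(w * values) / np.sum(w)   # inverse-variance weighted mean
    u_mean = np.sqrt(1.0 / np.sum(w))       # standard uncertainty of the mean
    chi2 = np.sum(((values - mean) / u) ** 2)
    birge = np.sqrt(chi2 / (len(values) - 1))
    return mean, u_mean, birge
```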

  7. Electrical Trees in a Composite Insulating System Consisted of Epoxy Resin and Mica: The Case of Multiple Mica Sheets For Machine Insulation

    Directory of Open Access Journals (Sweden)

    V. A. Kioussis

    2014-08-01

    Full Text Available Epoxy resin and mica sheets constitute the essential insulation of rotating-machine stator bars. Such insulation, although very resistant to partial discharges, is subjected to considerable electrical stress, and consequently electrical trees may ensue. In this paper, an effort is made to simulate electrical tree propagation in multiple epoxy resin/mica sheets with the aid of Cellular Automata (CA). An attempt to compare the simulation results with experimental results is also made.

  8. EQUATION OF STATE IN FORM WHICH RELATES MOL FRACTION AND MOLARITY OF TWO (OR MORE) COMPONENT THERMODYNAMIC SYSTEM CONSISTED OF IDEAL GASES, AND IT'S APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Marko Popović

    2010-01-01

    Full Text Available Most people would face a problem if there were a need to calculate the mole fraction of a substance A in a gaseous solution (a thermodynamic system containing two or more ideal gases) knowing its molarity at a given temperature and pressure. For most it would take a lot of time and calculation to find the answer, especially because the quantities of the other substances in the system are not given. An even greater problem arises when we try to understand how special relativity affects gaseous systems, especially solutions and systems in equilibrium. In this paper formulas are suggested that greatly shorten the conversion from molarity to mole fraction and give better insight into the relativistic effects on a gaseous system.
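
    As a hedged sketch of the kind of shortcut the paper points to (our own rearrangement of the ideal-gas law, not necessarily the authors' formula): because the total molar concentration of an ideal-gas mixture is fixed by p and T alone, the mole fraction follows from the molarity without knowing the amounts of the other components,

```latex
% Ideal-gas mixture: p V = n_{tot} R T  \;\Rightarrow\;  c_{tot} = n_{tot}/V = p/(RT)
x_A \;=\; \frac{c_A}{c_{tot}} \;=\; \frac{c_A R T}{p}.
```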

  9. The systemic integration of international law by domestic courts: domestic judges as architects of the consistency of the international legal order

    NARCIS (Netherlands)

    d' Aspremont, J.; Fauchald, O.K.; Nollkaemper, A.

    2012-01-01

    The paper aims at appraising whether domestic courts, because of different legal and institutional constraints, construe the systemic character of the international legal order differently from international courts and international legal scholars. After recalling the extent to which international

  10. Exploring connections between statistical mechanics and Green's functions for realistic systems. Temperature dependent entropy and internal energy from a self-consistent second-order Green's function

    CERN Document Server

    Welden, Alicia Rae; Zgid, Dominika

    2016-01-01

    Including finite-temperature effects in electronic structure calculations of semiconductors and metals is frequently necessary, but can become cumbersome using zero-temperature methods, which require an explicit evaluation of excited states to extend the approach to finite temperature. Using a Matsubara Green's function formalism, it is easy to include the effects of temperature and to connect dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. We evaluate the thermodynamic quantities with a self-consistent Green's function where the self-energy is approximated to second order (GF2). To validate our method, we benchmark it against finite-temperature full configuration interaction (FCI) calculations for a hydrogen fluoride (HF) molecule and find excellent agreement at high temperatures and very good agreement at low temperatures. Then, we proceed to evaluate thermodynamic quantities for a one-dimensional hydrogen solid at various interatomic separations and temperatures.

  11. The expression pattern of EVA1C, a novel Slit receptor, is consistent with an axon guidance role in the mouse nervous system.

    Directory of Open Access Journals (Sweden)

    Gregory James

    Full Text Available The Slit/Robo axon guidance families play a vital role in the formation of neural circuitry within select regions of the developing mouse nervous system. Typically Slits signal through the Robo receptors; however, they also have Robo-independent functions. The novel Slit receptor Eva-1, recently discovered in C. elegans and whose human orthologue is located in the Down syndrome critical region on chromosome 21, could account for some of these Robo-independent functions as well as provide selectivity to Robo-mediated axon responses to Slit. Here we investigate the expression of the mammalian orthologue EVA1C in regions of the developing mouse nervous system which have been shown to exhibit Robo-dependent and -independent responses to Slit. We report that EVA1C is expressed by axons contributing to commissures, tracts and nerve pathways of the developing spinal cord and forebrain. Furthermore, it is expressed by axons that display both Robo-dependent and -independent functions of Slit, supporting a role for EVA1C in Slit/Robo-mediated neural circuit formation in the developing nervous system.

  12. Fc-based delivery system enhances immunogenicity of a tuberculosis subunit vaccine candidate consisting of the ESAT-6:CFP-10 complex.

    Science.gov (United States)

    Farsiani, Hadi; Mosavat, Arman; Soleimanpour, Saman; Sadeghian, Hamid; Akbari Eydgahi, Mohammad Reza; Ghazvini, Kiarash; Sankian, Mojtaba; Aryan, Ehsan; Jamehdar, Saeid Amel; Rezaee, Seyed Abdolrahim

    2016-06-21

    Tuberculosis (TB) remains a major global health threat despite chemotherapy and Bacillus Calmette-Guérin (BCG) vaccination. Therefore, a safer and more effective vaccine against TB is urgently needed. This study evaluated the immunogenicity of a recombinant fusion protein consisting of early secreted antigenic target protein 6 kDa (ESAT-6), culture filtrate protein 10 kDa (CFP-10) and the Fc-domain of mouse IgG2a as a novel subunit vaccine. The recombinant expression vectors (pPICZαA-ESAT-6:CFP-10:Fcγ2a and pPICZαA-ESAT-6:CFP-10:His) were transferred into Pichia pastoris. After SDS-PAGE and immunoblotting, the immunogenicity of the recombinant proteins was evaluated in mice. When both recombinant proteins (ESAT-6:CFP-10:Fcγ2a and ESAT-6:CFP-10:His) were used for vaccination, Th1-type cellular responses were induced, producing high levels of IFN-γ and IL-12. However, the Fc-tagged recombinant protein induced more effective Th1-type cellular responses, with a small increase in IL-4, as compared to the BCG and ESAT-6:CFP-10:His groups. Moreover, mice primed with BCG and then supplemented with ESAT-6:CFP-10:Fcγ2a produced the highest levels of IFN-γ and IL-12 among the immunized groups. The findings indicate that when Fcγ2a is fused to the ESAT-6:CFP-10 complex, as a delivery vehicle, there could be an increase in the immunogenicity of this type of subunit vaccine. Therefore, additional investigations are necessary for the development of appropriate Fc-based tuberculosis vaccines.

  13. Informal uncertainty analysis (GLUE) of continuous flow simulation in a hybrid sewer system with infiltration inflow - Consistency of containment ratios in calibration and validation?

    DEFF Research Database (Denmark)

    Breinholt, Anders; Grum, Morten; Madsen, Henrik

    2013-01-01

    Monitoring of flows in sewer systems is increasingly applied to calibrate urban drainage models used for long-term simulation. However, most often models are calibrated without considering the uncertainties. The generalized likelihood uncertainty estimation (GLUE) methodology is here applied to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction ... rain inputs and more accurate flow observations to reduce parameter and model simulation uncertainty. © Author(s) 2013.

  14. Development of natural treatment system consisting of black soil and Kentucky bluegrass for the post-treatment of anaerobically digested strong wastewater.

    Science.gov (United States)

    Chen, Xiaochen; Fukushi, Kensuke

    2016-03-01

    To develop a sound post-treatment process for anaerobically-digested strong wastewater, a novel natural treatment system comprising two units is put forward. The first unit, a trickling filter, provides for further reduction of biochemical oxygen demand and adjustable nitrification. The subsequent soil-plant unit aims at removing and recovering the nutrients nitrogen (N), phosphorus (P) and potassium (K). As a lab-scale feasibility study, a soil column test was conducted, in which black soil and valuable Kentucky bluegrass were integrated to treat artificial nutrient-enriched wastewater. After a long-term operation, the nitrification function was well established in the top layers, despite the need for an improved denitrification process prior to discharge. P and K were retained by the soil through distinct mechanisms. Since they either partially or totally remained in plant-available forms in the soil, indirect nutrient reuse could be achieved. As for Kentucky bluegrass, it displayed better growth status when receiving wastewater, with direct recovery of 8%, 6% and 14% of input N, P and K, respectively. Furthermore, the indispensable role of Kentucky bluegrass for better treatment performance was proved, as it enhanced the cell-specific nitrification potential of the soil nitrifying microorganisms inhabiting the rhizosphere. After further upgrade, the proposed system is expected to become a new solution for strong wastewater pollution. Copyright © 2015. Published by Elsevier B.V.

  15. The impact of water on dislocation content and slip system activity in olivine constrained by HR-EBSD and visco-plastic self-consistent simulations

    Science.gov (United States)

    Wallis, D.; Hansen, L. N.; Tasaka, M.; Kumamoto, K. M.; Lloyd, G. E.; Parsons, A. J.; Kohlstedt, D. L.; Wilkinson, A. J.

    2016-12-01

    Changes in the concentration of H+ ions in olivine have impacts on its rheological behaviour and therefore on tectonic processes involving mantle deformation. Deformation experiments on aggregates of wet olivine exhibit different evolution of crystal preferred orientations (CPO) and substructure from experiments on dry olivine, suggesting that elevated H+ concentrations affect the activity of dislocation slip systems. We use high angular-resolution electron backscatter diffraction (HR-EBSD) to map densities of different types of geometrically necessary dislocations (GND) in polycrystalline olivine deformed experimentally under wet and dry conditions and also in nature. HR-EBSD provides unprecedented angular resolution, resolving small misorientations. A lherzolite xenolith from Lesotho reveals the same unusual CPO and similar proportions of dislocation types to the 'wet' experimental samples, supporting the applicability of these findings to natural deformation conditions. These results support suggestions that H+ impacts the flow properties of olivine by altering dislocation activity and climb, while also providing full quantification of GND content. In particular, the relative proportions of dislocation types may provide a basis for identifying olivine deformed under wet and dry conditions.

  16. The oldest known digestive system consisting of both paired digestive glands and a crop from exceptionally preserved trilobites of the Guanshan Biota (Early Cambrian, China).

    Directory of Open Access Journals (Sweden)

    Melanie J Hopkins

    Full Text Available The early Cambrian Guanshan biota of eastern Yunnan, China, contains exceptionally preserved animals and algae. Most diverse and abundant are the arthropods, of which there are at least 11 species of trilobites represented by numerous specimens. Many trilobite specimens show soft-body preservation via iron oxide pseudomorphs of pyrite replacement. Here we describe digestive structures from two species of trilobite, Palaeolenus lantenoisi and Redlichia mansuyi. Multiple specimens of both species contain the preserved remains of an expanded stomach region (a "crop") under the glabella, a structure which has not been observed in trilobites this old, despite numerous examples of trilobite gut traces from other Cambrian Lagerstätten. In addition, at least one specimen of Palaeolenus lantenoisi shows the preservation of an unusual combination of digestive structures: a crop and paired digestive glands along the alimentary tract. This combination of digestive structures has also never been observed in trilobites this old, and is rare in general, with prior evidence of it from one juvenile trilobite specimen from the late Cambrian Orsten fauna of Sweden and possibly one adult trilobite specimen from the Early Ordovician Fezouata Lagerstätte. The variation in the fidelity of preservation of digestive structures within and across different Lagerstätten may be due to variation in the type, quality, and point of digestion of food among specimens in addition to differences in mode of preservation. The presence and combination of these digestive features in the Guanshan trilobites contradicts current models of how the trilobite digestive system was structured and evolved over time. Most notably, the crop is not a derived structure as previously proposed, although it is possible that the relative size of the crop increased over the evolutionary history of the clade.

  17. The oldest known digestive system consisting of both paired digestive glands and a crop from exceptionally preserved trilobites of the Guanshan Biota (Early Cambrian, China).

    Science.gov (United States)

    Hopkins, Melanie J; Chen, Feiyang; Hu, Shixue; Zhang, Zhifei

    2017-01-01

    The early Cambrian Guanshan biota of eastern Yunnan, China, contains exceptionally preserved animals and algae. Most diverse and abundant are the arthropods, of which there are at least 11 species of trilobites represented by numerous specimens. Many trilobite specimens show soft-body preservation via iron oxide pseudomorphs of pyrite replacement. Here we describe digestive structures from two species of trilobite, Palaeolenus lantenoisi and Redlichia mansuyi. Multiple specimens of both species contain the preserved remains of an expanded stomach region (a "crop") under the glabella, a structure which has not been observed in trilobites this old, despite numerous examples of trilobite gut traces from other Cambrian Lagerstätten. In addition, at least one specimen of Palaeolenus lantenoisi shows the preservation of an unusual combination of digestive structures: a crop and paired digestive glands along the alimentary tract. This combination of digestive structures has also never been observed in trilobites this old, and is rare in general, with prior evidence of it from one juvenile trilobite specimen from the late Cambrian Orsten fauna of Sweden and possibly one adult trilobite specimen from the Early Ordovician Fezouata Lagerstätte. The variation in the fidelity of preservation of digestive structures within and across different Lagerstätten may be due to variation in the type, quality, and point of digestion of food among specimens in addition to differences in mode of preservation. The presence and combination of these digestive features in the Guanshan trilobites contradicts current models of how the trilobite digestive system was structured and evolved over time. Most notably, the crop is not a derived structure as previously proposed, although it is possible that the relative size of the crop increased over the evolutionary history of the clade.

  18. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating several equations simultaneously with respect to the same number of variables.
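
    A minimal sketch of the scalar form of hull consistency mentioned above (one equation narrowed with respect to one variable) is given below, using plain interval tuples; a real interval solver would iterate this over all equations and variables of the nonlinear system.

```python
# Scalar hull consistency on the single equation x + y = 10 over interval boxes.
def sub(a, b):
    """Interval subtraction a - b for tuples (lo, hi)."""
    return (a[0] - b[1], a[1] - b[0])

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: no solution in this box")
    return (lo, hi)

x, y = (0.0, 8.0), (4.0, 9.0)
ten = (10.0, 10.0)
x = intersect(x, sub(ten, y))   # x := x ∩ (10 - y)  ->  (1.0, 6.0)
y = intersect(y, sub(ten, x))   # y := y ∩ (10 - x)  ->  (4.0, 9.0)
print(x, y)
```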

  19. A Survey of Videodisc Technology.

    Science.gov (United States)

    1985-12-01

    machine. Turing’s computer ( Colossus ), using vacuum tubes instead of telephone relays, was used to break these random letter codes. After nine ver- sions...of Colossus were built and by wars end the British were able to unscramble German secret codes in a matter of a few minutes. By the mid 1940s vacuum... Rhode Island, Vermont, and New Hampshire. SOUTH CENTRAL (n=100) 30% Alabama, Arkansas, Delaware, Florida, Georgia, Kentucky, Louisiana, Maryland

  20. Information, Consistent Estimation and Dynamic System Identification.

    Science.gov (United States)

    1976-11-01

    tended in general terms to the infinite parameter case. It is shown that under uniqueness conditions on the output statistics of linear dynamical ... and U is a σ-algebra of subsets of Ω. The observation sequence (z_n) is a stochastic process on a probability space (Ω, U, P*) with values in a ... and B is the σ-algebra of Borel sets in R. We call P* the true measure and θ* the true parameter. The parameter space S is a set such that for each s ∈

  1. Depression and Logical Consistency of Personal Constructs.

    Science.gov (United States)

    Chambers, W. V.; And Others

    1986-01-01

    Explored Neimeyer's notion that moderately depressed people have relatively disorganized personal construct systems, that non-depressed people see themselves consistently positively, highly depressed people view themselves negatively, and moderately depressed people view themselves with ambivalence. Using a grid measure of logical consistency,…

  2. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  3. The self-consistent calculation of pseudo-molecule energy levels, construction of energy level correlation diagrams and an automated computation system for SCF-X(Alpha)-SW calculations

    Science.gov (United States)

    Schlosser, H.

    1981-01-01

    The self-consistent calculation of the electronic energy levels of noble-gas pseudomolecules formed when a metal surface is bombarded by noble-gas ions is discussed, along with the construction of energy level correlation diagrams as a function of interatomic spacing. The self-consistent-field Xα scattered-wave (SCF-Xα-SW) method is utilized. Preliminary results on the Ne-Mg system are given. An interactive Xα programming system, implemented on the LeRC IBM 370 computer, is described in detail. This automated system makes use of special PROCDEFS (procedure definitions) to minimize the data to be entered manually at a remote terminal. Listings of the special PROCDEFS and of typical input data are given.

  4. Corfu: A Platform for Scalable Consistency

    OpenAIRE

    Wei, Michael

    2017-01-01

    Corfu is a platform for building systems which are extremely scalable, strongly consistent and robust. Unlike other systems which weaken guarantees to provide better performance, we have built Corfu with a resilient fabric tuned and engineered for scalability and strong consistency at its core: the Corfu shared log. On top of the Corfu log, we have built a layer of advanced data services which leverage the properties of the Corfu log. Today, Corfu is already replacing data platforms in commer...

  5. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving for physics concepts through consistency of argument can improve students' thinking skills and is an important part of science. The study aims to assess the consistency of students' argumentation on fluid material. The population of this study consists of college students at PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Cluster random sampling yielded a sample of 145 students. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results of the study show average argumentation consistency for correct consistency, incorrect consistency, and inconsistency of 4.85%, 29.93%, and 65.23%, respectively. The data indicate a lack of understanding of the fluid material, whereas full consistency of argument would ideally support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies so as to obtain a positive change in the consistency of argumentation.

  6. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  7. Process Fairness and Dynamic Consistency

    NARCIS (Netherlands)

    S.T. Trautmann (Stefan); P.P. Wakker (Peter)

    2010-01-01

    Abstract: When process fairness deviates from outcome fairness, dynamic inconsistencies can arise as in nonexpected utility. Resolute choice (Machina) can restore dynamic consistency under nonexpected utility without using Strotz's precommitment. It can similarly justify dynamically

  8. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  9. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  10. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
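    A quick numerical check of the identity behind the principle: for a quadratic total energy in the natural energy variables, the ensemble-mean energy decomposes exactly into the energy of the ensemble mean plus the trace of the (biased) ensemble covariance. The snippet below is an illustrative sketch, not the authors' code; the energy is simply taken as the sum of squares of the state variables.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))        # ensemble: 500 members, 3 energy variables

mean_energy = np.mean(np.sum(X**2, axis=1))        # <E> averaged over members
mean = X.mean(axis=0)
cov = np.cov(X, rowvar=False, bias=True)           # biased ensemble covariance
decomposed = np.sum(mean**2) + np.trace(cov)       # E(mean) + total variance

print(mean_energy, decomposed)                     # agree to floating-point error
```

    Between observations, energy-conserving dynamics keep the left-hand quantity constant, so the sum on the right must be conserved as well, which is the statement of the principle.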

  11. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x...-anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  12. Anchoring Vignettes and Response Consistency

    OpenAIRE

    Arie Kapteyn; Smith, James P.; Arthur van Soest

    2011-01-01

    The use of anchoring vignettes to correct for differential item functioning rests upon two identifying assumptions: vignette equivalence and response consistency. To test the second assumption the authors conduct an experiment in which respondents in an Internet panel are asked to both describe their health in a number of domains and rate their health in these domains. In a subsequent interview respondents are shown vignettes that are in fact descriptions of their own health. Under response c...

  13. On Modal Refinement and Consistency

    DEFF Research Database (Denmark)

    Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej

    2007-01-01

    Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard...

  14. The Consistent Vehicle Routing Problem

    Energy Technology Data Exchange (ETDEWEB)

    Groer, Christopher S [ORNL; Golden, Bruce [University of Maryland; Edward, Wasil [American University

    2009-01-01

    In the small package shipping industry (as in other industries), companies try to differentiate themselves by providing high levels of customer service. This can be accomplished in several ways, including online tracking of packages, ensuring on-time delivery, and offering residential pickups. Some companies want their drivers to develop relationships with customers on a route and have the same drivers visit the same customers at roughly the same time on each day that the customers need service. These service requirements, together with traditional constraints on vehicle capacity and route length, define a variant of the classical capacitated vehicle routing problem, which we call the consistent VRP (ConVRP). In this paper, we formulate the problem as a mixed-integer program and develop an algorithm to solve the ConVRP that is based on the record-to-record travel algorithm. We compare the performance of our algorithm to the optimal mixed-integer program solutions for a set of small problems and then apply our algorithm to five simulated data sets with 1,000 customers and a real-world data set with more than 3,700 customers. We provide a technique for generating ConVRP benchmark problems from vehicle routing problem instances given in the literature and provide our solutions to these instances. The solutions produced by our algorithm on all problems do a very good job of meeting customer service objectives with routes that have a low total travel time.

  15. Consistency relations in effective field theory

    Science.gov (United States)

    Munshi, Dipak; Regan, Donough

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression of the EFT relative to the SPT results, which scales as the square of the wave number k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.

  16. Development of a portable PEM fuel cell system with bipolar plates consisting an electronically conductive thermoplastic Compound material; Entwicklung eines portablen PEM-Brennstoffzellensystems mit Bipolarplatten aus einem elektronisch leitfaehigen thermoplastischen Compound-Material

    Energy Technology Data Exchange (ETDEWEB)

    Niemzig, O.C.

    2005-07-18

    In order to meet the cost targets of PEM fuel cells for commercialization, significant cost reductions of cell stack components like membrane/electrode assemblies and bipolar plates have become key aspects of research and development. Central topics of this work are the bipolar plates and humidification for portable applications. In an extensive screening of a variety of carbon polymer compounds with polypropylene as matrix, the best results concerning conductivity were achieved with the carbon black/graphite/polypropylene-based system. Successful tests of this material in a fuel cell stack were performed, as well as the proof of suitability concerning material and manufacturing costs. Depending on the application, a decrease of material cost to 2 Euro/kg and 1.8 Euro/kW seems possible. Finally, bipolar plates consisting of a selected carbon polymer compound were successfully integrated and tested in a 20-cell stack, which was implemented in a portable PEFC demonstrator unit with a power output between 50 and 150 W. (orig.)

  17. RE-Shaping. Shaping an effective and efficient European renewable energy market. D20 Report. Consistency with other EU policies, System and Market integration. A Smart Power Market at the Centre of a Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Neuhoff, K.; Boyd, R.; Grau, T. [Climate Policy Initiative, German Institute for Economic Research (DIW Berlin), Berlin (Germany); Hobbs, B.; Newbery, D. [Electricity Policy Research Group, University of Cambridge, Cambridge (United Kingdom); Borggrefe, F. [University of Cologne, Cologne (Germany); Barquin, J.; Echavarren, F. [Universidad Pontificia Comillas, Madrid (Spain); Bialek, J.; Dent, C. [Durham University, Durham (United Kingdom); Con Hirschhausen, C. [Technical University of Berlin, Berlin (Germany); Kunz, F.; Weigt, H. [Technical University of Dresden, Dresden (Germany); Nabe, C.; Papaefthymiou, G. [Ecofys Germany, Berlin (Germany); Weber, C. [Duisberg-Essen University, Duisburg-Essen (Germany)

    2011-10-15

    The core objective of the RE-Shaping project is to assist Member State governments in preparing for the implementation of Directive 2009/28/EC (on the promotion of the use of energy from renewable sources) and to guide a European policy for RES (renewable energy sources) in the mid- to long term. The past and present success of policies for renewable energies will be evaluated and recommendations derived to improve future RES support schemes. The core content of this collaborative research activity comprises: Developing a comprehensive policy background for RES support instruments; Providing the European Commission and Member States with scientifically based and statistically robust indicators to measure the success of currently implemented RES policies; Proposing innovative financing schemes for lower costs and better capital availability in RES financing; Initiation of National Policy Processes which attempt to stimulate debate and offer key stakeholders a meeting place to set and implement RES targets as well as options to improve the national policies fostering RES market penetration; Assessing options to coordinate or even gradually harmonize national RES policy approaches. In the EU, at least 200 gigawatts (GWs) of new and additional renewable electricity sources may be needed by 2020. The aim of this report is to analyse whether the current electricity market and system design is consistent with such an ambitious target. Using an international comparison, we identify opportunities to improve the power market design currently in place across EU countries so as to support the large scale integration of renewable energy sources.

  18. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  19. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
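    As a minimal sketch of what a thermodynamic constraint on rate constants looks like, the following hypothetical example projects fitted rate constants of a three-reaction cycle onto the Wegscheider condition (the product of forward rate constants around a closed cycle equals the product of reverse ones). It is not the authors' TCMC software and uses made-up numbers; it only illustrates the constrained-optimization formulation described above.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical cycle A<->B<->C<->A with rate constants
# k = [kf1, kr1, kf2, kr2, kf3, kr3]; thermodynamic consistency requires
# kf1*kf2*kf3 == kr1*kr2*kr3 (Wegscheider condition).
k_fitted = np.array([1.2, 0.8, 2.0, 0.5, 0.7, 3.1])   # inconsistent fitted values

def objective(logk):
    # stay as close as possible to the original (log-transformed) fit
    return np.sum((logk - np.log(k_fitted))**2)

def wegscheider(logk):
    kf, kr = logk[0::2], logk[1::2]
    return np.sum(kf) - np.sum(kr)                     # == 0 in log space

res = minimize(objective, np.log(k_fitted),
               constraints=[{"type": "eq", "fun": wegscheider}])
k_consistent = np.exp(res.x)
print(k_consistent)
```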

  20. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  1. Consistent seasonal snow cover depth and duration variability over ...

    Indian Academy of Sciences (India)

    Consistent seasonal snow cover depth and duration, delay days and early melt days of consistent seasonal snow cover at 11 stations spread across different mountain ranges over the WH were analyzed. Mean, maximum and ...

  2. Physically Consistent Eddy-resolving State Estimation and Prediction of the Coupled Pan-Arctic Climate System at Daily to Interannual Time Scales Using the Regional Arctic Climate Model (RACM)

    Science.gov (United States)

    2014-09-30

    qualitatively suggests that the anisotropic rheology may improve central arctic thickness evolution, a comparison of modeled and satellite-derived sea...constrain RASM's climate and sea ice evolution. In addition, the tendency for the circulation biases in WRF to be largest at model top is consistent...Greenland Ice Sheet, ice caps, mountain glaciers and dynamic land vegetation and (ii) investigation of the role of mesoscale eddies and tides on ocean

  3. The Multibeam Advisory Committee (MAC): a search for solutions for collecting consistent high quality multibeam data across multiple ships, systems, and operators in the U.S. Academic Fleet.

    Science.gov (United States)

    Johnson, P. D.; Ferrini, V. L.; Jerram, K.

    2016-12-01

    In 2015 the National Science Foundation funded the University of New Hampshire's Center for Coastal and Ocean Mapping and Lamont-Doherty Earth Observatory, for the second time, to coordinate the effort of standardizing the quality of multibeam echosounder (MBES) data across the U.S. academic fleet. This effort supports 9 different ship operating institutions who manage a total of 12 multibeam-equipped ships carrying 6 different MBES systems, manufactured by two different companies. These MBES are designed to operate over a very wide range of depths and operational modes. The complexity of this endeavor led to the creation of the Multibeam Advisory Committee (MAC), a team of academic and industry experts whose mission is to support the needs of the U.S. academic fleet's multibeam echo sounders through all of the phases of the "life" of a MBES system and its data, from initial acceptance of the system, to recommendations on at-sea acquisition of data, to validation of already installed systems, and finally to the post-survey data evaluation. The main activities of the MAC include 1.) standardizing both the Shipboard Acceptance Testing of all new systems and Quality Assurance Testing of already installed systems, 2.) working with both the ship operators/technicians and the manufacturers of the multibeam systems to guarantee that each MBES is working at its peak performance level, 3.) developing tools that aid in the collection of data, assessment of the MBES hardware, and evaluation of the quality of the MBES data, 4.) creating "best practices" documentation concerning data acquisition and workflow, and 5.) providing a website, http://mac.unols.org, to host technical information, tools, reports, and a "help desk" for operators of the systems to ask questions concerning issues that they see with their systems.

  4. A Primer on Memory Consistency and Cache Coherence

    CERN Document Server

    Sorin, Daniel; Wood, David

    2011-01-01

    Many modern computer systems and most multicore chips (chip multiprocessors) support shared memory in hardware. In a shared memory system, each of the processor cores may read and write to a single shared address space. For a shared memory machine, the memory consistency model defines the architecturally visible behavior of its memory system. Consistency definitions provide rules about loads and stores (or memory reads and writes) and how they act upon memory. As part of supporting a memory consistency model, many machines also provide cache coherence protocols that ensure that multiple cached

  5. 50 CFR 38.8 - Consistency with Federal law.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Consistency with Federal law. 38.8 Section 38.8 Wildlife and Fisheries UNITED STATES FISH AND WILDLIFE SERVICE, DEPARTMENT OF THE INTERIOR (CONTINUED) THE NATIONAL WILDLIFE REFUGE SYSTEM MIDWAY ATOLL NATIONAL WILDLIFE REFUGE Prohibitions § 38.8 Consistency with Federal law. Any provisions o...

  6. Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.

    Science.gov (United States)

    Edwards, H. P.; And Others

    1982-01-01

    Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)

  7. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  8. Consistent quadrupole-octupole collective model

    Science.gov (United States)

    Dobrowolski, A.; Mazurek, K.; Góźdź, A.

    2016-11-01

    Within this work we present a consistent approach to quadrupole-octupole collective vibrations coupled with the rotational motion. A realistic collective Hamiltonian with variable mass-parameter tensor and potential, obtained through the macroscopic-microscopic Strutinsky-like method with a particle-number-projected BCS (Bardeen-Cooper-Schrieffer) approach in the full vibrational and rotational, nine-dimensional collective space, is diagonalized in the basis of projected harmonic oscillator eigensolutions. This orthogonal basis of zero-, one-, two-, and three-phonon oscillator-like functions in the vibrational part, coupled with the corresponding Wigner function, is, in addition, symmetrized with respect to the so-called symmetrization group appropriate to the collective space of the model. In the present model it is the D4 group acting in the body-fixed frame. This symmetrization procedure is applied in order to provide the uniqueness of the Hamiltonian eigensolutions with respect to the laboratory coordinate system. The symmetrization is obtained using the projection onto the irreducible representation technique. The model generates the quadrupole ground-state spectrum as well as the lowest negative-parity spectrum in the 156Gd nucleus. The interband and intraband B(E1) and B(E2) reduced transition probabilities are also calculated within those bands and compared with the recent experimental results for this nucleus. Such a collective approach is helpful in searching for the fingerprints of the possible high-rank symmetries (e.g., octahedral and tetrahedral) in nuclear collective bands.

  9. Interactive Videodisc for Learners of French.

    Science.gov (United States)

    Hancock, Roger

    1985-01-01

    Describes development of a learning module for students studying foreign languages, which usually depicts social behavior between the foreign language nationals to teach communicative competence in face-to-face interactions between nationals and students. Marketing of the module, which was first produced on videotape and later on interactive…

  10. Computer architecture for solving consistent labeling problems

    Energy Technology Data Exchange (ETDEWEB)

    Ullmann, J.R.; Haralick, R.M.; Shapiro, L.G.

    1982-01-01

    Consistent labeling problems are a family of NP-complete constraint satisfaction problems, such as school timetabling, for which a conventional computer may be too slow. There are a variety of techniques for reducing the elapsed time to find one or all solutions to a consistent labeling problem. The paper discusses and illustrates solutions consisting of special hardware to accomplish the required constraint propagation and an asynchronous network of intercommunicating computers to accomplish the tree search in parallel. 5 references.
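    The serial core of such a solver is easy to state; the paper's contribution is doing the constraint propagation in hardware and the tree search on a network of computers. The sketch below is only an illustrative single-machine version: backtracking with forward checking, where `consistent(u1, l1, u2, l2)` is an assumed user-supplied compatibility test.

```python
def solve_labeling(units, labels, consistent):
    """Backtracking with forward checking for a consistent labeling problem:
    assign a label to every unit so that consistent(u1, l1, u2, l2) holds
    for every pair of assigned units.  Illustrative serial sketch only."""
    assignment = {}

    def propagate(u, l, domains):
        # prune labels of unassigned units that conflict with the new assignment
        pruned = {}
        for v, dom in domains.items():
            if v == u or v in assignment:
                pruned[v] = dom
                continue
            keep = {m for m in dom if consistent(u, l, v, m)}
            if not keep:
                return None               # dead end: some unit has no label left
            pruned[v] = keep
        return pruned

    def search(domains):
        if len(assignment) == len(units):
            return dict(assignment)
        u = next(v for v in units if v not in assignment)
        for l in sorted(domains[u]):
            assignment[u] = l
            narrowed = propagate(u, l, domains)
            if narrowed is not None:
                result = search(narrowed)
                if result is not None:
                    return result
            del assignment[u]
        return None

    return search({u: set(labels) for u in units})
```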

  11. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  12. Radiometric consistency in source specifications for lithography

    Science.gov (United States)

    Rosenbluth, Alan E.; Tirapu Azpiroz, Jaione; Lai, Kafai; Tian, Kehan; Melville, David O. S.; Totzeck, Michael; Blahnik, Vladan; Koolen, Armand; Flagello, Donis

    2008-03-01

    There is a surprising lack of clarity about the exact quantity that a lithographic source map should specify. Under the plausible interpretation that input source maps should tabulate radiance, one will find with standard imaging codes that simulated wafer plane source intensities appear to violate the brightness theorem. The apparent deviation (a cosine factor in the illumination pupil) represents one of many obliquity/inclination factors involved in propagation through the imaging system whose interpretation in the literature is often somewhat obscure, but which have become numerically significant in today's hyper-NA OPC applications. We show that the seeming brightness distortion in the illumination pupil arises because the customary direction-cosine gridding of this aperture yields non-uniform solid-angle subtense in the source pixels. Once the appropriate solid angle factor is included, each entry in the source map becomes proportional to the total |E|^2 that the associated pixel produces on the mask. This quantitative definition of lithographic source distributions is consistent with the plane-wave spectrum approach adopted by litho simulators, in that these simulators essentially propagate |E|^2 along the interfering diffraction orders from the mask input to the resist film. It can be shown using either the rigorous Franz formulation of vector diffraction theory, or an angular spectrum approach, that such an |E|^2 plane-wave weighting will provide the standard inclination factor if the source elements are incoherent and the mask model is accurate. This inclination factor is usually derived from a classical Rayleigh-Sommerfeld diffraction integral, and we show that the nominally discrepant inclination factors used by the various diffraction integrals of this class can all be made to yield the same result as the Franz formula when rigorous mask simulation is employed, and further that these cosine factors have a simple geometrical interpretation. On this basis
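    The solid-angle point can be made concrete with a small helper. On a uniform direction-cosine grid, the solid angle of a source pixel is not constant but grows as 1/cosθ, so a tabulated radiance map must be weighted accordingly before it can be read as per-pixel |E|^2 on the mask. The function below is an illustrative sketch of that geometric factor only, not code from the paper; names and the usage comment are assumptions.

```python
import numpy as np

def pixel_solid_angle(sigma_x, sigma_y, d_sigma):
    """Solid angle subtended by a source pixel of side d_sigma on a uniform
    direction-cosine grid (sigma_x, sigma_y):
        dOmega = d_sigma**2 / cos(theta),  cos(theta) = sqrt(1 - sx^2 - sy^2).
    Pixels at larger pupil radius subtend more solid angle."""
    cos_theta = np.sqrt(1.0 - sigma_x**2 - sigma_y**2)
    return d_sigma**2 / cos_theta

# Assumed usage: weight a tabulated radiance map so that each entry becomes
# proportional to the total |E|^2 the pixel produces on the mask:
# weight = radiance * pixel_solid_angle(sx, sy, d_sigma)
```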

  13. Photocatalytic CO2 reduction using water as an electron donor by a powdered Z-scheme system consisting of metal sulfide and an RGO-TiO2 composite.

    Science.gov (United States)

    Takayama, Tomoaki; Sato, Ko; Fujimura, Takehiro; Kojima, Yuki; Iwase, Akihide; Kudo, Akihiko

    2017-06-02

    CuGaS2, (AgInS2)x-(ZnS)2-2x, Ag2ZnGeS4, Ni- or Pb-doped ZnS, (ZnS)0.9-(CuCl)0.1, and ZnGa0.5In1.5S4 showed activities for CO2 reduction to form CO and/or HCOOH in an aqueous solution containing K2SO3 and Na2S as electron donors under visible light irradiation. Among them, CuGaS2 and Ni-doped ZnS photocatalysts showed relatively high activities for CO and HCOOH formation, respectively. CuGaS2 was applied in a powdered Z-scheme system combining with reduced graphene oxide (RGO)-incorporated TiO2 as an O2-evolving photocatalyst. The powdered Z-scheme system produced CO from CO2 in addition to H2 and O2 due to water splitting. Oxygen evolution with an almost stoichiometric amount indicates that water was consumed as an electron donor in the Z-schematic CO2 reduction. Thus, we successfully demonstrated CO2 reduction of artificial photosynthesis using a simple Z-scheme system in which two kinds of photocatalyst powders (CuGaS2 and an RGO-TiO2 composite) were only dispersed in water under 1 atm of CO2.

  14. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  15. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  16. Incompatible multiple consistent sets of histories and measures of quantumness

    Science.gov (United States)

    Halliwell, J. J.

    2017-07-01

    In the consistent histories approach to quantum theory probabilities are assigned to histories subject to a consistency condition of negligible interference. The approach has the feature that a given physical situation admits multiple sets of consistent histories that cannot in general be united into a single consistent set, leading to a number of counterintuitive or contrary properties if propositions from different consistent sets are combined indiscriminately. An alternative viewpoint is proposed in which multiple consistent sets are classified according to whether or not there exists any unifying probability for combinations of incompatible sets which replicates the consistent histories result when restricted to a single consistent set. A number of examples are exhibited in which this classification can be made, in some cases with the assistance of the Bell, Clauser-Horne-Shimony-Holt, or Leggett-Garg inequalities together with Fine's theorem. When a unifying probability exists logical deductions in different consistent sets can in fact be combined, an extension of the "single framework rule." It is argued that this classification coincides with intuitive notions of the boundary between classical and quantum regimes and in particular, the absence of a unifying probability for certain combinations of consistent sets is regarded as a measure of the "quantumness" of the system. The proposed approach and results are closely related to recent work on the classification of quasiprobabilities and this connection is discussed.

  17. Consistency relations in multi-field inflation

    Science.gov (United States)

    Gong, Jinn-Ouk; Seo, Min-Seok

    2018-02-01

    We study the consequences of spatial coordinate transformation in multi-field inflation. Among the spontaneously broken de Sitter isometries, only dilatation in the comoving gauge preserves the form of the metric and thus results in quantum-protected Slavnov-Taylor identities. We derive the corresponding consistency relations between correlation functions of cosmological perturbations in two different ways, by the connected and one-particle-irreducible Green's functions. The lowest-order consistency relations are explicitly given, and we find that even in multi-field inflation the consistency relations in the soft limit are independent of the detail of the matter sector.

  18. Global Adaptation Controlled by an Interactive Consistency Protocol

    Directory of Open Access Journals (Sweden)

    Alina Lenz

    2017-05-01

    Full Text Available Static schedules for systems can lead to an inefficient usage of the resources, because the system’s behavior cannot be adapted at runtime. To improve the runtime system performance in current time-triggered Multi-Processor Systems on Chip (MPSoCs), a dynamic reaction to events is performed locally on the cores. The effects of this optimization can be increased by coordinating the changes globally. To perform such global changes, a consistent view on the system state is needed, on which to base the adaptation decisions. This paper proposes such an interactive consistency protocol with low impact on the system w.r.t. latency and overhead. We show that an energy optimizing adaptation controlled by the protocol can enable a system to save up to 43% compared to a system without adaptation.

  19. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  20. Integrating perspectives on vocal performance and consistency

    OpenAIRE

    Sakata, Jon T.; Vehrencamp, Sandra L.

    2012-01-01

    Recent experiments in divergent fields of birdsong have revealed that vocal performance is important for reproductive success and under active control by distinct neural circuits. Vocal consistency, the degree to which the spectral properties (e.g. dominant or fundamental frequency) of song elements are produced consistently from rendition to rendition, has been highlighted as a biologically important aspect of vocal performance. Here, we synthesize functional, developmental and mechanistic (...

  1. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
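    For orientation, here is a minimal sketch of the classical one-directional mass-diffusion (ProbS) scoring that the letter takes as its starting point; the proposed algorithm additionally runs the diffusion in the reverse direction and combines the two, which is not reproduced here. All names are illustrative.

```python
import numpy as np

def probs_scores(A, user):
    """One-directional mass diffusion (ProbS) on a user-object bipartite
    network.  A is a binary users x objects matrix; returns recommendation
    scores for the given user.  Illustrative baseline only."""
    k_user = np.maximum(A.sum(axis=1), 1)             # user degrees
    k_obj = np.maximum(A.sum(axis=0), 1)              # object degrees
    # W[a, b]: fraction of object b's resource that ends up on object a
    W = (A.T @ (A / k_user[:, None])) / k_obj[None, :]
    scores = W @ A[user]                              # diffuse the user's profile
    scores[A[user] > 0] = -np.inf                     # mask already-collected items
    return scores
```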

  2. Smoothing of Fused Spectral Consistent Satellite Images

    DEFF Research Database (Denmark)

    Sveinsson, Johannes; Aanæs, Henrik; Benediktsson, Jon Atli

    2006-01-01

    Several widely used methods have been proposed for fusing high resolution panchromatic data and lower resolution multi-channel data. However, many of these methods fail to maintain spectral consistency of the fused high resolution image, which is of high importance to many of the applications based... in a statistically meaningful way. The fusion method was called spectral consistent pansharpening (SCP) and it was shown that spectral consistency was a direct consequence of imaging physics and hence guaranteed by the SCP. In this paper we exploit this framework and investigate two smoothing methods for the fused image obtained by SCP. The first smoothing method is based on a Markov random field (MRF) model, while the second method uses wavelet domain hidden Markov models (HMM) for smoothing of the SCP fused image.

  3. Quantifying the consistency of scientific databases.

    Science.gov (United States)

    Šubelj, Lovro; Bajec, Marko; Boshkoska, Biljana Mileva; Kastrin, Andrej; Levnajić, Zoran

    2015-01-01

    Science is a social process with far-reaching impact on our modern society. In recent years, for the first time we are able to scientifically study the science itself. This is enabled by massive amounts of data on scientific publications that is increasingly becoming available. The data is contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in mutual consistency of different databases, which we interpret as recipes for future bibliometric studies.

  4. Dynamically consistent Jacobian inverse for mobile manipulators

    Science.gov (United States)

    Ratajczak, Joanna; Tchoń, Krzysztof

    2016-06-01

    By analogy to the definition of the dynamically consistent Jacobian inverse for robotic manipulators, we have designed a dynamically consistent Jacobian inverse for mobile manipulators built of a non-holonomic mobile platform and a holonomic on-board manipulator. The endogenous configuration space approach has been exploited as a source of conceptual guidelines. The new inverse guarantees a decoupling of the motion in the operational space from the forces exerted in the endogenous configuration space and annihilated by the dual Jacobian inverse. A performance study of the new Jacobian inverse as a tool for motion planning is presented.
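    For reference, the fixed-base notion that the paper generalizes to non-holonomic mobile manipulators is Khatib's dynamically consistent generalized inverse, which decouples operational-space motion from forces in the null space. The snippet below is only a sketch of that fixed-base formula, not the paper's endogenous-configuration-space construction.

```python
import numpy as np

def dynamically_consistent_inverse(J, M):
    """Khatib-style dynamically consistent pseudoinverse for a fixed-base
    manipulator: Jbar = M^-1 J^T (J M^-1 J^T)^-1.  Illustrative only; the
    paper extends the idea to non-holonomic mobile manipulators."""
    Minv = np.linalg.inv(M)                     # joint-space inertia inverse
    Lambda = np.linalg.inv(J @ Minv @ J.T)      # operational-space inertia
    return Minv @ J.T @ Lambda
```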

  5. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  6. Effecting Consistency across Curriculum: A Case Study

    Science.gov (United States)

    Devasagayam, P. Raj; Mahaffey, Thomas R.

    2008-01-01

    Continuous quality improvement is the clarion call across all business schools which is driving the emphasis on assessing the attainment of learning outcomes. An issue that deems special attention in assurance of learning outcomes is related to consistency across courses and, more specifically, across multiple sections of the same course taught by…

  7. 'Ionic crystals' consisting of trinuclear macrocations and ...

    Indian Academy of Sciences (India)

    'Ionic crystals' consisting of trinuclear macrocations and polyoxometalate anions exhibiting single crystal to single crystal transformation: breathing of crystals. T ARUMUGANATHAN, ASHA SIDDIKHA, SAMAR K DAS. REGULAR ARTICLE, Volume 129 ...

  8. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of αs1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  9. On Consistency Maintenance In Service Discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan

    Communication and node failures degrade the ability of a service discovery protocol to ensure Users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency. We

  10. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, P.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  11. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  12. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available from a lexicon containing variants. In this paper we (the authors) address both these issues by creating ‘pseudo-phonemes’ associated with sets of ‘generation restriction rules’ to model those pronunciations that are consistently realised as two or more...

  13. FTA Transit Intelligent Transportation System Architecture Consistency Review - 2010 Update

    Science.gov (United States)

    2011-07-01

    This report provides an assessment on the level of compliance among the FTA grantees with the National ITS Architecture Policy, specifically examining three items: 1. The use and maintenance of Regional ITS Architectures by transit agencies to plan, ...

  14. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of 232Th, 233U, 235U, 238U and 239Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component and fairly well for the γ component, except for cooling times longer than 4000 s. (author)

  15. Consistent metagenomic biomarker detection via robust PCA.

    Science.gov (United States)

    Alshawaqfeh, Mustafa; Bashaireh, Ahmad; Serpedin, Erchin; Suchodolski, Jan

    2017-01-31

    Recent developments of high throughput sequencing technologies allow the characterization of the microbial communities inhabiting our world. Various metagenomic studies have suggested using microbial taxa as potential biomarkers for certain diseases. In practice, the number of available samples varies from experiment to experiment. Therefore, a robust biomarker detection algorithm is needed to provide a set of potential markers irrespective of the number of available samples. Consistent performance is essential to derive solid biological conclusions and to transfer these findings into clinical applications. Surprisingly, the consistency of a metagenomic biomarker detection algorithm with respect to the variation in the experiment size has not been addressed by the current state-of-art algorithms. We propose a consistency-classification framework that enables the assessment of consistency and classification performance of a biomarker discovery algorithm. This evaluation protocol is based on random resampling to mimic the variation in the experiment size. Moreover, we model the metagenomic data matrix as a superposition of two matrices. The first matrix is a low-rank matrix that models the abundance levels of the irrelevant bacteria. The second matrix is a sparse matrix that captures the abundance levels of the bacteria that are differentially abundant between different phenotypes. Then, we propose a novel Robust Principal Component Analysis (RPCA) based biomarker discovery algorithm to recover the sparse matrix. RPCA belongs to the class of multivariate feature selection methods which treat the features collectively rather than individually. This provides the proposed algorithm with an inherent ability to handle the complex microbial interactions. Comprehensive comparisons of RPCA with the state-of-the-art algorithms on two realistic datasets are conducted. Results show that RPCA consistently outperforms the other algorithms in terms of classification accuracy and
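    The low-rank plus sparse decomposition at the heart of the method can be sketched with the standard inexact augmented Lagrangian iteration for principal component pursuit: alternate singular-value thresholding for the low-rank part and entrywise soft thresholding for the sparse part. This is a generic textbook-style sketch, not the authors' implementation, and the parameter choices are illustrative.

```python
import numpy as np

def rpca(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Decompose M into a low-rank part L and a sparse part S (M ~ L + S)
    via a simple inexact augmented Lagrangian scheme.  Illustrative sketch."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or (m * n) / (4.0 * np.abs(M).sum())
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    norm_M = np.linalg.norm(M, 'fro')
    for _ in range(max_iter):
        # low-rank update: singular value thresholding
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # sparse update: entrywise soft thresholding
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y += mu * (M - L - S)
        if np.linalg.norm(M - L - S, 'fro') < tol * norm_M:
            break
    return L, S
```

    In the biomarker setting described above, the entries of S flag taxa whose abundances deviate systematically from the low-rank background and are therefore candidate markers.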

  16. Deriving consistent GSM schemas from DCR graphs

    DEFF Research Database (Denmark)

    Eshuis, Rik; Debois, Søren; Slaats, Tijs

    2016-01-01

    Case Management (CM) is a BPM technology for supporting flexible services orchestration. CM approaches like CMMN, an OMG standard, and GSM, one of CMMN's core influences, use Event-Condition-Action rules, which can be inconsistent due to cyclic interdependencies between the rules; repairing... such an inconsistent case management schema is difficult. To avoid the problem of inconsistencies altogether, we provide a technique for automatically deriving consistent GSM case management schemas from higher-level business policies defined as DCR graphs, an alternative CM approach. Concretely, we define a behaviour-preserving mapping that (1) removes the burden from the modeller of GSM schemas to prove consistency and define the ordering of rules, (2) provides high-level patterns for modelling GSM schemas, and (3) gives a way to define a notion of progress (liveness) and acceptance for GSM instances. The mapping is illustrated...

  17. Consistency relation for cosmic magnetic fields

    DEFF Research Database (Denmark)

    Jain, R. K.; Sloth, M. S.

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  18. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
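    The specification can be reproduced in a few lines of ordinary least squares. The data below are synthetic and purely illustrative (the coefficients are made up to mimic the reported pattern of a significant consistency effect and an insignificant effort effect); only the regression setup corresponds to the paper's description.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 212
X = np.column_stack([
    rng.normal(900, 200, n),     # total minutes online (effort)
    rng.normal(60, 20, n),       # variation of study time (consistency)
    rng.normal(3.0, 0.4, n),     # GPA (motivation proxy)
    rng.normal(10, 5, n),        # post-test minus pre-test (marginal learning)
])
beta_true = np.array([0.0, -0.15, 8.0, 0.4])      # synthetic "true" effects
grade = 55 + X @ beta_true + rng.normal(0, 5, n)

A = np.column_stack([np.ones(n), X])              # add intercept
beta_hat, *_ = np.linalg.lstsq(A, grade, rcond=None)
print(beta_hat)                                   # estimated coefficients
```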

  19. Cloud Standardization: Consistent Business Processes and Information

    OpenAIRE

    Razvan Daniel ZOTA; Lucian-Alexandru FRATILA

    2013-01-01

    Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to the cloud computing standardization as a mean of improving the speed of adoption for the cloud technologies. Moreover, this study tries to show out how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  20. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to the cloud computing standardization as a mean of improving the speed of adoption for the cloud technologies. Moreover, this study tries to show out how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  1. Self-consistent structure of metallic hydrogen

    Science.gov (United States)

    Straus, D. M.; Ashcroft, N. W.

    1977-01-01

    A calculation is presented of the total energy of metallic hydrogen for a family of face-centered tetragonal lattices carried out within the self-consistent phonon approximation. The energy of proton motion is large and proper inclusion of proton dynamics alters the structural dependence of the total energy, causing isotropic lattices to become favored. For the dynamic lattice the structural dependence of terms of third and higher order in the electron-proton interaction is greatly reduced from static lattice equivalents.

  2. On the Consistent Migration of Unsplittable Flows

    DEFF Research Database (Denmark)

    Förster, Klaus-Tycho

    2017-01-01

    In consistent flow migration, the task is to change the paths the flows take in the network, but without inducing congestion during the update process. Even though the rise of Software Defined Networks allows for centralized control of path changes, the execution is still performed in an inherently... of consistently migrating splittable flows is well understood, for the practically more relevant unsplittable flows, few non-heuristic results are known – and upper complexity bounds are missing. We give a dynamic programming algorithm for unsplittable flows, showing the containment in EXPTIME, for both... computation time and schedule length. In particular, there are cases where flows must switch between paths back and forth repeatedly; thus, flow migration is not just an ordering problem. We also study lower bounds and show NP-hardness already for two flows, via reduction from edge-disjoint path problems...

  3. Autonomous Navigation with Constrained Consistency for C-Ranger

    Directory of Open Access Journals (Sweden)

    Shujing Zhang

    2014-06-01

    Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. The ability to carry out localization autonomously while concurrently building an environmental map, in other words simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to achieve truly autonomous navigation. However, the consistency problem of the SLAM system has been largely ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which is developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a local-consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
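
    The local-consistency idea can be sketched in a few lines: the measurement Jacobian for a landmark is evaluated at an estimate frozen at the start of a local period instead of at the latest estimate. The range-only measurement model, state layout and numbers below are illustrative assumptions, not the C-Ranger implementation.

```python
import numpy as np

def range_jacobian(x_r, l):
    """Jacobian of z = ||l - x_r|| w.r.t. the state [x_r, l] (2-D robot, 2-D landmark)."""
    d = l - x_r
    r = np.linalg.norm(d)
    return np.hstack([-d / r, d / r]).reshape(1, 4)

def ekf_range_update(state, P, z, R, lin_landmark):
    """One EKF update for a range measurement to a single landmark.

    lin_landmark is the landmark estimate used as the linearization point:
    in the local-consistency scheme it stays fixed over a local period,
    whereas plain EKF-SLAM would simply pass state[2:4] (the latest estimate).
    """
    x_r, l = state[:2], state[2:4]
    H = range_jacobian(x_r, lin_landmark)      # Jacobian at the frozen landmark estimate
    z_pred = np.linalg.norm(l - x_r)           # prediction still uses the latest state
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    state = state + (K * (z - z_pred)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return state, P

state = np.array([0.0, 0.0, 4.0, 3.0])         # [robot_x, robot_y, landmark_x, landmark_y]
P = np.diag([0.1, 0.1, 1.0, 1.0])
R = np.array([[0.05]])
anchor = state[2:4].copy()                     # linearization point fixed for this local period
for z in (5.1, 4.9, 5.05):                     # simulated range measurements
    state, P = ekf_range_update(state, P, z, R, anchor)
print(state)
```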

  4. Consistency of color representation in smart phones.

    Science.gov (United States)

    Dain, Stephen J; Kwan, Benjamin; Wong, Leslie

    2016-03-01

    One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones are white LED backlit LCD and the Samsung are OLEDs. The color gamut varies between models and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4 and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone4 and ±0.002 for the others, although the spread of white points between models was u'v'±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirror the variation in the primaries. The variation in
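
    The u'v' figures quoted above are CIE 1976 UCS chromaticity coordinates; a small helper for computing them from measured tristimulus values and gauging the spread across handsets might look as follows (the XYZ readings are made up for illustration).

```python
def uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity from tristimulus values."""
    denom = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / denom, 9.0 * Y / denom

# Illustrative: spread of the blue primary across several units of one model.
blue_measurements = [(9.2, 4.1, 48.0), (9.4, 4.0, 47.5), (9.1, 4.2, 48.3)]
coords = [uv_prime(*xyz) for xyz in blue_measurements]
u_vals = [u for u, _ in coords]
v_vals = [v for _, v in coords]
print("u' spread:", max(u_vals) - min(u_vals))
print("v' spread:", max(v_vals) - min(v_vals))
```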

  5. Evaluating Temporal Consistency in Marine Biodiversity Hotspots

    Science.gov (United States)

    Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon’s diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
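
    A toy version of the thresholding logic described above: per-cell, per-year richness is compared against a mean hotspot threshold, and each cell's hotspot frequency across years is computed. The 90th-percentile cutoff and the random data are simplifying assumptions, not the survey's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
years, cells = 8, 500
richness = rng.poisson(20, size=(years, cells))   # species richness per year and grid cell

# Hotspot threshold: mean of the yearly upper-quantile cutoffs (one simple choice).
yearly_cutoffs = np.quantile(richness, 0.90, axis=1)
threshold = yearly_cutoffs.mean()

hotspot = richness >= threshold                   # yearly hotspot designation per cell
frequency = hotspot.mean(axis=0)                  # fraction of years each cell is a hotspot
print("cells that are hotspots in >50% of years:", int((frequency > 0.5).sum()))
```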

  6. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other

  7. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault handling.

  8. Improving analytical tomographic reconstructions through consistency conditions

    CERN Document Server

    Arcadu, Filippo; Stampanoni, Marco; Marone, Federica

    2016-01-01

    This work introduces and characterizes a fast parameterless filter based on the Helgason-Ludwig consistency conditions, used to improve the accuracy of analytical reconstructions of tomographic undersampled datasets. The filter, acting in the Radon domain, extrapolates intermediate projections between those existing. The resulting sinogram, doubled in views, is then reconstructed by a standard analytical method. Experiments with simulated data prove that the peak-signal-to-noise ratio of the results computed by filtered backprojection is improved up to 5-6 dB, if the filter is used prior to reconstruction.
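
    The view-doubling step can be pictured with a short sketch; note that plain linear interpolation between neighbouring projections is used here only as a stand-in, whereas the filter described above derives the intermediate projections from the Helgason-Ludwig consistency conditions.

```python
import numpy as np

def double_views(sinogram):
    """Insert one synthetic projection between each pair of neighbouring views.

    The consistency-based filter extrapolates these intermediate projections;
    linear interpolation is used here only to illustrate the geometry of the step.
    """
    n_views, n_det = sinogram.shape
    out = np.empty((2 * n_views - 1, n_det), dtype=sinogram.dtype)
    out[0::2] = sinogram
    out[1::2] = 0.5 * (sinogram[:-1] + sinogram[1:])
    return out

sino = np.random.rand(64, 128)       # undersampled sinogram (64 views, 128 detector bins)
sino_dense = double_views(sino)      # 127 views, ready for standard filtered backprojection
print(sino_dense.shape)
```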

  9. Numerical Self-Consistent Analysis of VCSELs

    Directory of Open Access Journals (Sweden)

    Robert Sarzała

    2012-01-01

    Full Text Available Vertical-cavity surface-emitting lasers (VCSELs) yield single-longitudinal-mode operation, a low-divergence circular output beam, and low threshold current. This paper gives an overview of theoretical, self-consistent modelling of physical phenomena occurring in a VCSEL. The model has been experimentally confirmed. We present versatile numerical methods for nitride, arsenide, and phosphide VCSELs emitting light at wavelengths varying from violet to near infrared. We also discuss different designs with respect to optical confinement: gain guidance using tunnel junctions and index guidance using oxide confinement or photonic crystals, and we focus on the problem of single-transverse-mode operation.

  10. 36 CFR 219.24 - Science consistency evaluations.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Science consistency evaluations. 219.24 Section 219.24 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE PLANNING National Forest System Land and Resource Management Planning The Contribution of Science...

  11. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the duration of the cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last until the end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering up-to-date, state-of-the-art analyses of this important topic. It sets out to provide the theory, solution techniques...

  12. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodological procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in different methods. In this study, the following three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
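
    A minimal example of the purely statistical route mentioned above: fit a generalized extreme value distribution to annual maximum discharges and read off a low-probability quantile. The synthetic data and the 1000-year return period are illustrative choices.

```python
from scipy.stats import genextreme

# Synthetic series of annual maximum discharges (m^3/s); parameters are arbitrary.
annual_max_q = genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60, random_state=42)

# Fit a GEV to the annual maxima and estimate the 1000-year flood (exceedance prob. 1e-3).
shape, loc, scale = genextreme.fit(annual_max_q)
q1000 = genextreme.ppf(1.0 - 1.0 / 1000.0, shape, loc=loc, scale=scale)
print(f"estimated 1000-year discharge: {q1000:.1f} m^3/s")
```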

  13. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Full Text Available Background: Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results: Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions: The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
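
    The substitution counting behind the lysine-to-arginine signal can be illustrated with a small sketch over a pair of aligned sequences; the two sequences below are invented and the gap handling is deliberately simple.

```python
from collections import Counter

def substitution_counts(mesophile, thermophile):
    """Count amino acid substitutions between two aligned sequences (gaps '-' ignored)."""
    counts = Counter()
    for a, b in zip(mesophile, thermophile):
        if a != b and a != '-' and b != '-':
            counts[(a, b)] += 1
    return counts

meso  = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
therm = "MRTAYIARQRQISFVRSHFSRQLEERLGLIEVQ"
counts = substitution_counts(meso, therm)
print("K->R substitutions:", counts[("K", "R")])
print("all substitutions:", dict(counts))
```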

  14. Action orientation, consistency and feelings of regret

    Directory of Open Access Journals (Sweden)

    Todd McElroy

    2007-12-01

    Full Text Available Previous research has demonstrated that consistency between people's behavior and their dispositions has predictive validity for judgments of regret. Research has also shown that differences in the personality variable of action orientation can influence the ability to regulate negative affect. The present set of studies was designed to investigate how both consistency factors and action-state personality orientation influence judgments of regret. In Study 1, we used a recalled life event to provide a situation in which the person had experienced either an action or inaction. Individuals with an action orientation experienced more regret for situations involving inaction (staying home) than situations involving action (going out). State-oriented individuals, however, maintained high levels of regret and did not differ in their regret ratings across either the action or inaction situations. In Study 2, participants made realistic choices involving either an action or inaction. Our findings revealed the same pattern of results: action-oriented individuals who chose an option that involved not acting (inaction) had more regret than individuals who chose an option that involved acting (action). State-oriented individuals experienced high levels of regret regardless of whether they chose to act or not to act.

  15. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  16. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    A stochastic model for the annual maximum significant wave height, wave period, wind velocity, and water level is presented. The stochastic model includes statistical uncertainty and dependency between the four stochastic variables. Further, a new stochastic model for annual maximum directional significant wave heights is presented. The model includes dependency between the maximum wave heights from neighboring directional sectors. Numerical examples are presented where the models are calibrated using the Maximum Likelihood method to data from the central part of the North Sea. The calibration of the directional distributions is made such that the stochastic model for the omnidirectional maximum wave height is statistically consistent with the directional distribution functions. Finally, it is shown how the stochastic models can be used to estimate characteristic values and in reliability assessment of offshore structures.

  17. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    The paper takes a non-normative and constitutive approach to internal branding by proposing an enablement-oriented communication approach. The conceptual background presents a holistic model of the inside-out process of brand building. This model adopts a theoretical approach to internal branding as a non-normative practice that facilitates constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the non-normative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand-consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such communication creates the organizational conditions adequate to sustain...

  18. Quantum cosmological consistency condition for inflation

    Energy Technology Data Exchange (ETDEWEB)

    Calcagni, Gianluca [Instituto de Estructura de la Materia, CSIC, calle Serrano 121, 28006 Madrid (Spain); Kiefer, Claus [Institut für Theoretische Physik, Universität zu Köln, Zülpicher Strasse 77, 50937 Köln (Germany); Steinwachs, Christian F., E-mail: calcagni@iem.cfmac.csic.es, E-mail: kiefer@thp.uni-koeln.de, E-mail: christian.steinwachs@physik.uni-freiburg.de [Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, Hermann-Herder-Str. 3, 79104 Freiburg (Germany)

    2014-10-01

    We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.

  19. Economic costs of power interruptions: a consistent model and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ghajar, Raymond F. [School of Engineering and Architecture Lebanese American University P.O. Box 36, Byblos (Lebanon); Billinton, Roy [Power Systems Research Group University of Saskatchewan Saskatoon, Sask., S7N 5A9 (Canada)

    2006-01-15

    One of the most basic requirements in cost/benefit assessments of generation and transmission systems is the cost incurred by customers due to power interruptions. This paper provides a consistent set of cost-of-interruption data that can be used to assess the reliability worth of a power system. In addition to these basic data, methodologies for calculating the customer damage functions and the interrupted energy assessment rates for individual load points in the system and for the entire service area are also presented. The proposed model and methodology are illustrated by application to the IEEE reliability test system (IEEE-RTS) [IEEE Reliability Test System, a report prepared by the Reliability Test System Task Force of the Application of Probability Methods Subcommittee, IEEE Trans. on PAS, Vol. PAS-98, No. 6, pp. 2047-2054, November/December 1979].
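
    A simplified numerical illustration of the two quantities the paper builds on, a composite customer damage function formed as a weighted mix of sector damage functions and an interrupted energy assessment rate as cost per unit of unserved energy; the sector values and weights below are invented, not the paper's data.

```python
# Sector customer damage functions: interruption cost ($/kW of interrupted load)
# at a 1-hour outage duration. Values are illustrative, not the paper's data.
sector_cdf_1h = {"residential": 0.5, "commercial": 8.0, "industrial": 12.0}
energy_share = {"residential": 0.40, "commercial": 0.35, "industrial": 0.25}

# Composite customer damage function at 1 hour: energy-weighted mix of sector CDFs.
ccdf_1h = sum(sector_cdf_1h[s] * energy_share[s] for s in sector_cdf_1h)
print(f"composite damage at 1 h: {ccdf_1h:.2f} $/kW")

# Interrupted energy assessment rate: expected interruption cost per unit of
# expected energy not supplied, e.g. from a reliability study of a load point.
expected_cost = 1.8e5      # $/year
expected_eens = 2.4e4      # kWh/year of energy not supplied
iear = expected_cost / expected_eens
print(f"IEAR: {iear:.2f} $/kWh")
```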

  20. Consistency and Refinement for Interval Markov Chains

    DEFF Research Database (Denmark)

    Delahaye, Benoit; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    Interval Markov Chains (IMC), or Markov Chains with probability intervals in the transition matrix, are the base of a classic specification theory for probabilistic systems [18]. The standard semantics of IMCs assigns to a specification the set of all Markov Chains that satisfy its interval...

  1. [Consistent Declarative Memory with Depressive Symptomatology].

    Science.gov (United States)

    Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez

    2012-12-01

    Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The aim was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. Seventy-three university students were evaluated, male and female, between 18 and 40 years old, distributed in two groups: with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) with a cut-off point of 20. There were no significant differences in free and voluntary recall between participants with and without depressive symptomatology, in spite of the fact that both groups had assigned a higher emotional value to the audio-visual test and had associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  2. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Full Text Available Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  3. Trisomy 21 consistently activates the interferon response

    Science.gov (United States)

    Sullivan, Kelly D; Lewis, Hannah C; Hill, Amanda A; Pandey, Ahwan; Jackson, Leisa P; Cabral, Joseph M; Smith, Keith P; Liggett, L Alexander; Gomez, Eliana B; Galbraith, Matthew D; DeGregori, James; Espinosa, Joaquín M

    2016-01-01

    Although it is clear that trisomy 21 causes Down syndrome, the molecular events acting downstream of the trisomy remain ill defined. Using complementary genomics analyses, we identified the interferon pathway as the major signaling cascade consistently activated by trisomy 21 in human cells. Transcriptome analysis revealed that trisomy 21 activates the interferon transcriptional response in fibroblast and lymphoblastoid cell lines, as well as circulating monocytes and T cells. Trisomy 21 cells show increased induction of interferon-stimulated genes and decreased expression of ribosomal proteins and translation factors. An shRNA screen determined that the interferon-activated kinases JAK1 and TYK2 suppress proliferation of trisomy 21 fibroblasts, and this defect is rescued by pharmacological JAK inhibition. Therefore, we propose that interferon activation, likely via increased gene dosage of the four interferon receptors encoded on chromosome 21, contributes to many of the clinical impacts of trisomy 21, and that interferon antagonists could have therapeutic benefits. DOI: http://dx.doi.org/10.7554/eLife.16220.001 PMID:27472900

  4. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Full Text Available Computer services are normally assumed to work well all the time. This usually holds for crucial services such as electronic banking, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that help predict the consistency of their behavior and the quality of harvesting, which is harder because of transient conditions, the large number of services, and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others fail always or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.

  5. Generalized arc consistency for global cardinality constraint

    Energy Technology Data Exchange (ETDEWEB)

    Regin, J.C. [ILOG S.A., Gentilly (France)

    1996-12-31

    A global cardinality constraint (gcc) is specified in terms of a set of variables X = {x_1, ..., x_p} which take their values in a subset of V = {v_1, ..., v_d}. It constrains the number of times a value v_i ∈ V is assigned to a variable in X to lie in an interval [l_i, c_i]. Cardinality constraints have proved very useful in many real-life problems, such as scheduling, timetabling, or resource allocation. A gcc is more general than a constraint of difference, which requires each such interval to be [0, 1]. In this paper, we present an efficient way of implementing generalized arc consistency for a gcc. The algorithm we propose is based on a new theorem of flow theory. Its space complexity is O(|X| × |V|) and its time complexity is O(|X|² × |V|). We also show how this algorithm can efficiently be combined with other filtering techniques.
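
    The flow-based algorithm itself is beyond a short example, but the brute-force check below spells out what generalized arc consistency means for a gcc: every remaining variable-value pair must occur in at least one complete assignment whose value counts fall inside the intervals. It is exponential and meant only as a reference for tiny instances.

```python
from itertools import product

def gcc_satisfied(assignment, bounds):
    """True if each value v is used between bounds[v][0] and bounds[v][1] times."""
    for v, (lo, hi) in bounds.items():
        if not lo <= assignment.count(v) <= hi:
            return False
    return True

def enforce_gac(domains, bounds):
    """Brute-force generalized arc consistency for a global cardinality constraint."""
    pruned = []
    for i, dom in enumerate(domains):
        kept = []
        for val in dom:
            # Keep val only if some complete assignment with x_i = val satisfies the gcc.
            others = domains[:i] + [[val]] + domains[i + 1:]
            if any(gcc_satisfied(list(assign), bounds) for assign in product(*others)):
                kept.append(val)
        pruned.append(kept)
    return pruned

# Three variables over values a/b; 'a' must be used exactly twice, 'b' at most once.
domains = [["a", "b"], ["a", "b"], ["b"]]
bounds = {"a": (2, 2), "b": (0, 1)}
print(enforce_gac(domains, bounds))   # [['a'], ['a'], ['b']]
```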

  6. [The accuracy and consistency of dental radiometers].

    Science.gov (United States)

    Rossouw, S

    2001-11-01

    Radiometers are used in dentistry to evaluate the intensity of light emitted by curing lights. This article discusses the accuracy and consistency of radiometers. The study was done as two experiments, dividing radiometers by age. In experiment 1, one Heliolux II curing light was tested nine times with each of four old radiometers. In experiment 2 the same curing light was tested with three very new radiometers, under identical circumstances. In experiment 1, the average intensities measured by the radiometers ranged between 262 and 348 mW/cm2, while the standard deviation varied between 7.59 and 42.03. In experiment 2, the average intensities measured by the radiometers ranged between 240 and 283.75 mW/cm2, while the standard deviation varied between 0.00 and 4.63. The seven radiometers differed significantly. The consistency of radiometers differs depending on the age of the unit, the state of repair of the unit, and how often it is standardised. In this study it was impossible to evaluate the accuracy of the radiometers.

  7. Merging By Decentralized Eventual Consistency Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed-Nacer Mehdi

    2015-12-01

    Full Text Available The merging mechanism is an essential operation for version control systems. When each member of a collaborative development works on an individual copy of the project, software merging allows modifications made concurrently to be reconciled, as well as managing software change through branching. The collaborative system is in charge of proposing a merge result that includes the users' modifications. The users then have to check and adapt this result. The adaptation should be as effortless as possible; otherwise, the users may get frustrated and quit the collaboration. This paper aims to reduce conflicts during collaboration and improve productivity. It has three objectives: study the users' behavior during collaboration, evaluate the quality of textual merging results produced by specific algorithms, and propose a solution to improve the result quality produced by the default merge tool of distributed version control systems. Through a study of eight open-source repositories totaling more than 3 million lines of code, we observe the behavior of concurrent modifications during the merge procedure. We identify when the existing merge techniques under-perform, and we propose solutions to improve the quality of the merge. We finally compare with the traditional merge tool through a large corpus of collaborative editing.

  8. Sound practices for consistent human visual inspection.

    Science.gov (United States)

    Melchore, James A

    2011-03-01

    Numerous presentations and articles on manual inspection of pharmaceutical drug products have been released, since the pioneering articles on inspection by Knapp and associates Knapp and Kushner (J Parenter Drug Assoc 34:14, 1980); Knapp and Kushner (Bull Parenter Drug Assoc 34:369, 1980); Knapp and Kushner (J Parenter Sci Technol 35:176, 1981); Knapp and Kushner (J Parenter Sci Technol 37:170, 1983). This original work by Knapp and associates provided the industry with a statistical means of evaluating inspection performance. This methodology enabled measurement of individual inspector performance, performance of the entire inspector pool and provided basic suggestions for the conduct of manual inspection. Since that time, numerous subject matter experts (SMEs) have presented additional valuable information for the conduct of manual inspection Borchert et al. (J Parenter Sci Technol 40:212, 1986); Knapp and Abramson (J Parenter Sci Technol 44:74, 1990); Shabushnig et al. (1994); Knapp (1999); Knapp (2005); Cherris (2005); Budd (2005); Barber and Thomas (2005); Knapp (2005); Melchore (2007); Leversee and Ronald (2007); Melchore (2009); Budd (2007); Borchert et al. (1986); Berdovich (2005); Berdovich (2007); Knapp (2007); Leversee and Shabushing (2009); Budd (2009). Despite this abundance of knowledge, neither government regulations nor the multiple compendia provide more than minimal guidance or agreement for the conduct of manual inspection. One has to search the literature for useful information that has been published by SMEs in the field of Inspection. The purpose of this article is to restate the sound principles proclaimed by SMEs with the hope that they serve as a useful guideline to bring greater consistency to the conduct of manual inspection. © 2010 American Association of Pharmaceutical Scientists

  9. ER=EPR, GHZ, and the Consistency of Quantum Measurements

    OpenAIRE

    Susskind, Leonard

    2014-01-01

    This paper illustrates various aspects of the ER=EPR conjecture. It begins with a brief heuristic argument, using the Ryu-Takayanagi correspondence, for why entanglement between black holes implies the existence of Einstein-Rosen bridges. The main part of the paper addresses a fundamental question: Is ER=EPR consistent with the standard postulates of quantum mechanics? Naively it seems to lead to an inconsistency between observations made on entangled systems by different observers. The resolution of the paradox lies in the properties of multiple black holes, entangled in the Greenberger-Horne-Zeilinger pattern.

  10. Bilateral choroidal tumors consistent with metastatic malignant paraganglioma.

    Science.gov (United States)

    Aaberg, Thomas M; Benjamin, Erin P; Biscotti, Charles V; Singh, Arun D

    2013-01-01

    To report a patient with bilateral choroidal metastasis from a malignant paraganglioma. Clinicopathologic case report and literature review. A 68-year-old woman presented with bilateral amelanotic focal choroidal lesions. A thorough systemic work-up for a primary cancer revealed a paraganglioma (extraadrenal pheochromocytoma) and a pheochromocytoma of the left adrenal gland. Fine-needle aspiration biopsy of the choroidal lesion was consistent with metastatic paraganglioma. Metastatic paraganglioma, although rare, has the ability to metastasize to the choroid.

  11. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  12. Globally consistent registration of terrestrial laser scans via graph optimization

    Science.gov (United States)

    Theiler, Pascal Willy; Wegner, Jan Dirk; Schindler, Konrad

    2015-11-01

    In this paper we present a framework for the automatic registration of multiple terrestrial laser scans. The proposed method can handle arbitrary point clouds with reasonable pairwise overlap, without knowledge about their initial orientation and without the need for artificial markers or other specific objects. The framework is divided into a coarse and a fine registration part, which each start with pairwise registration and then enforce consistent global alignment across all scans. While we put forward a complete, functional registration system, the novel contribution of the paper lies in the coarse global alignment step. Merging multiple scans into a consistent network creates loops along which the relative transformations must add up. We pose the task of finding a global alignment as picking the best candidates from a set of putative pairwise registrations, such that they satisfy the loop constraints. This yields a discrete optimization problem that can be solved efficiently with modern combinatorial methods. Having found a coarse global alignment in this way, the framework proceeds by pairwise refinement with standard ICP, followed by global refinement to evenly spread the residual errors. The framework was tested on six challenging, real-world datasets. The discrete global alignment step effectively detects, removes and corrects failures of the pairwise registration procedure, finally producing a globally consistent coarse scan network which can be used as initial guess for the highly non-convex refinement. Our overall system reaches success rates close to 100% at acceptable runtimes < 1 h, even in challenging conditions such as scanning in the forest.
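
    The loop constraint mentioned above can be checked with a small helper: composing the selected pairwise rigid transforms around a loop should return (approximately) the identity, and the residual rotation and translation can be used to score or reject candidate registrations. The transforms below are invented planar motions, not scan data.

```python
import numpy as np

def loop_residual(transforms):
    """Compose 4x4 homogeneous transforms around a loop and measure the residual."""
    T = np.eye(4)
    for T_i in transforms:
        T = T @ T_i
    # Rotation residual (radians) from the trace, translation residual in scan units.
    cos_angle = np.clip((np.trace(T[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_angle), np.linalg.norm(T[:3, 3])

def rigid(yaw, tx, ty):
    """Planar rigid motion as a 4x4 homogeneous transform (rotation about z)."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [tx, ty, 0.0]
    return T

T_ab = rigid(0.5, 2.0, 0.0)
T_bc = rigid(0.5, 2.0, 0.0)
T_ca = np.linalg.inv(T_ab @ T_bc)          # closes the loop exactly
print(loop_residual([T_ab, T_bc, T_ca]))   # ~ (0.0, 0.0) for a consistent loop
```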

  13. Stochastic multi-configurational self-consistent field theory

    CERN Document Server

    Thomas, Robert E; Alavi, Ali; Booth, George H

    2015-01-01

    The multi-configurational self-consistent field theory is considered the standard starting point for almost all multireference approaches required for strongly-correlated molecular problems. The limitation of the approach is generally given by the number of strongly-correlated orbitals in the molecule, as its cost will grow exponentially with this number. We present a new multi-configurational self-consistent field approach, wherein linear determinant coefficients of a multi-configurational wavefunction are optimized via the stochastic full configuration interaction quantum Monte Carlo technique at greatly reduced computational cost, with non-linear orbital rotation parameters updated variationally based on this sampled wavefunction. This extends this approach to strongly-correlated systems with far larger active spaces than it is possible to treat by conventional means. By comparison with this traditional approach, we demonstrate that the introduction of stochastic noise in both the determinant amplitudes an...

  14. A Consistent Adaptive-resolution Smoothed Particle Hydrodynamics Method

    Science.gov (United States)

    Pan, Wenxiao; Hu, Wei; Hu, Xiaozhe; Negrut, Dan; Univ of Wisconsin, Madison Collaboration; Tufts University Collaboration

    2017-11-01

    We seek to accelerate and increase the size of simulations for fluid-structure interactions (FSI) by using adaptive resolutions in the spatial discretization of the equations governing the time evolution of systems displaying two-way fluid-solid coupling. To this end, we propose an adaptive-resolution smoothed particle hydrodynamics (SPH) approach, in which spatial resolutions adaptively vary according to a recovery-based error estimator of velocity gradient as flow evolves. The second-order consistent discretization of spatial differential operators is employed to ensure the accuracy of the proposed method. The convergence, accuracy, and efficiency attributes of the new method are assessed by simulating different flows. In this process, the numerical results are compared to the analytical, finite element, and consistent SPH single-resolution solutions. We anticipate that the proposed adaptive-resolution method will enlarge the class of SPH-tractable FSI applications.

  15. Structures, profile consistency, and transport scaling in electrostatic convection

    DEFF Research Database (Denmark)

    Bian, N.H.; Garcia, O.E.

    2005-01-01

    It is shown that for interchange modes, profile consistency is in fact due to mixing by persistent large-scale convective cells. This mechanism is not a turbulent diffusion, cannot occur in collisionless systems, and is the analog of the well-known laminar "magnetic flux expulsion" in magnetohydrodynamics. This expulsion process involves a "pinch" across closed streamlines and further results in the formation of pressure fingers along the separatrix of the convective cells. By nature, these coherent structures are dissipative, because the mixing process that leads to their formation relies on a finite amount of collisional diffusion. Numerical simulations of two-dimensional interchange modes confirm the role of laminar expulsion by convective cells for profile consistency and structure formation. They also show that the finger-like pressure structures ultimately control the rate of heat transport across the plasma layer.

  16. Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation

    DEFF Research Database (Denmark)

    Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans

    1995-01-01

    We present multiconfigurational self-consistent reaction field theory and an implementation for solvent effects on a solute molecular system that is not in equilibrium with the outer solvent. The approach incorporates two different polarization vectors for studying the influence of the solvent on solute states influenced by the two types of polarization vectors. The general treatment of the correlation problem through the use of complete and restricted active space methodologies makes the present multiconfigurational self-consistent reaction field approach general in that it can handle any type of state: closed-shell, open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromatic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity...

  17. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation.

    Science.gov (United States)

    Lindell, Annukka K

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora.

  18. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora. PMID:28270790

  19. Feeling Expression Using Avatars and Its Consistency for Subjective Annotation

    Science.gov (United States)

    Ito, Fuyuko; Sasaki, Yasunari; Hiroyasu, Tomoyuki; Miki, Mitsunori

    Consumer Generated Media (CGM) is growing rapidly and the amount of content is increasing. However, it is often difficult for users to extract important contents, and the existence of contents recording their experiences can easily be forgotten. As there are no methods or systems to indicate the subjective value of contents or ways to reuse them, subjective annotation appending subjectivity, such as feelings and intentions, to contents is needed. Representation of subjectivity depends not only on verbal expression, but also on nonverbal expression. Linguistically expressed annotation, typified by collaborative tagging in social bookmarking systems, has come into widespread use, but there is no system of nonverbally expressed annotation on the web. We propose the utilization of controllable avatars as a means of nonverbal expression of subjectivity, and confirmed the consistency of feelings elicited by avatars over time for an individual and in a group. In addition, we compared the expressiveness and ease of subjective annotation between collaborative tagging and controllable avatars. The result indicates that the feelings evoked by avatars are consistent in both cases, and using controllable avatars is easier than collaborative tagging for representing feelings elicited by contents that do not express meaning, such as photos.

  20. Consistent vapour-liquid equilibrium data containing lipids

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Consistent physical and thermodynamic properties of pure components and their mixtures are important for process design, simulation, and optimization, as well as for the design of chemical-based products. In the case of lipids, a lack of experimental data was observed for pure compounds and also for their mixtures. Phase compositions were calculated using Wilson, NRTL, UNIQUAC, and original UNIFAC models together with bubble-point calculations. The relevance of enlarging the experimental databank of lipid systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model and by proposing new interaction parameters for the UNIFAC model and lipid systems. The PC-SAFT model was also analysed for lipids and a modification is proposed.

  1. 49 CFR 385.311 - What will the safety audit consist of?

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false What will the safety audit consist of? 385.311... SAFETY FITNESS PROCEDURES New Entrant Safety Assurance Program § 385.311 What will the safety audit consist of? The safety audit will consist of a review of the new entrant's safety management systems and a...

  2. Bootstrap embedding: An internally consistent fragment-based method

    Science.gov (United States)

    Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy

    2016-08-01

    Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments "embedded" in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed "Bootstrap Embedding," a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.

  3. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    Directory of Open Access Journals (Sweden)

    Tracy Waize

    2007-09-01

    Conclusions Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.

  4. Consistency in performance evaluation reports and medical records.

    Science.gov (United States)

    Lu, Mingshan; Ma, Ching-to Albert

    2002-12-01

    In the health care market, managed care has become the latest innovation for the delivery of services. For efficient implementation, the managed care organization relies on accurate information. So clinicians are often asked to report on patients before referrals are approved, treatments authorized, or insurance claims processed. What are clinicians' responses to solicitation for information by managed care organizations? The existing health literature has already pointed out the importance of provider gaming, sincere reporting, nudging, and dodging the rules. We assess the consistency of clinicians' reports on clients across administrative data and clinical records. For about 1,000 alcohol abuse treatment episodes, we compare clinicians' reports across two data sets. The first one, the Maine Addiction Treatment System (MATS), was an administrative data set; the state government used it for program performance monitoring and evaluation. The second was a set of medical record abstracts, taken directly from the clinical records of treatment episodes. A clinician's reporting practice exhibits an inconsistency if the information reported in MATS differs from the information reported in the medical record in a statistically significant way. We look for evidence of inconsistencies in five categories: admission alcohol use frequency, discharge alcohol use frequency, termination status, admission employment status, and discharge employment status. Chi-square tests, Kappa statistics, and sensitivity and specificity tests are used for hypothesis testing. Multiple imputation methods are employed to address the problem of missing values in the record abstract data set. For admission and discharge alcohol use frequency measures, we find, respectively, strong and supporting evidence for inconsistencies. We find equally strong evidence for consistency in reports of admission and discharge employment status, and mixed evidence on report consistency on termination status. Patterns of

  5. Factors affecting the consistent use of barrier methods of contraception.

    Science.gov (United States)

    Beckman, L J; Harvey, S M

    1996-09-01

    To discuss the major issues involved in the consistent and effective use of barrier methods of contraception. Major research and review articles on barrier methods published within the last 10 years were considered. One major source of articles was Family Planning Perspectives. This paper is a focused review and integration of recent literature rather than a comprehensive literature review. Only selected articles published since 1986 that are pertinent to the issues raised are included. All barrier methods have common characteristics that influence their patterns of use. The correct and consistent use of such methods is determined by the complex interaction of characteristics of the methods themselves, characteristics of users, and the situational context. Method characteristics include the extent of interference with sexual spontaneity and enjoyment, the amount of partner cooperation required, and the ability of the method to protect against human immunodeficiency virus and other sexually transmitted diseases. User characteristics include motivation to avoid unintended pregnancy, ability to plan, comfort with sexuality, and previous contraceptive use. Stage of sexual career, relationship characteristics, and physical and sexual abuse are important situational influences. Even though most barrier methods can be obtained without a prescription from a provider, clinicians have an extremely important role in promoting effective and consistent method use. Four major ways to improve the use of barrier methods currently available include: 1) improve method characteristics and the distribution systems; 2) change consumers' perceptions of method attributes; 3) train consumers to use methods correctly and overcome perceived negative characteristics of the methods; and 4) change values about the perceived importance of method characteristics. There also is an urgent need for the development of better barrier methods.

  6. Characterization of Consistent Completion of Reciprocal Comparison Matrices

    Directory of Open Access Journals (Sweden)

    Julio Benítez

    2014-01-01

    Full Text Available Analytic hierarchy process (AHP is a leading multi-attribute decision-aiding model that is designed to help make better choices when faced with complex decisions involving several dimensions. AHP, which enables qualitative analysis using a combination of subjective and objective information, is a multiple criteria decision analysis approach that uses hierarchical structured pairwise comparisons. One of the drawbacks of AHP is that a pairwise comparison cannot be completed by an actor or stakeholder not fully familiar with all the aspects of the problem. The authors have developed a completion based on a process of linearization that minimizes the matrix distance defined in terms of the Frobenius norm (a strictly convex minimization problem. In this paper, we characterize when an incomplete, positive, and reciprocal matrix can be completed to become a consistent matrix. We show that this characterization reduces the problem to the solution of a linear system of equations—a straightforward procedure. Various properties of such a completion are also developed using graph theory, including explicit calculation formulas. In real decision-making processes, facilitators conducting the study could use these characterizations to accept an incomplete comparison body given by an actor or to encourage the actor to further develop the comparison for the sake of consistency.
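    The linearization-based completion is specific to the cited paper; as a generic illustration of how consistency of a positive reciprocal comparison matrix is usually quantified in AHP, the sketch below computes Saaty's consistency index and ratio (the random-index table and the example matrix are standard textbook values, not taken from the article).

```python
import numpy as np

# Generic AHP-style consistency check (not the paper's linearization method):
# a positive reciprocal matrix A is perfectly consistent when a_ij * a_jk = a_ik
# for all i, j, k, in which case its principal eigenvalue equals n.  Saaty's
# consistency index/ratio measures the deviation from that ideal.

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}   # standard random indices

def consistency_ratio(A):
    n = A.shape[0]
    lam_max = max(np.linalg.eigvals(A).real)    # principal eigenvalue
    ci = (lam_max - n) / (n - 1)                # consistency index
    return ci / RI[n]                           # consistency ratio; < 0.1 is usually acceptable

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])                 # a reciprocal pairwise comparison matrix
print(consistency_ratio(A))
```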

  7. Self-consistent Simulation of Microparticle and Ion Wakefield Configuration

    Science.gov (United States)

    Sanford, Dustin; Brooks, Beau; Ellis, Naoki; Matthews, Lorin; Hyde, Truell

    2017-10-01

    In a complex plasma, positively charged ions often have a directed flow with respect to the negatively charged dust grains. The resulting interaction between the dust and the flowing plasma creates an ion wakefield downstream from the dust particles, with the resulting region of positive space charge modifying the interaction between the grains and contributing to the observed dynamics and equilibrium structure of the system. Here we present a proof-of-concept method that uses a molecular dynamics simulation to model the ion wakefield, allowing the dynamics of the dust particles to be determined self-consistently. The trajectory of each ion is calculated including the forces from all other ions, which are treated as ``Yukawa particles'' screened by thermal electrons, as well as the forces from the charged dust particles. Both the dust grain charge and the wakefield structure are also self-consistently determined for various particle configurations. The resultant wakefield potentials are then used to provide dynamic simulations of dust particle pairs. These results will be employed to analyze the formation and dynamics of field-aligned chains in CASPER's PK4 experiment onboard the International Space Station, allowing examination of extended dust chains without the masking force of gravity. This work was supported by the National Science Foundation under Grants PHY-1414523 and PHY-1740203.
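    The record does not give the interaction model explicitly; a minimal sketch of the screened (Yukawa) pair force that such ion molecular-dynamics treatments commonly assume is shown below. The function name and the numerical example are illustrative only and are not parameters of the CASPER/PK4 simulation.

```python
import numpy as np

# Illustrative Yukawa (screened Coulomb) pair force between two charges, as
# commonly assumed for ions shielded by thermal electrons in a dusty plasma:
# phi(r) = q2 / (4*pi*eps0*r) * exp(-r / lambda_D), force = -q1 * d(phi)/dr.

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

def yukawa_force(r_vec, q1, q2, lambda_d):
    """Force (N) on charge q1 located at displacement r_vec from charge q2."""
    r = np.linalg.norm(r_vec)
    prefac = q1 * q2 / (4 * np.pi * EPS0)
    # magnitude of -d/dr [exp(-r/lambda_d)/r], directed along r_vec
    mag = prefac * np.exp(-r / lambda_d) * (1.0 / r**2 + 1.0 / (r * lambda_d))
    return mag * r_vec / r

# Example: two singly charged ions 100 um apart, Debye length 50 um
e = 1.602176634e-19
print(yukawa_force(np.array([100e-6, 0.0, 0.0]), e, e, 50e-6))
```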

  8. ER=EPR, GHZ, and the consistency of quantum measurements

    Energy Technology Data Exchange (ETDEWEB)

    Susskind, Leonard [Stanford Institute for Theoretical Physics and Department of Physics, Stanford University, Stanford, CA (United States)

    2016-01-15

    This paper illustrates various aspects of the ER=EPR conjecture. It begins with a brief heuristic argument, using the Ryu-Takayanagi correspondence, for why entanglement between black holes implies the existence of Einstein-Rosen bridges. The main part of the paper addresses a fundamental question: Is ER=EPR consistent with the standard postulates of quantum mechanics? Naively it seems to lead to an inconsistency between observations made on entangled systems by different observers. The resolution of the paradox lies in the properties of multiple black holes, entangled in the Greenberger-Horne-Zeilinger pattern. The last part of the paper is about entanglement as a resource for quantum communication. ER=EPR provides a way to visualize protocols like quantum teleportation. In some sense teleportation takes place through the wormhole, but as usual, classical communication is necessary to complete the protocol. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  9. Self-consistent estimation of mislocated fixations during reading.

    Directory of Open Access Journals (Sweden)

    Ralf Engbert

    Full Text Available During reading, we generate saccadic eye movements to move words into the center of the visual field for word processing. However, due to systematic and random errors in the oculomotor system, distributions of within-word landing positions are rather broad and show overlapping tails, which suggests that a fraction of fixations is mislocated and falls on words to the left or right of the selected target word. Here we propose a new procedure for the self-consistent estimation of the likelihood of mislocated fixations in normal reading. Our approach is based on iterative computation of the proportions of several types of oculomotor errors, the underlying probabilities for word-targeting, and corrected distributions of landing positions. We found that the average fraction of mislocated fixations ranges from about 10% to more than 30% depending on word length. These results show that fixation probabilities are strongly affected by oculomotor errors.

  10. Structure and internal consistency of a shoulder model.

    Science.gov (United States)

    Högfors, C; Karlsson, D; Peterson, B

    1995-07-01

    A three-dimensional biomechanical model of the shoulder is developed for force predictions in 46 shoulder structures. The model is directed towards the analysis of static working situations where the load is low or moderate. Arbitrary static arm postures in the natural shoulder range may be considered, as well as different kinds of external loads including different force and moment directions. The model can predict internal forces for the shoulder muscles, for the glenohumeral, the acromioclavicular and the sternoclavicular joint as well as for the coracohumeral ligament. A solution to the statically indeterminate force system is obtained by minimising an objective function. The default function chosen for this is the sum of the squared muscle stresses, but other objective functions may be used as well. The structure of the model is described and its ingredients discussed. The internal consistency of the model, its structural stability and the compatibility of the elements that go into it, is investigated.
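    As a toy illustration of the load-sharing approach described above (minimising the sum of squared muscle stresses over a statically indeterminate set of muscles), the sketch below solves a three-muscle version with invented moment arms, cross-sections and joint moment; none of these values come from the actual shoulder model.

```python
import numpy as np
from scipy.optimize import minimize

# Toy load-sharing problem: distribute a required joint moment over redundant
# "muscles" by minimising the sum of squared muscle stresses, subject to
# moment equilibrium and non-negative (pulling-only) muscle forces.

moment_arms = np.array([0.02, 0.035, 0.015])    # m, one per muscle (invented)
areas       = np.array([4e-4, 6e-4, 2e-4])      # cross-sectional areas, m^2 (invented)
target_moment = 5.0                             # N*m to be produced about the joint

objective = lambda f: np.sum((f / areas) ** 2)              # sum of squared stresses
constraints = [{"type": "eq",
                "fun": lambda f: moment_arms @ f - target_moment}]
bounds = [(0.0, None)] * len(areas)                          # muscles can only pull

res = minimize(objective, x0=np.full(3, 50.0),
               bounds=bounds, constraints=constraints, method="SLSQP")
print(res.x)    # predicted muscle forces, N
```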

  11. Maintaining consistent quality and clinical performance of biopharmaceuticals.

    Science.gov (United States)

    Lamanna, William C; Holzmann, Johann; Cohen, Hillel P; Guo, Xinghua; Schweigler, Monika; Stangler, Thomas; Seidl, Andreas; Schiestl, Martin

    2018-01-10

    Biopharmaceuticals are large, protein-based drugs which are heterogeneous by nature due to post-translational modifications resulting from cellular production, processing and storage. Changes in the abundance of different variants over time are inherent to biopharmaceuticals due to their sensitivity to subtle process differences and the necessity for regular manufacturing changes. Product variability must thus be carefully controlled to ensure that it does not result in changes in safety or efficacy. Areas covered: The focus of this manuscript is to provide improved understanding of the science and strategies used to maintain the quality and clinical performance of biopharmaceuticals, including biosimilars, throughout their lifecycle. This review summarizes rare historical instances where clinically relevant changes have occurred, defined here as clinical drift, and discusses modern tools used to prevent such changes, including improved analytics, quality systems and regulatory frameworks. Expert opinion: Despite their size, complexity and heterogeneity, modern analytics, manufacturing quality systems and comparability requirements for the evaluation of manufacturing changes cumulatively help to ensure the consistent quality and clinical performance of biopharmaceuticals throughout their product lifecycle. Physicians and patients can expect the same safety and efficacy from biopharmaceuticals and their respective biosimilars irrespective of batch or production history.

  12. Characterization of consistent triggers of migraine with aura

    DEFF Research Database (Denmark)

    Hauge, Anne Werner; Kirchmann, Malene; Olesen, Jes

    2011-01-01

    The aim of the present study was to characterize perceived consistent triggers of migraine with aura (MA).

  13. Enterprise process consistency expressed by a formal description of transactions

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2005-01-01

    Using progressive information technology to develop software modules for enterprise information systems raises many practical and theoretical problems. One of them is the verification of results achieved in the life cycle stages. Object-oriented analysis occupies the central position in the object life cycle of information systems: it produces the fundamental diagrams that are processed in the design and implementation phases, including diagrams of enterprise processes and their refinement into enterprise transaction diagrams. The Unified Modeling Language (UML) is very often used for modelling enterprise processes and transactions. One very practical and theoretical problem concerning enterprise processes is their internal consistency, which can be observed at the level of object transactions; in particular, transaction feasibility and object cooperation feasibility in transactions are strongly bound to the states of objects. Testing complex object transactions before they are programmed therefore appears to be a very useful activity. This article introduces a formal description of the object cooperation logic, defining not only elementary transaction feasibility but also elementary object cooperation feasibility. This makes it possible to examine the feasibility of strings of elementary transactions and elementary object collaborations; one string of elementary transactions is usually regarded as a path. Two different systems of state logical equations are established: the first describes path transaction feasibility and the second path object cooperation feasibility. The functional correctness of any complex transaction is founded on the functional correctness of all its paths.

  14. Personality consistency analysis in cloned quarantine dog candidates

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2017-01-01

    In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups using the modified Puppy Aptitude Test, which consists of ten subtests to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and that some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned dogs. This study implies that personality consistency could be one of the ways to analyse traits of puppies.

  15. 48 CFR 52.230-3 - Disclosure and Consistency of Cost Accounting Practices.

    Science.gov (United States)

    2010-10-01

    ... of Cost Accounting Practices. 52.230-3 Section 52.230-3 Federal Acquisition Regulations System... Text of Provisions and Clauses 52.230-3 Disclosure and Consistency of Cost Accounting Practices. As prescribed in 30.201-4(b)(1), insert the following clause: Disclosure and Consistency of Cost Accounting...

  16. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang

    2017-03-06

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
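    The sketch below is not the ez-Segway protocol itself; it only illustrates the classic ordering idea that such consistent-update mechanisms build on, namely installing new-path rules from the destination backwards before flipping the ingress, so that no packet reaches a switch without a matching rule. The switch names and the helper function are hypothetical.

```python
# Generic sketch of black-hole-free path migration (not the ez-Segway
# algorithm): pre-install the new path downstream-first, flip the ingress
# last, then garbage-collect rules on switches no longer used.

def plan_update(old_path, new_path):
    """Return an ordered list of (switch, action) steps for one flow."""
    steps = []
    # 1. pre-install the new next-hop on every new-path switch, last hop first
    for i in range(len(new_path) - 2, 0, -1):
        steps.append((new_path[i], f"install next-hop {new_path[i + 1]}"))
    # 2. atomically switch the ingress over to the new path
    steps.append((new_path[0], f"switch ingress to {new_path[1]}"))
    # 3. remove stale rules on switches that are no longer on the path
    for sw in old_path[1:-1]:
        if sw not in new_path:
            steps.append((sw, "remove old rule"))
    return steps

print(plan_update(old_path=["s1", "s2", "s3", "s5"],
                  new_path=["s1", "s4", "s5"]))
```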

  17. MOPED enables discoveries through consistently processed proteomics data.

    Science.gov (United States)

    Higdon, Roger; Stewart, Elizabeth; Stanberry, Larissa; Haynes, Winston; Choiniere, John; Montague, Elizabeth; Anderson, Nathaniel; Yandl, Gregory; Janko, Imre; Broomall, William; Fishilevich, Simon; Lancet, Doron; Kolker, Natali; Kolker, Eugene

    2014-01-03

    The Model Organism Protein Expression Database (MOPED, http://moped.proteinspire.org) is an expanding proteomics resource to enable biological and biomedical discoveries. MOPED aggregates simple, standardized and consistently processed summaries of protein expression and metadata from proteomics (mass spectrometry) experiments from human and model organisms (mouse, worm, and yeast). The latest version of MOPED adds new estimates of protein abundance and concentration as well as relative (differential) expression data. MOPED provides a new updated query interface that allows users to explore information by organism, tissue, localization, condition, experiment, or keyword. MOPED supports the Human Proteome Project's efforts to generate chromosome- and disease-specific proteomes by providing links from proteins to chromosome and disease information as well as many complementary resources. MOPED supports a new omics metadata checklist to harmonize data integration, analysis, and use. MOPED's development is driven by the user community, which spans 90 countries and guides future development that will transform MOPED into a multiomics resource. MOPED encourages users to submit data in a simple format. They can use the metadata checklist to generate a data publication for this submission. As a result, MOPED will provide even greater insights into complex biological processes and systems and enable deeper and more comprehensive biological and biomedical discoveries.

  18. Police Districts, ZonePoly-The data set is a polygon feature consisting of 25 zones representing Spillman CAD patrol zones. It was created to maintain the Spillman computer aided dispatch system for the sheriff's office., Published in 2004, Davis County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Police Districts dataset current as of 2004. ZonePoly-The data set is a polygon feature consisting of 25 zones representing Spillman CAD patrol zones. It was created...

  19. Police Districts, CommonPlaces-The data set is a point feature consisting of 830 common place points representing Spillman CAD common places. It was created to maintain the Spillman computer aided dispatch system for the sheriff's office., Published in 2004, Davis County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Police Districts dataset current as of 2004. CommonPlaces-The data set is a point feature consisting of 830 common place points representing Spillman CAD common...

  20. PLSS Townships and Sections, This data set consists of a set of diagrams containing control survey information for the Southeastern Wisconsin Region. The information on the Control Survey Summary Diagrams is compiled from the records of U.S. Public Land Survey System (USPLSS) survey, Published in Not Provided, 1:2400 (1in=200ft) scale, Racine County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — PLSS Townships and Sections dataset current as of unknown. This data set consists of a set of diagrams containing control survey information for the Southeastern...

  1. Antibiotics for respiratory, ear and urinary tract disorders and consistency among GPs.

    NARCIS (Netherlands)

    Ong, D.S.Y.; Kuyvenhoven, M.M.; Dijk, L. van; Verheij, T.J.M.

    2008-01-01

    Objectives: To describe specific diagnoses for which systemic antibiotics are prescribed, to assess adherence of antibiotic choice to national guidelines and to assess consistency among general practitioners (GPs) in prescribed volumes of antibiotics for respiratory, ear and urinary tract disorders.

  2. Consistency in use through model based user interface development

    OpenAIRE

    Trapp, M.; Schmettow, M.

    2006-01-01

    In the dynamic environments envisioned under the concept of Ambient Intelligence, the consistency of user interfaces is of particular importance. To address this, the variability of the environment has to be transformed into a coherent user experience. In this paper we explain several dimensions of consistency and present our ideas and recent results on achieving adaptive and consistent user interfaces by exploiting the technology of model-driven user interface development.

  3. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.

  4. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This problem arose originally from the geneticists' need to filter their input data from erroneous information, and is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even in the presence of three alleles. Several other results...
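    The NP-completeness result concerns the joint problem over a whole pedigree with untyped individuals; the elementary building block, checking a single parent-child trio at one autosomal locus, is straightforward and is sketched below (the function name and genotypes are purely illustrative).

```python
# One parent-child trio at a single autosomal locus: is the child's genotype
# obtainable from one allele of each parent?  The hard (NP-complete) problem
# discussed above is the joint version over an entire pedigree in which many
# individuals are untyped; this is only the local check.

def trio_consistent(child, mother, father):
    """Genotypes are unordered allele pairs, e.g. ('A', 'B')."""
    return any({child[0], child[1]} == {m, f}
               for m in mother for f in father)

print(trio_consistent(("A", "B"), ("A", "A"), ("B", "C")))  # True: A from mother, B from father
print(trio_consistent(("C", "C"), ("A", "A"), ("B", "C")))  # False: mother carries no C allele
```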

  5. Evaluating the hydrological consistency of satellite based water cycle components

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2016-06-15

    Advances in multi-satellite based observations of the earth system have provided the capacity to retrieve information across a wide range of land surface hydrological components and provided an opportunity to characterize terrestrial processes from a completely new perspective. Given the spatial advantage that space-based observations offer, several regional-to-global scale products have been developed, offering insights into the multi-scale behaviour and variability of hydrological states and fluxes. However, one of the key challenges in the use of satellite-based products is characterizing the degree to which they provide realistic and representative estimates of the underlying retrieval: that is, how accurate are the hydrological components derived from satellite observations? The challenge is intrinsically linked to issues of scale, since the availability of high-quality in-situ data is limited, and even where it does exist, is generally not commensurate to the resolution of the satellite observation. Basin-scale studies have shown considerable variability in achieving water budget closure with any degree of accuracy using satellite estimates of the water cycle. In order to assess the suitability of this type of approach for evaluating hydrological observations, it makes sense to first test it over environments with restricted hydrological inputs, before applying it to more hydrologically complex basins. Here we explore the concept of hydrological consistency, i.e. the physical considerations that the water budget imposes on the hydrologic fluxes and states to be temporally and spatially linked, to evaluate the reproduction of a set of large-scale evaporation (E) products by using a combination of satellite rainfall (P) and Gravity Recovery and Climate Experiment (GRACE) observations of storage change, focusing on arid and semi-arid environments, where the hydrological flows can be more realistically described. Our results indicate no persistent hydrological
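    A minimal sketch of the water-budget closure idea behind this notion of hydrological consistency is given below: over a basin with negligible runoff, precipitation minus evaporation should track the GRACE-observed storage change. All monthly values are invented for illustration and do not come from the study.

```python
import numpy as np

# Rough water-budget consistency check: with runoff neglected (a reasonable
# simplification in the arid settings described above), P - E - dS should be
# close to zero each month.  Values below are invented monthly totals in mm.

P  = np.array([12.0,  5.0,  0.0,  1.0, 30.0, 22.0])   # satellite rainfall
E  = np.array([10.0,  8.0,  4.0,  3.0, 18.0, 20.0])   # evaporation product under test
dS = np.array([ 1.0, -4.0, -5.0, -2.0, 10.0,  3.0])   # GRACE storage change

residual = P - E - dS
print("monthly residual (mm):", residual)
print("mean absolute residual (mm):", np.mean(np.abs(residual)))
```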

  6. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787

  7. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which

  8. Numerical consistency check between two approaches to radiative ...

    Indian Academy of Sciences (India)

    We briefly outline the two popular approaches on radiative corrections to neutrino masses and mixing angles, and then carry out a detailed numerical analysis for a consistency check between them in MSSM. We find that the two approaches are nearly consistent with a discrepancy factor of 4.2% with running vacuum ...

  9. Delimiting Coefficient α from Internal Consistency and Unidimensionality

    Science.gov (United States)

    Sijtsma, Klaas

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…

  10. Consistency of the Babinski reflex and its variants.

    Science.gov (United States)

    Singerman, J; Lee, L

    2008-09-01

    The Babinski Reflex, first described in 1896, is still an integral part of the neurological examination. Many have studied the consistency of this reflex, but none have compared the inter- and intra-observer consistency of the Babinski reflex and its variants. Thirty-four subjects were examined by six neurologists. The Babinski, Gordon, Chaddock, and Oppenheim reflexes were tested, and each neurologist concluded if the plantar response was flexor or extensor. Six subjects were re-tested 1 week later to determine intra-observer consistency. The Babinski reflex had the highest interobserver consistency with a kappa value of 0.5491. The Chaddock, Oppenheim, and Gordon reflexes had kappa values of 0.4065, 0.3739, and 0.3515, respectively. For intra-observer consistency, Gordon was the most consistent with a kappa value of 0.6731. When reflexes were combined in pairs, the Babinski and Chaddock reflexes together were the most reliable. The Babinski reflex was shown to be the most consistent between examiners. The Gordon reflex had the highest intra-observer consistency; however, the small sample size should limit conclusions drawn from this calculation. Clinicians often utilize more than one reflex to examine the plantar response; the combination of the Babinski and Chaddock reflexes was the most reliable.

  11. A comparison of consistency and taste of five commercial thickeners.

    Science.gov (United States)

    Pelletier, C A

    1997-01-01

    This study presents the results of a blinded test of the performance of five commercial thickeners. Experimental variables considered are brand of commercial thickener, type of liquid, desired thickness, and thickening time. Success of outcome is defined by a numerical rating scale comparing the consistency and taste to actual liquid samples. The findings suggest that no one commercial thickener consistently produced a desired consistency or was consistently superior in taste. Success in producing certain liquid consistencies and "good taste" varied according to brand, type of liquid, desired thickness, and thickening time used. It is suggested that specific recipes be developed for each brand and liquid to be thickened. Flavorings should be tested to enhance taste.

  12. Personality and Situation Predictors of Consistent Eating Patterns.

    Directory of Open Access Journals (Sweden)

    Uku Vainik

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  13. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Science.gov (United States)

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  14. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Directory of Open Access Journals (Sweden)

    Alexander J Kirkham

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  15. Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.

    Science.gov (United States)

    Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H

    2016-01-01

    To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included external supports of parental closeness and relationship with caring nonparental adult and internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published

  16. The SPH consistency problem and some astrophysical applications

    Science.gov (United States)

    Klapp, Jaime; Sigalotti, Leonardo; Rendon, Otto; Gabbasov, Ruslan; Torres, Ayax

    2017-11-01

    We discuss the SPH kernel and particle consistency problem and demonstrate that SPH has a limiting second-order convergence rate. We also present a solution to the SPH consistency problem. We present examples of how SPH implementations that are not mathematically consistent may lead to erroneous results. The new formalism has been implemented into the Gadget 2 code, including an improved scheme for the artificial viscosity. We present results for the ``Standard Isothermal Test Case'' of gravitational collapse and fragmentation of protostellar molecular cores that produce a very different evolution than with the standard SPH theory. A further application of accretion onto a black hole is presented.
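    As a small illustration of what kernel/particle consistency means in practice, the sketch below checks the zeroth-order condition (partition of unity) for a 1-D cubic spline kernel on a uniform particle distribution; the setup is generic and is not the formulation implemented in the modified Gadget 2 code.

```python
import numpy as np

# Minimal check of zeroth-order particle consistency ("partition of unity")
# in 1-D SPH: sum_j W(x_i - x_j, h) * dx should be close to 1 at interior
# particles.  Uses the standard cubic spline kernel on a uniform distribution.

def cubic_spline_1d(r, h):
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)                        # 1-D normalisation constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

n, h = 200, 0.03
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
i = n // 2                                          # an interior particle
unity = np.sum(cubic_spline_1d(x[i] - x, h) * dx)   # discrete kernel sum at particle i
print("partition of unity at interior particle:", unity)   # ~1.0
```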

  17. The Consistent Preferences Approach to Deductive Reasoning in Games

    CERN Document Server

    Asheim, Geir B

    2006-01-01

    "The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif

  18. NTU-Bankruptcy Problems: Consistency and the Relative Adjustment Principle

    NARCIS (Netherlands)

    Dietzenbacher, B.; Borm, Peter; Estevez Fernandez, M.A.

    2017-01-01

    This paper axiomatically studies bankruptcy problems with nontransferable utility by adequately generalizing and analyzing properties for bankruptcy rules. In particular, we discuss several consistency notions and introduce the class of parametric bankruptcy rules. Moreover, we introduce the class

  19. Island of Stability for Consistent Deformations of Einstein's Gravity

    DEFF Research Database (Denmark)

    Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan

    2012-01-01

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...

  20. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
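    A tiny simulation in the spirit of the issue discussed above is sketched below: the generated series contains two positive trend (slope) breaks, but a single-break model is fitted by least squares, and the estimated break date need not coincide with either true break. The design and all numbers are arbitrary and are not the paper's Monte Carlo setup.

```python
import numpy as np

# Underspecified break number: the data have two slope breaks, but we fit a
# model with only one break and choose the break date by least squares.

rng = np.random.default_rng(0)
T = 300
t = np.arange(T)
true_breaks = (100, 200)
slope = np.where(t < true_breaks[0], 0.00,
         np.where(t < true_breaks[1], 0.05, 0.10))   # two positive slope shifts
y = np.cumsum(slope) + rng.normal(0, 1.0, T)

def ssr_single_break(k):
    # regressors: constant, linear trend, and a post-break trend shift at date k
    X = np.column_stack([np.ones(T), t, np.maximum(t - k, 0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

candidates = range(20, T - 20)
k_hat = min(candidates, key=ssr_single_break)
print("true breaks:", true_breaks, " single-break estimate:", k_hat)
```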

  1. NTU-Bankruptcy Problems : Consistency and the Relative Adjustment Principle

    NARCIS (Netherlands)

    Dietzenbacher, Bas; Borm, Peter; Estevez Fernandez, M.A.

    2017-01-01

    This paper axiomatically studies bankruptcy problems with nontransferable utility by adequately generalizing and analyzing properties for bankruptcy rules. In particular, we discuss several consistency notions and introduce the class of parametric bankruptcy rules. Moreover, we introduce the class

  2. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our resu...

  3. A new insight into the consistency of smoothed particle hydrodynamics

    CERN Document Server

    Sigalotti, Leonardo Di G; Klapp, Jaime; Vargas, Carlos A; Campos, Kilver

    2016-01-01

    In this paper the problem of consistency of smoothed particle hydrodynamics (SPH) is solved. A novel error analysis is developed in $n$-dimensional space using the Poisson summation formula, which enables the treatment of the kernel and particle approximation errors in combined fashion. New consistency integral relations are derived for the particle approximation which correspond to the cosine Fourier transform of the classically known consistency conditions for the kernel approximation. The functional dependence of the error bounds on the SPH interpolation parameters, namely the smoothing length $h$ and the number of particles within the kernel support ${\\cal{N}}$ is demonstrated explicitly from which consistency conditions are seen to follow naturally. As ${\\cal{N}}\\to\\infty$, the particle approximation converges to the kernel approximation independently of $h$ provided that the particle mass scales with $h$ as $m\\propto h^{\\beta}$, with $\\beta >n$. This implies that as $h\\to 0$, the joint limit $m\\to 0$, $...

  4. Dynamic Consistency in Process Algebra: From Paradigm to ACP

    NARCIS (Netherlands)

    S. Andova; L.P.J. Groenewegen; E.P. de Vink (Erik Peter)

    2011-01-01

    The coordination modelling language Paradigm addresses collaboration between components in terms of dynamic constraints. Within a Paradigm model, component dynamics are consistently specified at various levels of abstraction. The operational semantics of Paradigm is given. For a

  5. Consistent Yokoya-Chen Approximation to Beamstrahlung(LCC-0010)

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, M

    2004-04-22

    I reconsider the Yokoya-Chen approximate evolution equation for beamstrahlung and modify it slightly to generate simple, consistent analytical approximations for the electron and photon energy spectra. I compare these approximations to previous ones, and to simulation data.

  6. All consistent interactions for exterior form gauge fields

    OpenAIRE

    Henneaux, Marc; Knaepen, Bernard

    1997-01-01

    We give the complete list of all first-order consistent interaction vertices for a set of exterior form gauge fields of form degree >1, described in the free limit by the standard Maxwell-like action. Special attention is paid to the interactions that deform the gauge transformations. These are shown to be necessarily of the Noether form "conserved antisymmetric tensor" times "p-form potential" and exist only in particular spacetime dimensions. Conditions for consistency to all orders in th...

  7. Articulated object tracking by rendering consistent appearance parts

    OpenAIRE

    Pezzementi, Z.; Voros, Sandrine; Hager, Gregory D.

    2009-01-01

    We describe a general methodology for tracking 3-dimensional objects in monocular and stereo video that makes use of GPU-accelerated filtering and rendering in combination with machine learning techniques. The method operates on targets consisting of kinematic chains with known geometry. The tracked target is divided into one or more areas of consistent appearance. The appearance of each area is represented by a classifier trained to assign a class-conditional probabil...

  8. Consistent description of kinetics and hydrodynamics of dusty plasma

    Energy Technology Data Exchange (ETDEWEB)

    Markiv, B. [Institute for Condensed Matter Physics of the National Academy of Sciences of Ukraine, 1 Svientsitskii St., 79011 Lviv (Ukraine); Tokarchuk, M. [Institute for Condensed Matter Physics of the National Academy of Sciences of Ukraine, 1 Svientsitskii St., 79011 Lviv (Ukraine); National University “Lviv Polytechnic,” 12 Bandera St., 79013 Lviv (Ukraine)

    2014-02-15

    A consistent statistical description of kinetics and hydrodynamics of dusty plasma is proposed based on the Zubarev nonequilibrium statistical operator method. For the case of partial dynamics, the nonequilibrium statistical operator and the generalized transport equations for a consistent description of kinetics of dust particles and hydrodynamics of electrons, ions, and neutral atoms are obtained. In the approximation of a weakly nonequilibrium process, a spectrum of collective excitations of dusty plasma is investigated in the hydrodynamic limit.

  9. Behavioural consistency and life history of Rana dalmatina tadpoles.

    Science.gov (United States)

    Urszán, Tamás János; Török, János; Hettyey, Attila; Garamszegi, László Zsolt; Herczeg, Gábor

    2015-05-01

    The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, we tested if behavioural consistency and POLS could be detected during the early ontogenesis of this amphibian. We targeted two ontogenetic stages and measured activity, exploration and risk-taking in a common garden experiment, assessing both individual behavioural type and intra-individual behavioural variation. We observed that activity was consistent in all tadpoles, exploration only became consistent with advancing age and risk-taking only became consistent in tadpoles that had been tested, and thus disturbed, earlier. Only previously tested tadpoles showed trends indicative of behavioural syndromes. We found an activity-age at metamorphosis POLS in the previously untested tadpoles irrespective of age. Relative growth rate correlated positively with the intra-individual variation of activity of the previously untested older tadpoles. In previously tested older tadpoles, intra-individual variation of exploration correlated negatively and intra-individual variation of risk-taking correlated positively with relative growth rate. We provide evidence for behavioural consistency and POLS in predator- and conspecific-naive tadpoles. Intra-individual behavioural variation was also correlated to life history, suggesting its relevance for the POLS theory. The strong effect of moderate disturbance related to standard behavioural testing on later behaviour draws attention to the pitfalls embedded in repeated testing.

  10. S Matrix Proof of Consistency Condition Derived from Mixed Anomaly

    Science.gov (United States)

    Bhansali, Vineer

    For a confining quantum field theory with conserved current J and stress tensor T, the ⟨JJJ⟩ and ⟨TTJ⟩ anomalies computed in terms of elementary quanta must be precisely equal to the same anomalies computed in terms of the exact physical spectrum if the conservation law corresponding to J is unbroken. These strongly constrain the allowed representations of the low energy spectrum. We present a proof of the latter consistency condition based on the proof by Coleman and Grossman of the former consistency condition.

  11. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.

  12. Self-consistency in Bicultural Persons: Dialectical Self-beliefs Mediate the Relation between Identity Integration and Self-consistency

    Science.gov (United States)

    Zhang, Rui; Noels, Kimberly A.; Lalonde, Richard N.; Salas, S. J.

    2017-01-01

    Prior research differentiates dialectical (e.g., East Asian) from non-dialectical cultures (e.g., North American and Latino) and attributes cultural differences in self-concept consistency to naïve dialecticism. In this research, we explored the effects of managing two cultural identities on consistency within the bicultural self-concept via the role of dialectical beliefs. Because the challenge of integrating more than one culture within the self is common to biculturals of various heritage backgrounds, the effects of bicultural identity integration should not depend on whether the heritage culture is dialectical or not. In four studies across diverse groups of bicultural Canadians, we showed that having an integrated bicultural identity was associated with being more consistent across roles (Studies 1–3) and making less ambiguous self-evaluations (Study 4). Furthermore, dialectical self-beliefs mediated the effect of bicultural identity integration on self-consistency (Studies 2–4). Finally, Latino biculturals reported being more consistent across roles than did East Asian biculturals (Study 2), revealing the ethnic heritage difference between the two groups. We conclude that both the content of heritage culture and the process of integrating cultural identities influence the extent of self-consistency among biculturals. Thus, consistency within the bicultural self-concept can be understood, in part, to be a unique psychological product of bicultural experience. PMID:28326052

  13. Near-resonant absorption in the time-dependent self-consistent field and multiconfigurational self-consistent field approximations

    DEFF Research Database (Denmark)

    Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa

    2001-01-01

    Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown...

  14. Martial arts striking hand peak acceleration, accuracy and consistency.

    Science.gov (United States)

    Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A

    2013-01-01

    The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
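    The two outcome measures defined above translate directly into code; the sketch below computes accuracy (distance from the strike centroid to the target) and consistency (root mean squared distance of strikes from their centroid) from hypothetical 2-D impact coordinates, which are invented for illustration.

```python
import numpy as np

# Direct implementation of the two measures defined in the abstract:
# accuracy    = radial distance between the centroid of the strikes and the target
# consistency = sqrt(mean squared distance of the strikes from their centroid)

strikes = np.array([[1.2, -0.5], [0.8, 0.3], [1.5, 0.1], [0.9, -0.2],
                    [1.1, 0.4], [1.3, -0.1], [0.7, 0.0], [1.0, 0.2],
                    [1.4, -0.3], [0.6, 0.1], [1.2, 0.0], [1.0, -0.4]])   # cm, invented
target = np.array([0.0, 0.0])

centroid = strikes.mean(axis=0)
accuracy = np.linalg.norm(centroid - target)                         # lower is better
consistency = np.sqrt(np.mean(np.sum((strikes - centroid) ** 2, axis=1)))
print(f"accuracy = {accuracy:.2f} cm, consistency = {consistency:.2f} cm")
```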

  15. Cognitive consistency and math-gender stereotypes in Singaporean children.

    Science.gov (United States)

    Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu

    2014-01-01

    In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Consistency of response and image recognition, pulmonary nodules.

    Science.gov (United States)

    Haygood, T M; Liu, M A Q; Galvan, E; Bassett, R; Murphy, W A; Ng, C S; Matamoros, A; Marom, E M

    2014-06-01

    To investigate the effect of recognition of a previously encountered radiograph on consistency of response in localizing pulmonary nodules. 13 radiologists interpreted 40 radiographs each to locate pulmonary nodules. A few days later, they again interpreted 40 radiographs. Half of the images in the second set were new. We asked the radiologists whether each image had been in the first set. We used Fisher's exact test and Kruskal-Wallis test to evaluate the correlation between recognition of an image and consistency in its interpretation. We evaluated the data using all possible recognition levels (definitely, probably, or possibly included vs definitely, probably, or possibly not included), by collapsing the recognition levels into two, and by eliminating the "possibly included" and "possibly not included" scores. With all but one of six methods of looking at the data, there was no significant correlation between consistency in interpretation and recognition of the image. When the possibly included and possibly not included scores were eliminated, there was a borderline statistical significance (p = 0.04) with slightly greater consistency in interpretation of recognized than that of non-recognized images. We found no convincing evidence that radiologists' recognition of images in an observer performance study affects their interpretation on a second encounter. Conscious recognition of chest radiographs did not result in a greater degree of consistency in the tested interpretation than that in the interpretation of images that were not recognized.

  17. Gravitationally Consistent Halo Catalogs and Merger Trees for Precision Cosmology

    Science.gov (United States)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  18. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash the recovery of the blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  19. Self-consistent hybrid functionals for solids: a fully-automated implementation

    Science.gov (United States)

    Erba, A.

    2017-08-01

    A fully automated algorithm for determining the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated proportionally to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89, 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
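
    The update rule described above (exchange fraction proportional to the inverse of the dielectric response) amounts to a simple fixed-point iteration. The sketch below is only a schematic of that loop; dielectric_constant stands in for the expensive SCF + CPHF/KS step and is a hypothetical callable, not part of the Crystal code.

```python
def self_consistent_alpha(dielectric_constant, alpha0=0.25, tol=1e-4, max_iter=50):
    """Fixed-point iteration alpha_{i+1} = 1 / eps(alpha_i).

    dielectric_constant : callable returning the static dielectric constant
        obtained with a hybrid functional whose exact-exchange fraction is
        alpha (in the real code this is an SCF + CPHF/KS calculation).
    """
    alpha = alpha0
    for _ in range(max_iter):
        eps = dielectric_constant(alpha)   # expensive step; previous densities
                                           # can be reused as the starting guess
        alpha_new = 1.0 / eps              # update rule of the self-consistent hybrid
        if abs(alpha_new - alpha) < tol:
            return alpha_new
        alpha = alpha_new
    return alpha
```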

  20. The Shakespeare Project: Experiments in Multimedia.

    Science.gov (United States)

    Friedlander, Larry

    1991-01-01

    Describes the Shakespeare Project, a multimedia system on HyperCard with a two-screen workstation linking a Macintosh, videodisc player, and video monitor. States that this project brings theater to students as a serious object of study. (MG)

  1. Context-dependent individual behavioral consistency in Daphnia

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe

    2017-01-01

    The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals......, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots, but instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time...

  2. Self-Consistent Green’s Function Approaches

    Science.gov (United States)

    Barbieri, Carlo; Carbone, Arianna

    We present the fundamental techniques and working equations of many-body Green's function theory for calculating ground state properties and the spectral strength. Green's function methods closely relate to other polynomial scaling approaches discussed in Chaps. 8 and 10. However, here we aim directly at a global view of the many-fermion structure. We derive the working equations for calculating many-body propagators, using both the Algebraic Diagrammatic Construction technique and the self-consistent formalism at finite temperature. Their implementation is discussed, as well as the inclusion of three-nucleon interactions. The self-consistency feature is essential to guarantee thermodynamic consistency. The pairing and neutron matter models introduced in previous chapters are solved and compared with the other methods in this book.

  3. Family socioeconomic status and consistent environmental stimulation in early childhood.

    Science.gov (United States)

    Crosnoe, Robert; Leventhal, Tama; Wirth, R J; Pierce, Kim M; Pianta, Robert C

    2010-01-01

    The transition into school occurs at the intersection of multiple environmental settings. This study applied growth curve modeling to a sample of 1,364 American children, followed from birth through age 6, who had been categorized by their exposure to cognitive stimulation at home and in preschool child care and 1st-grade classrooms. Of special interest was the unique and combined contribution to early learning of these 3 settings. Net of socioeconomic selection into different settings, children had higher math achievement when they were consistently stimulated in all 3, and they had higher reading achievement when consistently stimulated at home and in child care. The observed benefits of consistent environmental stimulation tended to be more pronounced for low-income children.

  4. Consistent forcing scheme in the cascaded lattice Boltzmann method

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.

  5. Self-organization of a hybrid nanostructure consisting of a nanoneedle and nanodot.

    Science.gov (United States)

    Liu, Hai; Wu, Junsheng; Wang, Ying; Chow, Chee Lap; Liu, Qing; Gan, Chee Lip; Tang, Xiaohong; Rawat, Rajdeep Singh; Tan, Ooi Kiang; Ma, Jan; Huang, Yizhong

    2012-09-24

    A special materials system that allows the self-organization of a unique hybrid nanonipple structure is developed. The system consists of a nanoneedle with a small nanodot sitting on top. Such hybrid nanonipples provide building blocks to assemble functional devices with significantly improved performance. The application of the system to high-sensitivity gas sensors is also demonstrated. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Modeling electrokinetic flows by consistent implicit incompressible smoothed particle hydrodynamics

    Science.gov (United States)

    Pan, Wenxiao; Kim, Kyungjoo; Perego, Mauro; Tartakovsky, Alexandre M.; Parks, Michael L.

    2017-04-01

    We present a consistent implicit incompressible smoothed particle hydrodynamics (I2SPH) discretization of Navier-Stokes, Poisson-Boltzmann, and advection-diffusion equations subject to Dirichlet or Robin boundary conditions. It is applied to model various two and three dimensional electrokinetic flows in simple or complex geometries. The accuracy and convergence of the consistent I2SPH are examined via comparison with analytical solutions, grid-based numerical solutions, or empirical models. The new method provides a framework to explore broader applications of SPH in microfluidics and complex fluids with charged objects, such as colloids and biomolecules, in arbitrary complex geometries.

  7. A Van Atta reflector consisting of half-wave dipoles

    DEFF Research Database (Denmark)

    Appel-Hansen, Jørgen

    1966-01-01

    The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations...... of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...

  8. Promotion of the scissors motion by consistent interactions

    Energy Technology Data Exchange (ETDEWEB)

    Nojarov, R. (Inst. fuer Theoretische Physik, Univ. Tuebingen (Germany)); Faessler, A. (Inst. fuer Theoretische Physik, Univ. Tuebingen (Germany))

    1994-05-16

    A relationship is found between the quantized isovector rotor and the microscopic RPA formalism. The scissors restoring force is obtained from a sum-rule approach. It is not influenced significantly by symmetry-restoring interactions, derived consistently from the deformed potential. The resulting neutron-proton interaction generates a scissors restoring force, which almost coincides with that extracted by the relative angular momentum from the deformed mean field alone. Such consistent interactions do not favour much orbital M1 strength at high energy and the scissors mode fragments mainly over the low-lying orbital 1⁺ excitations. (orig.)

  9. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    , to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics...... addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance...

  10. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....

  11. Weyl consistency conditions in non-relativistic quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California,San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)

    2016-12-05

    Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z is discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.

  12. The cluster bootstrap consistency in generalized estimating equations

    KAUST Repository

    Cheng, Guang

    2013-03-01

    The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
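
    The resampling scheme can be sketched independently of any particular GEE implementation; fit_gee below is a hypothetical callable that refits the model and returns the coefficient estimates for a given subset of rows, and the percentile intervals are one common way to summarize the bootstrap distribution.

```python
import numpy as np

def cluster_bootstrap(data, cluster_ids, fit_gee, n_boot=1000, seed=0):
    """Resample whole clusters with replacement and refit the model each time.

    data        : array-like with one row per observation.
    cluster_ids : array of cluster/subject labels, one per row of data.
    fit_gee     : callable mapping a data subset to a vector of estimates.
    """
    rng = np.random.default_rng(seed)
    clusters = np.unique(cluster_ids)
    estimates = []
    for _ in range(n_boot):
        sampled = rng.choice(clusters, size=len(clusters), replace=True)
        # Concatenate the rows of every sampled cluster (a cluster drawn twice
        # contributes two copies), preserving the within-cluster dependence.
        rows = np.concatenate([np.flatnonzero(cluster_ids == c) for c in sampled])
        estimates.append(fit_gee(data[rows]))
    estimates = np.asarray(estimates)
    # Percentile confidence intervals from the bootstrap distribution.
    return np.percentile(estimates, [2.5, 97.5], axis=0)
```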

  13. Towards consistent nuclear models and comprehensive nuclear data evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, O. (Los Alamos National Laboratory); Hale, G. M. (Los Alamos National Laboratory); Lynn, J. E. (Los Alamos National Laboratory); Talou, P. (Los Alamos National Laboratory); Bernard, D. (France); Litaize, O. (France); Noguere, G. (France); De Saint Jean, C. (France); Serot, O. (France)

    2010-01-01

    The essence of this paper is to enlighten the consistency achieved nowadays in nuclear data and uncertainties assessments in terms of compound nucleus reaction theory from neutron separation energy to continuum. Making the continuity of theories used in the resolved resonance (R-matrix theory), unresolved resonance (average R-matrix theory) and continuum (optical model) ranges by the generalization of the so-called SPRT method, consistent average parameters are extracted from observed measurements and associated covariances are therefore calculated over the whole energy range. This paper recalls, in particular, recent advances on fission cross section calculations and suggests some hints for future developments.

  14. Prevalence and factors influencing consistent condom use among ...

    African Journals Online (AJOL)

    Conclusions: The majority of youths are sexually active, with few of them not consistently using condoms. Parents should be informed about the sexual behavior of the youth in order to discuss safer sexual behavior with their youths while encouraging abstinence or postponement of sexual activity. Youth peer education needs to ...

  15. Consistency Checking for Euclidean Spatial Constraints: A Dimension Graph Approach

    Science.gov (United States)

    2000-07-11

    E. A. Boiten, J. Derrick, H. Bowman, and M. W. A. Steen, Constructive Consistency Checking for Partial Specification in Z, Science of Computer Programming; T. Cormen, C. Leiserson, and R. Rivest, Introduction to Algorithms, The MIT Press.

  16. Consistent measurements comparing the drift features of noble gas mixtures

    CERN Document Server

    Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y

    1999-01-01

    We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.

  17. Nonlinear wave dynamics in self-consistent water channels

    Science.gov (United States)

    Pelinovsky, Efim; Didenkulova, Ira; Shurgalina, Ekaterina; Aseeva, Nataly

    2017-12-01

    We study long-wave dynamics in a self-consistent water channel of variable cross-section, taking into account the effects of weak nonlinearity and dispersion. The self-consistency of the water channel is considered within the linear shallow water theory, which implies that the channel depth and width are interrelated, so the wave propagates in such a channel without inner reflection from the bottom even if the water depth changes significantly. In the case of small-amplitude weakly dispersive waves, the reflection from the bottom is also small, which allows the use of a unidirectional approximation. A modified equation for Riemann waves is derived for the nondispersive case. The wave-breaking criterion (gradient catastrophe) for self-consistent channels is defined. If both weak nonlinearity and dispersion are accounted for, the variable-coefficient Korteweg–de Vries (KdV) equation for waves in self-consistent channels is derived. Note that this is the first time that a KdV equation has been derived for waves in strongly inhomogeneous media. Soliton transformation in a channel with an abrupt change in depth is also studied.

  18. Usability problem reports for comparative studies : Consistency and inspectability

    NARCIS (Netherlands)

    Vermeeren, A.P.O.S.; Attema, J.; Akar, E.; De Ridder, H.; Van Doorn, A.J.; Erburg, Ç.; Berkman, A.E.; Maguire, M.

    2008-01-01

    This study explores issues of consistency and inspectability in usability test data analysis processes and reports. Problem reports resulting from usability tests performed by three professional usability labs in three different countries are compared. Each of the labs conducted a usability test on

  19. Delimiting coefficient alpha from internal consistency and unidimensionality

    NARCIS (Netherlands)

    Sijtsma, K.

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and

  20. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
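
    The paper provides sample code from R packages; a rough Python analogue (an assumption of this note, not the authors' implementation) reads individual probabilities off consistent nonparametric classifiers via predict_proba:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

def probability_machines(X_train, y_train, X_new):
    """Estimate individual probabilities P(y=1 | x) for a binary response."""
    forest = RandomForestClassifier(n_estimators=500, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=25)
    forest.fit(X_train, y_train)
    knn.fit(X_train, y_train)
    # predict_proba returns per-class frequencies; column 1 is P(y=1 | x).
    return forest.predict_proba(X_new)[:, 1], knn.predict_proba(X_new)[:, 1]
```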

  1. Planck 2013 results. XXXI. Consistency of the Planck data

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Arnaud, M.; Ashdown, M.

    2014-01-01

    The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds lo...

  2. The utility of theory of planned behavior in predicting consistent ...

    African Journals Online (AJOL)

    admin

    Objective: To examine the utility of the theory of planned behavior in predicting the consistent condom use intention of HIV patients who ... of planned behavior model, attitude (r=0.31, p<0.001), subjective norm (r=0.39, p<0.001), perceived behavioral control (r= ...

  3. Consistent histories: Description of a world with increasing entropy

    Directory of Open Access Journals (Sweden)

    C. H. Woo

    2000-07-01

    A distinction is made between two kinds of consistent histories: (1) robust histories, consistent by virtue of decoherence, and (2) verifiable histories, consistent through the existence of accessible records. It is events in verifiable histories which describe amplified quantum fluctuations. If the consistent-histories formalism is to improve on the Copenhagen interpretation by providing a self-contained quantum representation of the quasi-classical world, the appropriate quantum state must track closely all macroscopic phenomena, and the von Neumann entropy of that quantum state ought to change in the same direction as the statistical entropy of the macro-world. Since the von Neumann entropy tends to decrease under successive branchings, the evolution of an entropy-increasing quasi-classical world is not described by a process of branchings only: mergings of previously separate histories must also occur. As a consequence, the number of possible quasi-classical worlds does not have to grow indefinitely as in the many-world picture.

  4. Consistent estimation of linear panel data models with measurement error

    NARCIS (Netherlands)

    Meijer, Erik; Spierdijk, Laura; Wansbeek, Thomas

    2017-01-01

    Measurement error causes a bias towards zero when estimating a panel data linear regression model. The panel data context offers various opportunities to derive instrumental variables allowing for consistent estimation. We consider three sources of moment conditions: (i) restrictions on the

  5. Consistency, integration, and reuse in multi-disciplinary design processes

    NARCIS (Netherlands)

    Woestenenk, Krijn

    2014-01-01

    Modern product development becomes an increasingly difficult and complex activity. Issues in such product development can best be managed during the design process. An analysis of issues will show that guarding consistency, facilitating integration, and reusing design information, are good ways to

  6. Challenges of Predictability and Consistency in the First ...

    African Journals Online (AJOL)

    Abstract: Predictability and consistency are requirements that should run like a golden thread through the macro-, medio- as well as the microstructure of dictionary articles. Adher- ence to these requirements is one of the marks of a user-friendly reference work that will allow for easy access and trouble-free retrieval of ...

  7. Gender Differentials in Consistent Condom Use among Young ...

    African Journals Online (AJOL)

    Cross tabulation and chi-square results revealed a relationship between age, gender, place of location, socio-economic background, employment status, level of education media exposure, alcohol consumption and perceived risk of contracting HIV with the consistent use of condoms. For females, regression analyses ...

  8. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  9. Assessing the consistency between AVHRR and MODIS NDVI ...

    Indian Academy of Sciences (India)

    Keywords. Net ecosystem productivity; net primary productivity; carbon cycle; NDVI; CASA; India. Abstract. This study examines the consistency between the AVHRR and MODIS normalized difference vegetation index (NDVI) datasets in estimating net primary productivity (NPP) and net ecosystem productivity (NEP) over ...

  10. Consistency and Development of Prosocial Dispositions: A Longitudinal Study.

    Science.gov (United States)

    Eisenberg, Nancy; Guthrie, Ivanna K.; Murphy, Bridget C.; Shepard, Stephanie A.; Cumberland, Amanda; Carol, Gustavo

    1999-01-01

    Examined consistency in prosocial dispositions, using longitudinal data from preschool through early adulthood. Found that spontaneous prosocial behavior observed in preschools predicted actual prosocial behavior, other- and self-reported prosocial behavior, self-reported sympathy, and perspective taking in childhood to early adulthood. Prosocial…

  11. The Impact of Orthographic Consistency on German Spoken Word Identification

    Science.gov (United States)

    Beyermann, Sandra; Penke, Martina

    2014-01-01

    An auditory lexical decision experiment was conducted to find out whether sound-to-spelling consistency has an impact on German spoken word processing, and whether such an impact is different at different stages of reading development. Four groups of readers (school children in the second, third and fifth grades, and university students)…

  12. Weakly time consistent concave valuations and their dual representations

    NARCIS (Netherlands)

    Roorda, B.; Schumacher, Hans

    We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal

  13. Weakly time consistent concave valuations and their dual representations

    NARCIS (Netherlands)

    Roorda, Berend; Schumacher, Johannes M.

    2016-01-01

    We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal

  14. Personalities in great tits, Parus major : stability and consistency

    NARCIS (Netherlands)

    Carere, C; Drent, Piet J.; Privitera, Lucia; Koolhaas, Jaap M.; Groothuis, TGG

    2005-01-01

    We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations

  15. Estimating Classification Consistency and Accuracy for Cognitive Diagnostic Assessment

    Science.gov (United States)

    Cui, Ying; Gierl, Mark J.; Chang, Hua-Hua

    2012-01-01

    This article introduces procedures for the computation and asymptotic statistical inference for classification consistency and accuracy indices specifically designed for cognitive diagnostic assessments. The new classification indices can be used as important indicators of the reliability and validity of classification results produced by…

  16. Gregory Research Beliefs Scale: Factor Structure and Internal Consistency

    Science.gov (United States)

    Gregory, Virgil L., Jr.

    2010-01-01

    Objective: This study evaluates the factor structure and internal consistency of the Gregory Research Beliefs Scale (GRBS). Method: Data were collected from subject matter experts, a pilot study, an online sample, and a classroom sample. Psychometric analyses were conducted after combining the online and classroom samples. Results: An a priori…

  17. Consistency of the Takens estimator for the correlation dimension

    NARCIS (Netherlands)

    Borovkova, S.; Burton, Robert; Dehling, H.

    Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We

  18. Socio-Cultural Context of Consistent Use of Condoms among ...

    African Journals Online (AJOL)

    This paper examines the sexual behaviour and socio-cultural context of consistent use of condoms (male and female condoms) among female undergraduate students of the University of Lagos, Nigeria. Cross-sectional survey and key informant interview research methods were adopted to elicit information from the ...

  19. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    We propose and study a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index, allowing options on forward variance swaps and options on the underlying index to be priced consistently. Our model reproduces various empirically ...

  20. Final Report Fermionic Symmetries and Self consistent Shell Model

    Energy Technology Data Exchange (ETDEWEB)

    Larry Zamick

    2008-11-07

    In this final report in the field of theoretical nuclear physics we note important accomplishments. We were confronted with "anomalous" magnetic moments by the experimentalists and were able to explain them. We found unexpected partial dynamical symmetries, completely unknown before, and were able to a large extent to explain them. The importance of a self-consistent shell model was emphasized.

  1. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  2. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…

  3. Language consistency: a missing link in theory, research and reasoning?

    Science.gov (United States)

    Webber, Pamela B

    2010-01-01

    This paper is a report of a study conducted to determine if there is a lack of language consistency in nursing texts and among nursing educators related to theory, research and reasoning. Analysis of international language, communication and nursing literature revealed the essential role language consistency plays in promoting competency and desirable behavioural outcomes. It also revealed the difficulty nursing has had in gaining consistency of language associated with theory, research and reasoning. Six nursing theory, research and terminology texts with international circulation were analysed to identify key words related to theory, research and reasoning and to determine consistency of definitions among texts. Next, 97 nursing educators were surveyed to determine their ability to define the same key words. Educators were recruited during three national nursing education meetings in the United States of America between 2006 and 2007. Analysis revealed significant variability in definitions of key words among texts and in nursing educators' ability to match any defining words used by the texts and by each other. Epistemologically, the use of inconsistent language among texts and educators may be playing a role in the continuing marginalization of theory, research and reasoning in nursing education and practice; ultimately, hindering the profession's epistemological growth. Recommendations include a call for consensus of key words and meanings associated with theory, research and reasoning and the development of a language essentials document to support development of competency.

  4. Program Standards and Expectations: Providing Clarity, Consistency, and Focus

    Science.gov (United States)

    Diem, Keith G.

    2016-01-01

    The effort described in this article resulted from requests for clarity and consistency from new and existing Extension/4-H educators as well as from recommendations by university auditors. The primary purpose of the effort was to clarify standards for effective county-based 4-H youth development programs and to help focus the roles of 4-H…

  5. Consistent comparisons of attainment and shortfall inequality: a critical examination

    NARCIS (Netherlands)

    Bosmans, K.G.M.

    2016-01-01

    An inequality measure is ‘consistent’ if it ranks distributions the same irrespective of whether health quantities are represented in terms of attainments or shortfalls. This consistency property severely restricts the set of admissible inequality measures. We show that, within a more general

  6. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...

  7. On the consistency of a separator | Harper | Quaestiones ...

    African Journals Online (AJOL)

    A smoother can be called a separator if it is idempotent and co-idempotent, with the motivation coming from a physical analogy. This is a rudimentary consistency, and a type of "linearity", which has been found to occur in practice. All the LULU-separators turn out to act linearly on any non-negative combination of the two ...

  8. Body saccades of Drosophila consist of stereotyped banked turns

    NARCIS (Netherlands)

    Muijres, F.T.; Elzinga, M.J.; Iwasaki, N.A.; Dickinson, M.H.

    2015-01-01

    The flight pattern of many fly species consists of straight flight segments interspersed with rapid turns called body saccades, a strategy that is thought to minimize motion blur. We analyzed the body saccades of fruit flies (Drosophila hydei), using high-speed 3D videography to track body and wing

  9. Consistency in behavior of the CEO regarding corporate social responsibility

    NARCIS (Netherlands)

    Elving, W.J.L.; Kartal, D.

    2012-01-01

    Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was

  10. What does consistent participation in 401(k) plans generate?

    Science.gov (United States)

    VanDerhei, Jack; Holden, Sarah; Alonso, Luis

    2009-07-01

    EBRI/ICI 401(K) DATABASE: The annual EBRI/ICI 401(k) database update report is based on large cross-sections of 401(k) plan participants. Whereas the cross-sections cover participants with a wide range of participation experience in 401(k) plans, meaningful analysis of the potential for 401(k) participants to accumulate retirement assets over time must examine how a consistent group of participants' accounts have performed over the long term. Looking at consistent participants in the EBRI/ICI 401(k) database over the eight-year period from 1999 to 2007: The average 401(k) account balance increased at an annual growth rate of 9.5 percent over the period, to $137,430 at year-end 2007. The median 401(k) account balance (half above, half below) increased at an annual growth rate of 15.2 percent over the period, to $76,946 at year-end 2007. ANALYSIS OF A CONSISTENT GROUP OF 401(K) PARTICIPANTS HIGHLIGHTS THE ACCUMULATION POTENTIAL OF 401(K) PLANS. At year-end 2007, the average account balance among consistent participants was double the average account balance among all participants in the EBRI/ICI 401(k) database. The consistent group's median balance was more than four times larger than the median balance across all participants at year-end 2007. YOUNGER PARTICIPANTS OR THOSE WITH SMALLER INITIAL BALANCES EXPERIENCED HIGHER GROWTH IN ACCOUNT BALANCES COMPARED WITH OLDER PARTICIPANTS OR THOSE WITH LARGER INITIAL BALANCES. Among the consistent group, individual participant experience is influenced by three primary factors that impact account balances: contributions, investment returns, and withdrawal and loan activity. For example, the average account balance of participants in their 20s was heavily influenced by the relative size of contributions to the account balances and increased at an average growth rate of 36.0 percent per year between year-end 1999 and year-end 2007. 401(K) PARTICIPANTS TEND TO CONCENTRATE THEIR ACCOUNTS IN EQUITY SECURITIES. The asset

  11. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    distributed consistency property. We will also illustrate how to use the countermeasures against the consistency anomalies in ERP systems integrated with heterogeneous E-commerce systems and the databases of mobile salesman ERP modules. The methods described in this paper may be used in so called CAP...... has to be optimized. Therefore, we will in this paper use so called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily...

  12. Stable functional networks exhibit consistent timing in the human brain.

    Science.gov (United States)

    Chapeton, Julio I; Inati, Sara K; Zaghloul, Kareem A

    2017-03-01

    Despite many advances in the study of large-scale human functional networks, the question of timing, stability, and direction of communication between cortical regions has not been fully addressed. At the cellular level, neuronal communication occurs through axons and dendrites, and the time required for such communication is well defined and preserved. At larger spatial scales, however, the relationship between timing, direction, and communication between brain regions is less clear. Here, we use a measure of effective connectivity to identify connections between brain regions that exhibit communication with consistent timing. We hypothesized that if two brain regions are communicating, then knowledge of the activity in one region should allow an external observer to better predict activity in the other region, and that such communication involves a consistent time delay. We examine this question using intracranial electroencephalography captured from nine human participants with medically refractory epilepsy. We use a coupling measure based on time-lagged mutual information to identify effective connections between brain regions that exhibit a statistically significant increase in average mutual information at a consistent time delay. These identified connections result in sparse, directed functional networks that are stable over minutes, hours, and days. Notably, the time delays associated with these connections are also highly preserved over multiple time scales. We characterize the anatomic locations of these connections, and find that the propagation of activity exhibits a preferred posterior to anterior temporal lobe direction, consistent across participants. Moreover, networks constructed from connections that reliably exhibit consistent timing between anatomic regions demonstrate features of a small-world architecture, with many reliable connections between anatomically neighbouring regions and few long range connections. Together, our results demonstrate
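
    A minimal sketch of the coupling idea described above: mutual information between one region's signal and increasingly delayed copies of another's, with the peak over lags taken as the putative propagation delay. The histogram binning and lag range below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based mutual information (in nats) between two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    nz = pxy > 0                                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def lagged_mi(x, y, max_lag=50):
    """MI between x(t) and y(t + lag) for lags 1..max_lag; the lag with the
    largest MI is the candidate delay for communication from x to y."""
    lags = np.arange(1, max_lag + 1)
    mi = np.array([mutual_information(x[:-lag], y[lag:]) for lag in lags])
    return lags, mi
```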

  13. Consistência do padrão de agrupamento de cultivares de milho / Clustering pattern consistency of corn cultivars

    Directory of Open Access Journals (Sweden)

    Alberto Cargnelutti Filho

    2011-09-01

    The objective of this research was to evaluate the consistency of the clustering pattern obtained from the combination of two dissimilarity measures and four clustering methods, in scenarios formed by combinations of numbers of cultivars and numbers of variables, using real data from corn (Zea mays L.) cultivars and simulated data. Real data were used from five variables measured in 69 corn cultivar competition trials, in which the number of evaluated cultivars ranged between 9 and 40. In order to investigate the results with larger numbers of cultivars and variables, 1,000 experiments were simulated under a standard normal distribution for each of the 54 scenarios formed by combining the number of cultivars (20, 30, 40, 50, 60, 70, 80, 90, and 100) and the number of variables (5, 6, 7, 8, 9, and 10). Correlation, multicollinearity diagnostic, and clustering analyses were performed. The consistency of the clustering pattern was evaluated by means of the cophenetic correlation coefficient. The consistency of the clustering pattern decreases as the number of cultivars and variables increases. The Euclidean distance provides greater consistency of the clustering pattern than the Manhattan distance. The consistency of the clustering pattern among the methods increases in the following order: Ward, complete linkage, single linkage, and between-group average linkage.

  14. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection...... needing urgent corrective action respectively, with two being commonly identified. Of the failure modes needing necessary corrective actions, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA...... is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating...

  15. Violation of consistency relations and the protoinflationary transition

    CERN Document Server

    Giovannini, Massimo

    2014-01-01

    If we posit the validity of the consistency relations, the tensor spectral index and the relative amplitude of the scalar and tensor power spectra are both fixed by a single slow roll parameter. The physics of the protoinflationary transition can break explicitly the consistency relations causing a reduction of the inflationary curvature scale in comparison with the conventional lore. After a critical scrutiny, we argue that the inflationary curvature scale, the total number of inflationary efolds and, ultimately, the excursion of the inflaton across its Planckian boundary are all characterized by a computable theoretical error. While these considerations ease some of the tensions between the Bicep2 data and the other satellite observations, they also demand an improved understanding of the protoinflationary transition whose physical features may be assessed, in the future, through a complete analysis of the spectral properties of the B mode autocorrelations.

  16. Self-consistent liquid-to-gas mass transfer calculations.

    Science.gov (United States)

    Smith, Simon A; Stöckle, Claudio O

    2010-12-01

    This work develops an alternative gas transfer calculation method to the two methods currently used in anaerobic digestion modelling. The current calculation methods are problematic because one is computationally stiff, while the other introduces an artificial overpressure. The new approach began by noting that the gas partial pressures are the same as the partial flows at the liquid/gas interface, and then used the self-consistency requirement to develop gas pressure equations which were used by a search algorithm. The new approach took about three iterations to achieve a flow precision better than 2×10⁻⁷ mol h⁻¹ l⁻¹, and was self-consistent and stable even when working with eight gases. 2010 Elsevier Ltd. All rights reserved.

  17. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality and appendices of government and non-governmental human factors standards.

  18. Self-consistent Castaing distribution of solar wind turbulent fluctuations

    CERN Document Server

    Sorriso-Valvo, L; Lijoi, L; Perri, S; Carbone, V

    2015-01-01

    The intermittent behavior of solar wind turbulent fluctuations has often been investigated through the modeling of their probability distribution functions (PDFs). Among others, the Castaing model (Castaing et al. 1990) has successfully been used in the past. In this paper, the energy dissipation field of solar wind turbulence has been studied for fast, slow and polar wind samples recorded by the Helios 2 and Ulysses spacecraft. The statistical description of the dissipation rate has then been used to remove intermittency through conditioning of the PDFs. Based on such observations, a self-consistent, parameter-free Castaing model is presented. The self-consistent model is tested against experimental PDFs, showing good agreement and supporting the picture of a multifractal energy cascade at the origin of solar wind intermittency.

  19. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  20. Full self-consistency versus quasiparticle self-consistency in diagrammatic approaches: exactly solvable two-site Hubbard model.

    Science.gov (United States)

    Kutepov, A L

    2015-08-12

    Self-consistent solutions of Hedin's equations (HE) for the two-site Hubbard model (HM) have been studied. They have been found for three-point vertices of increasing complexity (Γ = 1 (GW approximation), Γ1 from first-order perturbation theory, and the exact vertex Γ(E)). Comparison is made between the cases when an additional quasiparticle (QP) approximation for Green's functions is applied during the self-consistent iterative solving of HE and when the QP approximation is not applied. The results obtained with the exact vertex are directly related to the present open question: which approximation is more advantageous for future implementations, GW + DMFT or QPGW + DMFT. It is shown that in a regime of strong correlations only the originally proposed GW + DMFT scheme is able to provide reliable results. Vertex corrections based on perturbation theory (PT) systematically improve the GW results when full self-consistency is applied. The application of QP self-consistency combined with PT vertex corrections shows problems similar to those of the case when the exact vertex is applied combined with QP self-consistency. An analysis of Ward identity violation is performed for all approximations studied in this work, and its relation to the general accuracy of the schemes used is provided.

  1. Associations between tongue movement pattern consistency and formant movement pattern consistency in response to speech behavioral modifications.

    Science.gov (United States)

    Mefferd, Antje S

    2016-11-01

    The degree of speech movement pattern consistency can provide information about speech motor control. Although tongue motor control is particularly important because of the tongue's primary contribution to the speech acoustic signal, capturing tongue movements during speech remains difficult and costly. This study sought to determine if formant movements could be used to estimate tongue movement pattern consistency indirectly. Two age groups (seven young adults and seven older adults) and six speech conditions (typical, slow, loud, clear, fast, bite block speech) were selected to elicit an age- and task-dependent performance range in tongue movement pattern consistency. Kinematic and acoustic spatiotemporal indexes (STI) were calculated based on sentence-length tongue movement and formant movement signals, respectively. Kinematic and acoustic STI values showed strong associations across talkers and moderate to strong associations for each talker across speech tasks; although, in cases where task-related tongue motor performance changes were relatively small, the acoustic STI values were poorly associated with kinematic STI values. These findings suggest that, depending on the sensitivity needs, formant movement pattern consistency could be used in lieu of direct kinematic analysis to indirectly examine speech motor control.
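
    One conventional way to compute a spatiotemporal index (STI) is sketched below: each repetition of the movement (or formant) signal is linearly time-normalized, amplitude-normalized (z-scored), and the across-repetition standard deviations are summed over the normalized time axis. These normalization choices follow the common STI recipe and are assumptions rather than the exact procedure of this study.

```python
import numpy as np

def spatiotemporal_index(trials, n_points=50):
    """STI for a list of 1-D movement trajectories (one per repetition).

    Each trial is linearly time-normalized to n_points samples and z-scored,
    then the standard deviations across trials are summed over time; lower
    values indicate more consistent movement patterns.
    """
    normalized = []
    for trial in trials:
        trial = np.asarray(trial, dtype=float)
        t_old = np.linspace(0.0, 1.0, len(trial))
        t_new = np.linspace(0.0, 1.0, n_points)
        resampled = np.interp(t_new, t_old, trial)              # time normalization
        z = (resampled - resampled.mean()) / resampled.std()    # amplitude normalization
        normalized.append(z)
    return float(np.std(np.vstack(normalized), axis=0).sum())
```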

  2. Thermodynamic consistency and fast dynamics in phase field crystal modeling

    OpenAIRE

    Cheng, Mowei; Cottenier, Stefaan; Emmerich, Heike

    2008-01-01

    A general formulation is presented to derive the equation of motion and to demonstrate thermodynamic consistency for several classes of phase field models at once. It applies to models with a conserved phase field, describing either uniform or periodic stable states, and containing slow as well as fast thermodynamic variables. The approach is based on an entropy functional formalism previously developed in the context of phase field models for uniform states [P. Galenko and D. Jou, Phys. Rev....

  3. Enforcing consistency during the adaptation of a parallel component

    OpenAIRE

    Buisson, Jérémy; André, Françoise; Pazat, Jean-Louis

    2005-01-01

    As Grid architectures provide execution environments that are distributed, parallel and dynamic, applications need to be not only parallel and distributed, but also able to adapt themselves to their execution environment. This article presents a model for designing self-adaptable parallel components that can be assembled to build applications for the Grid. This model includes the definition of a consistency criterion for the dynamic adaptation of SPMD components. We pro...

  4. Half-maximal consistent truncations using exceptional field theory

    Science.gov (United States)

    Malek, E.

    We show how to construct half-maximal consistent truncations of 10- and 11-dimensional supergravity to seven dimensions using exceptional field theory. This procedure gives rise to a seven-dimensional half-maximal gauged supergravity coupled to n vector multiplets, with n ≠ 3 in general. We also show how these techniques can be used to reduce exceptional field theory to heterotic double field theory.

  5. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed......, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing....

  6. Globally Consistent Multi-People Tracking using Motion Patterns

    OpenAIRE

    Maksai, Andrii; Wang, Xinchao; Fleuret, Francois; Fua, Pascal

    2016-01-01

    Many state-of-the-art approaches to people tracking rely on detecting them in each frame independently, grouping detections into short but reliable trajectory segments, and then further grouping them into full trajectories. This grouping typically relies on imposing local smoothness constraints but almost never on enforcing more global constraints on the trajectories. In this paper, we propose an approach to imposing global consistency by first inferring behavioral patterns from the ground tr...

  7. Modeling a Consistent Behavior of PLC-Sensors

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2014-01-01

    Full Text Available The article extends a cycle of papers dedicated to the programming and verification of PLC programs by LTL specification, an approach that makes correctness analysis of PLC programs possible by the model checking method. The model checking method requires the construction of a finite model of a PLC program. For successful verification of the required properties it is important to take into account that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact requires more attention to the construction of the PLC-program model. In this paper we propose to describe the consistent behavior of sensors by three groups of LTL formulas. They affect the program model, bringing it closer to the actual behavior of the PLC program. The idea of the LTL requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC-program model, the approach of modeling the consistent behavior of PLC sensors allows one to focus on modeling precisely these reactions without extending the program model by additional structures that realize realistic sensor behavior. The consistent behavior of sensors is taken into account only at the stage of checking the conformity of the program model to the required properties, i.e. a property satisfaction proof for the constructed model is carried out under the condition that the model contains only those executions of the program that comply with the consistent behavior of sensors.
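
    A hypothetical example in the spirit of the three groups of formulas described above (the sensor and actuator names here are illustrative, not taken from the paper): for a carriage with limit switches at_bottom and at_top driven by move_up/move_down commands, one might assume, in LTL,

        G ¬(at_bottom ∧ at_top)                 -- the two limit switches are never active simultaneously
        G (at_top ∧ ¬move_down → X at_top)      -- a sensor does not reset without a physical cause
        G (at_bottom ∧ move_up → X ¬at_top)     -- the opposite sensor cannot fire one cycle after motion starts

    Adding such formulas as assumptions restricts the model checker to executions with physically plausible sensor readings, which is the role the paper assigns to them.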

  8. Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency

    OpenAIRE

    Sammie eTarenskeen; Mirjam eBroersma; Bart eGeurts

    2015-01-01

    The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a bet...

  9. The Relationship Between Sexual Concordance and Orgasm Consistency in Women.

    Science.gov (United States)

    Suschinsky, Kelly D; Chivers, Meredith L

    2018-02-08

    Sexual concordance (the relationship between genital and self-reported sexual responses) may be associated with orgasm consistency (OC; the proportion of sexual acts leading to orgasm) during penile-vaginal intercourse (PVI) in women. We investigated the relationship between women's sexual concordance (assessed using different stimulus modalities and self-reported sexual arousal methods) and OC during various sexual activities (assessed using different types of questions). For Study 1 (n = 51), when sexual concordance was assessed using audiovisual sexual stimuli, we did not find a statistically significant relationship between OC and poststimulus self-reports of sexual arousal or genital sensations, raw values of OC, or ranges of OC. For Study 2 (n = 44), where sexual concordance was assessed using audionarrative sexual stimuli, we did find a statistically significant relationship between PVI OC and sexual concordance using change in self-reported sexual arousal, and ranges of orgasm consistency. Two findings were inconsistent with previous research. First, OC varied significantly by activity type in both studies; masturbation yielded the highest OC. Second, PVI OC was significantly related to oral sex and masturbation OC (Study 2). We discuss the need for further research and various factors that may affect women's orgasm consistency and sexual concordance.

  10. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [ORNL; Mosher, Scott W [ORNL; Evans, Thomas M [ORNL

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
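
    Schematically, the "consistent" part of CADIS can be summarized as follows (standard notation, given for orientation; the space/energy/angle extension of this work replaces the adjoint scalar flux with the adjoint angular flux):

    \[
    R = \int q(\vec r, E)\,\phi^{\dagger}(\vec r, E)\,d\vec r\,dE, \qquad
    \hat q(\vec r, E) = \frac{q(\vec r, E)\,\phi^{\dagger}(\vec r, E)}{R}, \qquad
    \bar w(\vec r, E) = \frac{R}{\phi^{\dagger}(\vec r, E)},
    \]

    so that a particle sampled from the biased source q̂ is born with statistical weight q/q̂ = R/φ†, which coincides with the weight-window centers w̄; source biasing and weight windows therefore never work against each other.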

  11. Consistency between Modalities Enhances Visually Induced Self-Motion (Vection

    Directory of Open Access Journals (Sweden)

    Takeharu Seno

    2011-10-01

    Full Text Available Visually induced illusory self-motion (vection) is generally facilitated by consistent information about self-motion from other modalities. We provide three examples in which information from other modalities that is consistent with vision enhances vection: locomotion, air flow, and sounds. We used an optic flow of expansion or contraction created by positioning 16,000 dots at random inside a simulated cube (length 20 m) and moving the observer's viewpoint to simulate forward or backward self-motion of 16 m/s. First, we measured the strength of forward or backward vection with or without forward locomotion on a treadmill (2 km/h). The results revealed that forward vection was facilitated by the consistent locomotion whereas vection in the other directions was inhibited by the inconsistent locomotion. Second, we found that forward vection intensity increased when air flow to the subjects' faces, produced by an electric fan (wind speed 6.37 m/s), was provided. By contrast, the air flow did not enhance backward vection. Finally, we demonstrated that sounds which increased in loudness facilitated forward vection and sounds which ascended (descended) in pitch facilitated upward (downward) vection.

  12. Consistency relation in power law G-inflation

    Energy Technology Data Exchange (ETDEWEB)

    Unnikrishnan, Sanil; Shankaranarayanan, S., E-mail: sanil@iisertvm.ac.in, E-mail: shanki@iisertvm.ac.in [School of Physics, Indian Institute of Science Education and Research, Thiruvananthapuram 695016 (India)

    2014-07-01

    In the standard inflationary scenario based on a minimally coupled scalar field, canonical or non-canonical, the subluminal propagation speed of scalar perturbations ensures the following consistency relation: r ≤ −8n_T, where r is the tensor-to-scalar ratio and n_T is the spectral index of tensor perturbations. However, recently, it has been demonstrated that this consistency relation can be violated in Galilean inflation models even in the absence of superluminal propagation of scalar perturbations. It is therefore interesting to investigate whether the subluminal propagation of scalar field perturbations imposes any bound on the ratio r/|n_T| in G-inflation models. In this paper, we derive the consistency relation for a class of G-inflation models that lead to power law inflation. Within this class of models, it turns out that one can have r > −8n_T or r ≤ −8n_T depending on the model parameters. However, the subluminal propagation speed of scalar field perturbations, as required by causality, restricts r ≤ −(32/3) n_T.
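
    For orientation, the standard bound quoted above arises, to leading order in slow roll and for a single scalar field with sound speed c_s (a textbook argument, not the G-inflation calculation of this record), from

    \[
    r = 16\,\epsilon\,c_s, \qquad n_T = -2\,\epsilon
    \quad\Longrightarrow\quad r = -8\,c_s\,n_T \le -8\,n_T \quad \text{for } c_s \le 1 .
    \]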

  13. Fission gas bubble percolation on crystallographically consistent grain boundary networks

    Energy Technology Data Exchange (ETDEWEB)

    Sabogal-Suárez, Daniel; David Alzate-Cardona, Juan, E-mail: jdalzatec@unal.edu.co; Restrepo-Parra, Elisabeth

    2016-07-15

    Fission gas release in nuclear fuels can be modeled in the framework of percolation theory, where each grain boundary is classified as open or closed to the release of the fission gas. In the present work, two-dimensional grain boundary networks were assembled both at random and in a crystallographically consistent manner resembling a general textured microstructure. In the crystallographically consistent networks, grain boundaries were classified according to their misorientation. The percolation behavior of the grain boundary networks was evaluated as a function of radial cracks and radial thermal gradients in the fuel pellet. Percolation thresholds tend to shift to the left with increasing length and number of cracks, especially in the presence of thermal gradients. In general, the topology and percolation behavior of the crystallographically consistent networks differ from those of the random network. - Highlights: • Fission gas release in nuclear fuels was studied in the framework of percolation theory. • The nuclear fuel cross-section microstructure was modeled through grain boundary networks. • The grain boundaries were classified randomly or according to their crystallography. • Differences in topology and percolation behavior for both kinds of networks were determined.

  14. Generalized Self-Consistency: Multinomial logit model and Poisson likelihood.

    Science.gov (United States)

    Tsodikov, Alex; Chefo, Solomon

    2008-01-01

    A generalized self-consistency approach to maximum likelihood estimation (MLE) and model building was developed in (Tsodikov, 2003) and applied to a survival analysis problem. We extend the framework to obtain second-order results such as the information matrix and properties of the variance. The multinomial model motivates the paper and is used throughout as an example. Computational challenges with the multinomial likelihood motivated Baker (1994) to develop the Multinomial-Poisson (MP) transformation for a large variety of regression models with a multinomial likelihood kernel. Multinomial regression is transformed into a Poisson regression at the cost of augmenting model parameters and restricting the problem to discrete covariates. Imposing normalization restrictions by means of Lagrange multipliers (Lang, 1996) justifies the approach. Using the self-consistency framework we develop an alternative solution to multinomial model fitting that does not require augmenting parameters while allowing for a Poisson likelihood and arbitrary covariate structures. Normalization restrictions are imposed by averaging over artificial "missing data" (a fake mixture). The lack of a probabilistic interpretation at the "complete-data" level makes the use of the generalized self-consistency machinery essential.
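
    The Multinomial-Poisson transformation mentioned above can be stated compactly (a standard identity, shown here schematically rather than in the paper's notation): for cell counts y_1, …, y_K with total n and cell probabilities p_j = λ_j / Σ_k λ_k,

    \[
    \log \mathrm{Mult}(y \mid n, p)
    = \sum_{j=1}^{K} y_j \log \lambda_j - n \log \sum_{k=1}^{K} \lambda_k + \text{const}
    = \max_{\phi > 0} \sum_{j=1}^{K} \bigl( y_j \log(\phi\lambda_j) - \phi\lambda_j \bigr) + \text{const},
    \]

    so maximizing a product of independent Poisson likelihoods with means μ_j = φλ_j (profiling out φ) reproduces the multinomial maximum likelihood estimate of p.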

  15. Stool consistency is significantly associated with pain perception.

    Directory of Open Access Journals (Sweden)

    Yukiko Shiro

    Full Text Available Commensal as well as pathogenic bacteria can influence a variety of gut functions, thereby leading to constipation and diarrhea in severe cases. In fact, several researchers have reported evidence supporting an association between stool consistency or constipation and the gut microbiome (GM) composition and dysbiosis. The GM influences human health and disease via the gut-brain axis. We thus hypothesized that pathogenic bacteria increase pain perception to some extent, which means that there could be an association between stool consistency or constipation and the pain perception of healthy subjects. This was an observational study whose aim was to investigate the association between stool consistency or constipation and the pain perception of healthy subjects. Thirty-eight healthy subjects participated in this study. The participants were assessed on their usual stool form (the Bristol Stool Form Scale: BSFS), constipation (the Cleveland Clinic Constipation score: CCS), degree of obesity, pain perception in response to a mechanical stimulus, cold pain threshold, and a questionnaire on psychological state. The BSFS was significantly and positively associated with pain perception, and showed a significant association with anxiety states. Furthermore, pain perception was significantly associated with anxiety states. However, there were no significant associations between the CCS and any independent variables. In addition, we found that a significant predictor of pain perception was the BSFS. Moreover, there were significant relationships among the psychological states, BSFS and obesity. These results suggest that stool form is associated with pain perception and anxiety status.

  16. Assessing De Novo transcriptome assembly metrics for consistency and utility.

    Science.gov (United States)

    O'Neil, Shawn T; Emrich, Scott J

    2013-07-09

    Transcriptome sequencing and assembly represent a great resource for the study of non-model species, and many metrics have been used to evaluate and compare these assemblies. Unfortunately, it is still unclear which of these metrics accurately reflect assembly quality. We simulated sequencing transcripts of Drosophila melanogaster. By assembling these simulated reads using both a "perfect" and a modern transcriptome assembler while varying read length and sequencing depth, we evaluated quality metrics to determine whether they 1) revealed perfect assemblies to be of higher quality, and 2) revealed perfect assemblies to be more complete as data quantity increased. Several commonly used metrics were not consistent with these expectations, including average contig coverage and length, though they became consistent when singletons were included in the analysis. We found several annotation-based metrics to be consistent and informative, including contig reciprocal best hit count and contig unique annotation count. Finally, we evaluated a number of novel metrics such as reverse annotation count, contig collapse factor, and the ortholog hit ratio, discovering that each assesses assembly quality in unique ways. Although much attention has been given to transcriptome assembly, little research has focused on determining how best to evaluate assemblies, particularly in light of the variety of options available for read length and sequencing depth. Our results provide an important review of these metrics and give researchers tools to produce the highest quality transcriptome assemblies.
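
    As an illustration of one of the annotation-based metrics named above, the sketch below computes an ortholog hit ratio under one commonly used definition (the fraction of each reference ortholog covered by its best-matching contig); the input format and function name are assumptions, not details taken from this record.

        def ortholog_hit_ratio(best_hits):
            # best_hits: iterable of (aligned_reference_length, reference_length)
            # pairs, one per assembled contig with a hit to a reference ortholog.
            # Values near 1.0 suggest near-full-length reconstructions.
            return [aligned / ref_len for aligned, ref_len in best_hits if ref_len > 0]

        # Hypothetical usage:
        # ratios = ortholog_hit_ratio([(850, 900), (300, 1200)])   # -> [0.944..., 0.25]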

  17. Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency

    Directory of Open Access Journals (Sweden)

    Sammie eTarenskeen

    2015-11-01

    Full Text Available The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a between-participants design, however, we find similar rates of pattern and size overspecification, which are both lower than the rate of colour overspecification. This indicates that although many speakers are more likely to include colour than pattern (probably because colour is more salient), they may also treat pattern like colour due to a tendency towards consistency. We find no increase in size overspecification when the salience of size is increased, suggesting that speakers are more likely to include absolute than relative attributes. However, we do find an increase in size overspecification when mentioning the attributes is triggered, which again shows that speakers tend to refer in a consistent manner, and that there are circumstances in which even size overspecification is frequently produced.

  18. Global Consistency Management Methods Based on Escrow Approaches in Mobile ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Takahiro Hara

    2010-01-01

    Full Text Available In a mobile ad hoc network, consistency management of data operations on replicas is a crucial issue for system performance. In our previous work, we classified several primitive consistency levels according to the requirements from applications and provided protocols to realize them. In this paper, we assume special types of applications in which the instances of each data item can be partitioned and propose two consistency management protocols which are combinations of an escrow method and our previously proposed protocols. We also report simulation results to investigate the characteristics of these protocols in a mobile ad hoc network. From the simulation results, we confirm that the protocols proposed in this paper drastically improve data availability and reduce the traffic for data operations while maintaining the global consistency in the entire network.

  19. Dictionary-based fiber orientation estimation with improved spatial consistency.

    Science.gov (United States)

    Ye, Chuyang; Prince, Jerry L

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ1-norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that
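
    Schematically, an objective of the kind described above has the following form (the symbols here are illustrative, not the paper's exact notation): for each voxel v with signal vector y_v, dictionary D, mixture fractions f_v, and fiber orientations F_v,

    \[
    \min_{\{F_v,\; f_v \ge 0\}} \;\sum_{v}\Bigl[\; \|y_v - D f_v\|_2^2 \;+\; \beta \sum_{u \in \mathcal N(v)} d\bigl(F_v, F_u\bigr) \;+\; \|\,w(F_v) \odot f_v\,\|_1 \Bigr],
    \]

    where d(·,·) penalizes dissimilar neighboring FOs and the weights w(F_v) are small for dictionary atoms aligned with the current FO estimate, tying the mixture fractions to the FOs; the three terms correspond to the data fidelity, FO smoothness, and FO-dictionary consistency terms described above, and minimization alternates between the FO and mixture-fraction blocks.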

  20. GENESIS: new self-consistent models of exoplanetary spectra

    Science.gov (United States)

    Gandhi, Siddharth; Madhusudhan, Nikku

    2017-12-01

    We are entering the era of high-precision and high-resolution spectroscopy of exoplanets. Such observations herald the need for robust self-consistent spectral models of exoplanetary atmospheres to investigate intricate atmospheric processes and to make observable predictions. Spectral models of plane-parallel exoplanetary atmospheres exist, mostly adapted from other astrophysical applications, with different levels of sophistication and accuracy. There is a growing need for a new generation of models custom-built for exoplanets and incorporating state-of-the-art numerical methods and opacities. The present work is a step in this direction. Here we introduce GENESIS, a plane-parallel, self-consistent, line-by-line exoplanetary atmospheric modelling code that includes (a) formal solution of radiative transfer using the Feautrier method, (b) radiative-convective equilibrium with temperature correction based on the Rybicki linearization scheme, (c) latest absorption cross-sections, and (d) internal flux and external irradiation, under the assumptions of hydrostatic equilibrium, local thermodynamic equilibrium and thermochemical equilibrium. We demonstrate the code here with cloud-free models of giant exoplanetary atmospheres over a range of equilibrium temperatures, metallicities, C/O ratios and spanning non-irradiated and irradiated planets, with and without thermal inversions. We provide the community with theoretical emergent spectra and pressure-temperature profiles over this range, along with those for several known hot Jupiters. The code can generate self-consistent spectra at high resolution and has the potential to be integrated into general circulation and non-equilibrium chemistry models as it is optimized for efficiency and convergence. GENESIS paves the way for high-fidelity remote sensing of exoplanetary atmospheres at high resolution with current and upcoming observations.

  1. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Directory of Open Access Journals (Sweden)

    Laura R. STEIN, Alison M. BELL

    2012-02-01

    Full Text Available There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45–52, 2012].

  2. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated...... with the Tsu-Esaki formula. We consider the formation of the accumulation layer in the emitter contact layer in a number of different approximation schemes, and we introduce a novel way to account for the energy relaxation of continuum states to the two-dimensional quasi-bound states appearing for contain...

  3. Electrostatic Potentials from Self-Consistent Hirshfeld Atomic Charges.

    Science.gov (United States)

    Van Damme, Sofie; Bultinck, Patrick; Fias, Stijn

    2009-02-10

    It is shown that molecular electrostatic potentials obtained from iterative or self-consistent Hirshfeld atomic point charges agree remarkably well with the ab initio computed electrostatic potentials. The iterative Hirshfeld scheme performs nearly as well as electrostatic potential derived atomic charges, having the advantage of allowing the definition of the atom in the molecule, rather than just yielding charges. The quality of the iterative Hirshfeld charges for computing electrostatic potentials is examined for a large set of molecules and compared to other commonly used techniques for population analysis.
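
    A minimal sketch of the iterative Hirshfeld idea referred to above, assuming densities represented on a shared numerical grid; the helper atomic_density(a, q), which returns the density of atom a carrying fractional charge q (e.g. interpolated between integer-charge free-atom densities), is hypothetical.

        import numpy as np

        def iterative_hirshfeld(rho_mol, z, atomic_density, grid_w, max_iter=100, tol=1e-6):
            # rho_mol: molecular electron density on the grid
            # z: nuclear charges; grid_w: quadrature weights of the grid points
            z = np.asarray(z, dtype=float)
            q = np.zeros(len(z))                      # start from neutral promolecule atoms
            for _ in range(max_iter):
                pro = np.array([atomic_density(a, q[a]) for a in range(len(z))])
                w = pro / np.maximum(pro.sum(axis=0), 1e-30)   # Hirshfeld weight functions
                pops = np.array([np.sum(grid_w * w[a] * rho_mol) for a in range(len(z))])
                q_new = z - pops                      # atomic charges from partitioned populations
                if np.max(np.abs(q_new - q)) < tol:   # self-consistency reached
                    return q_new
                q = q_new
            return q

    At convergence the promolecule is built from atoms carrying exactly the charges they are assigned, which is what makes the scheme self-consistent.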

  4. Consistent Perturbative Fixed Point Calculations in QCD and Supersymmetric QCD

    DEFF Research Database (Denmark)

    Ryttov, Thomas A.

    2016-01-01

    We suggest how to consistently calculate the anomalous dimension $\\gamma_*$ of the $\\bar{\\psi}\\psi$ operator in finite order perturbation theory at an infrared fixed point for asymptotically free theories. If the $n+1$ loop beta function and $n$ loop anomalous dimension are known then $\\gamma_*$ ...... throughout the entire conformal window. We finally compute $\\gamma_*$ through $O(\\Delta_f^3)$ for QCD and a variety of other non-supersymmetric fermionic gauge theories. Small values of $\\gamma_*$ are observed for a large range of flavors....

  5. Consistency of differential and integral thermonuclear neutronics data

    Energy Technology Data Exchange (ETDEWEB)

    Reupke, W.A.

    1978-01-01

    To increase the accuracy of the neutronics analysis of nuclear reactors, physicists and engineers have employed a variety of techniques, including the adjustment of multigroup differential data to improve consistency with integral data. Of the various adjustment strategies, a generalized least-squares procedure which adjusts the combined differential and integral data can significantly improve the accuracy of neutronics calculations compared to calculations employing only differential data. This investigation analyzes 14 MeV neutron-driven integral experiments, using a more extensively developed methodology and a newly developed computer code, to extend the domain of adjustment from the energy range of fission reactors to the energy range of fusion reactors.
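
    A generalized least-squares adjustment of this kind has the familiar form (standard notation, given for orientation rather than as this report's exact formulation): with prior differential data p_0 carrying covariance M, integral measurements d with covariance V, calculated integral responses C(p), and sensitivities S = ∂C/∂p,

    \[
    p = p_0 + M S^{\mathsf T}\bigl(S M S^{\mathsf T} + V\bigr)^{-1}\bigl(d - C(p_0)\bigr),
    \qquad
    M' = M - M S^{\mathsf T}\bigl(S M S^{\mathsf T} + V\bigr)^{-1} S M ,
    \]

    which pulls the differential data toward consistency with the integral experiments by an amount weighted by the two covariances.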

  6. THE COFFEE FRUITS PREVIOUS STORAGE CONSISTS OF THE MAINTENANCE

    OpenAIRE

    Marise Cota Machado; Universidade Federal de Viçosa; Juarez de Sousa e Silva; Universidade Federal de Viçosa; Antonio Teixeira de Matos; Universidade Federal de Viçosa; Paola Alfonsa Vieira Lo Monaco; Universidade Federal de Viçosa

    2012-01-01

    Previous storage consists of keeping freshly harvested coffee fruits immersed in water, making it possible to match the volume of harvested fruits to the available processing capacity without harming the final quality of the beans. The objective of this work was to evaluate some chemical and biochemical characteristics of the wastewater generated in the process, in order to assess the need for its treatment before discharge into water bodies or the possibilities of its use...

  7. Mean-field theory and self-consistent dynamo modeling

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Akira; Yokoi, Nobumitsu [Tokyo Univ. (Japan). Inst. of Industrial Science; Itoh, Sanae-I [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2001-12-01

    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)

  8. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer are being developed in a software program that can process ADCP moving-boat discharge measurements independently of the ADCP used to collect the data.
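
    For illustration, the sketch below shows the cross-product computation commonly used for moving-boat ADCP discharge in the measured portion of the profile (top, bottom, and edge discharges are estimated separately); the array layout and names are assumptions, not the program described in this record.

        import numpy as np

        def ensemble_discharge(u_water, v_water, u_boat, v_boat, bin_size, dt):
            # East/north water velocities per depth bin (u_water, v_water),
            # east/north boat velocity for the ensemble, bin height, and
            # ensemble duration. The vertical component of (water x boat),
            # integrated over depth and time, gives the ensemble discharge.
            cross = u_water * v_boat - v_water * u_boat
            return float(np.nansum(cross) * bin_size * dt)

        # Total measured discharge is then the sum over all ensembles in the transect.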

  9. A Consistent Procedure for Pseudo-Component Delumping

    DEFF Research Database (Denmark)

    Leibovici, Claude; Stenby, Erling Halfdan; Knudsen, Kim

    1996-01-01

    on mixture properties. As a result Henry constant is then available without any computation for any solute, as soon as the parameters describing the solvent have been calculated. A further extension of the same approach to phases in equilibrium leads to the same type of relation for the equilibrium constants....... Thereby infinite dilution K-values can be obtained exactly without any further computation.Based on these results a consistent procedure for the estimation of equilibrium constants in the more classical cases of finite dilution has been developed. It can be used when moderate binary interaction parameters...

  10. Frontal lobe epilepsy manifesting with seizures consisting of isolated vocalization.

    Science.gov (United States)

    Rego, Ricardo; Arnold, Stephan; Noachtar, Soheyl

    2006-12-01

    Vocalizations may occur in focal epileptic seizures, which typically arise from frontal and temporal regions. They are commonly associated with other motor phenomena such as automatisms, tonic posturing, or head version. We report on a patient whose seizures were documented by video-EEG monitoring, but in whom the observable ictal semiology consisted solely of a brief, monotonous vocalization. Ictal EEGs showed left frontal seizure patterns. Isolated vocalizations can constitute an ictal epileptic event and may be the only observable clinical manifestation of a left frontal lobe epilepsy. [Published with video sequences].

  11. Simplified models for dark matter face their consistent completions

    Energy Technology Data Exchange (ETDEWEB)

    Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel

    2017-03-01

    Simplified dark matter models have been recently advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent SU(2)_L × U(1)_Y gauge invariant completions. We discuss the key physics simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at the 13 TeV LHC.

  12. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    We propose a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index. Our model reproduces various empirically observed properties of variance swap dynamics and enables volatility derivatives and options on the underlying index...... to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...

  13. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Cont, Rama; Kokholm, Thomas

    2013-01-01

    We propose a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index. Our model reproduces various empirically observed properties of variance swap dynamics and enables volatility derivatives and options on the underlying index...... to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...

  14. Designing apps for success developing consistent app design practices

    CERN Document Server

    David, Matthew

    2014-01-01

    In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: Apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development, in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently

  15. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Cont, Rama; Kokholm, Thomas

    We propose and study a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index, allowing options on forward variance swaps and options on the underlying index to be priced consistently. Our model reproduces various empirically ...... on S&P 500 across strikes and maturities as well as options on the VIX volatility index. The calibration of the model is done in two steps, first by matching VIX option prices and then by matching prices of options on the underlying....

  16. Testing the consistency between cosmological measurements of distance and age

    Directory of Open Access Journals (Sweden)

    Remya Nair

    2015-05-01

    Full Text Available We present a model-independent method to test the consistency between cosmological measurements of distance and age, assuming the distance duality relation. We use type Ia supernovae, baryon acoustic oscillations, and observational Hubble data to reconstruct the luminosity distance DL(z), the angle-averaged distance DV(z) and the Hubble rate H(z), using the Gaussian process regression technique. We obtain an estimate of the distance duality relation in the redshift range 0.1
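
    For reference, the distance duality relation assumed here and the angle-averaged (volume) distance probed by the BAO data are (standard definitions):

    \[
    D_L(z) = (1+z)^2 D_A(z), \qquad
    D_V(z) = \left[ (1+z)^2 D_A^2(z)\,\frac{c\,z}{H(z)} \right]^{1/3},
    \]

    so reconstructions of D_L, D_V and H can be combined into a single consistency test.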

  17. The Consistency of Quantum Mechanics Implies its Non-Determinism

    OpenAIRE

    Reznikoff, Iegor

    2010-01-01

    In a previous paper (arXiv:1008.3661v1 [quant-ph] 21 Aug 2010), we have given a purely logical proof of the Conway and Kochen Free Will theorem in QM: the freedom of the observer implies the freedom of the observed particle. Here we show that the hypothesis of the observer's freedom is not necessary: the assumption of the (informal) consistency (non-contradiction) of QM implies its non-determinism relative to physical events (the freedom of observed particles).

  18. The Consistency Of High Attorney Of Papua In Corruption Investigation

    Directory of Open Access Journals (Sweden)

    Samsul Tamher

    2015-08-01

    Full Text Available This study aimed to determine the consistency of the High Attorney of Papua in corruption investigations and in efforts to recover state financial losses. The type of study used in this paper is normative-juridical and empirical-juridical. The results showed that the corruption investigations of the High Attorney of Papua are not optimal due to political interference in cases involving local officials, so that the High Attorney's decisions in such cases are not in accordance with the rule of law. The High Attorney of Papua seeks to recover state financial losses through the State Auction Body under civil and criminal law.

  19. Periodic mesoporous organosilicas consisting of 3d hexagonally ordered interconnected globular pores

    NARCIS (Netherlands)

    Vercaemst, C.; Friedrich, H.|info:eu-repo/dai/nl/304837350; de Jongh, P.E.|info:eu-repo/dai/nl/186125372; Neimark, A.V.; Goderis, B.; Verpoort, F.; van der Voort, P.

    2009-01-01

    A new family of periodic mesoporous organosilicas with 100% E-configured ethenylene-bridges and controllable pore systems is presented. 2D hexagonally ordered hybrid nanocomposites consisting of cylindrical pores are obtained, of which some are filled with solid material. The architectural

  20. Enriching Elementary Quantum Mechanics with the Computer: Self-Consistent Field Problems in One Dimension

    Science.gov (United States)

    Bolemon, Jay S.; Etzold, David J.

    1974-01-01

    Discusses the use of a small computer to solve self-consistent field problems of one-dimensional systems of two or more interacting particles in an elementary quantum mechanics course. Indicates that the calculation can serve as a useful introduction to the iterative technique. (CC)

  1. A Need for Logical and Consistent Anatomical Nomenclature for Cutaneous Nerves of the Limbs

    Science.gov (United States)

    Gest, Thomas R.; Burkel, William E.; Cortright, Gerald W.

    2009-01-01

    The system of anatomical nomenclature needs to be logical and consistent. However, variations in translation to English of the Latin and Greek terminology used in Nomina Anatomica and Terminologia Anatomica have led to some inconsistency in the nomenclature of cutaneous nerves in the limbs. An historical review of cutaneous nerve nomenclature…

  2. The pan-European population distribution across consistently defined functional urban areas

    OpenAIRE

    Schmidheiny, Kurt; Suedekum, Jens

    2015-01-01

    We analyze the first data set on consistently defined functional urban areas in Europe and compare the European to the US urban system. City sizes in Europe do not follow a power law: the largest cities are "too small" to follow Zipf's law.

  3. Completeness and Consistency in Hierarchical State-Based Requirements

    National Research Council Canada - National Science Library

    Heimdahl, Mats P. E

    1995-01-01

    .... The analysis algorithms and tools have been validated on TCAS II, a complex, airborne, collision-avoidance system required on all commercial aircraft with more than 30 passengers that fly in U.S. airspace.

  4. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering.

    Science.gov (United States)

    Sicat, Ronell; Krüger, Jens; Möller, Torsten; Hadwiger, Markus

    2014-12-01

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
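
    A toy illustration of the idea above (not the paper's out-of-core Gaussian-mixture machinery, and with a discretized pdf standing in for the convolution it describes): shading averages the transfer function over a voxel's intensity pdf instead of applying it to a single down-sampled value, which is what keeps the resolution levels consistent.

        import numpy as np

        def shade_from_pdf(bin_centers, pdf, transfer_function):
            # bin_centers: intensity values of a discretized pdf; pdf: the
            # corresponding probabilities (need not be normalized);
            # transfer_function: maps an intensity to an (r, g, b, a) array.
            weights = pdf / np.sum(pdf)
            colors = np.array([transfer_function(v) for v in bin_centers])
            return weights @ colors     # expected color/opacity under the pdf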

  5. Effect of irradiation on Brazilian honeys' consistency and their acceptability

    Science.gov (United States)

    Matsuda, A. H.; Sabato, S. F.

    2004-09-01

    Contamination of bee products may occur during packing or even during the process of collection. Gamma irradiation was found to decrease the number of bacteria and fungi. However, little information is available on the effects of gamma irradiation on viscosity, which is an important property of honey. In this work the viscosity of two varieties of Brazilian honey was measured after irradiation at 5 and 10 kGy. The viscosity was measured at four temperatures (25°C, 30°C, 35°C and 40°C) for both samples and compared with the control and between the doses. The sensory evaluation was carried out for the parameters color, odor, taste and consistency, using a 9-point hedonic scale. All the data were treated with a statistical tool (Statistica 5.1, StatSoft, 1998). The viscosity was not impaired significantly by gamma irradiation at doses of 5 and 10 kGy, and an evaluation of the sensorial characteristics (odor, color, taste and consistency) is presented. The taste for the Parana type indicated a significant difference among irradiation doses (p<0.05), but the higher value was for the 5 kGy dose, demonstrating acceptability in this case. The Organic honey presented a taste score for 10 kGy significantly lower than the control mean, but it did not differ significantly from the 5 kGy value.

  6. Planck 2013 results. XXXI. Consistency of the Planck data

    CERN Document Server

    Ade, P A R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A.J; Barreiro, R.B; Battaner, E; Benabed, K; Benoit-Levy, A; Bernard, J.P; Bersanelli, M; Bielewicz, P; Bond, J.R; Borrill, J; Bouchet, F.R; Burigana, C; Cardoso, J.F; Catalano, A; Challinor, A; Chamballu, A; Chiang, H.C; Christensen, P.R; Clements, D.L; Colombi, S; Colombo, L.P.L; Couchot, F; Coulais, A; Crill, B.P; Curto, A; Cuttaia, F; Danese, L; Davies, R.D; Davis, R.J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Desert, F.X; Dickinson, C; Diego, J.M; Dole, H; Donzelli, S; Dore, O; Douspis, M; Dupac, X; Ensslin, T.A; Eriksen, H.K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Gonzalez-Nuevo, J; Gorski, K.M.; Gratton, S.; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F.K; Hanson, D; Harrison, D; Henrot-Versille, S; Herranz, D; Hildebrandt, S.R; Hivon, E; Hobson, M; Holmes, W.A.; Hornstrup, A; Hovest, W.; Huffenberger, K.M; Jaffe, T.R; Jaffe, A.H; Jones, W.C; Keihanen, E; Keskitalo, R; Knoche, J; Kunz, M; Kurki-Suonio, H; Lagache, G; Lahteenmaki, A; Lamarre, J.M; Lasenby, A; Lawrence, C.R; Leonardi, R; Leon-Tavares, J; Lesgourgues, J; Liguori, M; Lilje, P.B; Linden-Vornle, M; Lopez-Caniego, M; Lubin, P.M; Macias-Perez, J.F; Maino, D; Mandolesi, N; Maris, M; Martin, P.G; Martinez-Gonzalez, E; Masi, S; Matarrese, S; Mazzotta, P; Meinhold, P.R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschenes, M.A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Norgaard-Nielsen, H.U; Noviello, F; Novikov, D; Novikov, I; Oxborrow, C.A; Pagano, L; Pajot, F; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Pearson, T.J; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Pratt, G.W; Prunet, S; Puget, J.L; Rachen, J.P; Reinecke, M; Remazeilles, M; Renault, C; Ricciardi, S.; Ristorcelli, I; Rocha, G.; Roudier, G; Rubino-Martin, J.A; Rusholme, B; Sandri, M; Scott, D; Stolyarov, V; Sudiwala, R; Sutton, D; Suur-Uski, A.S; Sygnet, J.F; Tauber, J.A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L.A; Wandelt, B.D; Wehus, I K; White, S D M; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse Galactic emission and for strong unresolved sources. Difference maps covering angular scales from 8°...

  7. Consistencies and Inconsistencies Between Science Teachers' Beliefs and Practices

    Science.gov (United States)

    Mansour, Nasser

    2013-05-01

    To gain a better understanding of teachers' beliefs about teaching, as compared with their actual classroom practices, case studies were constructed with four science teachers in different schools in Egypt. The main aims of this article were to provide an answer to the research question, 'To what extent do science teachers' beliefs correspond to their practices?' and to explore the contextual factors that can explain the difference, the consistency or inconsistency, between teachers' beliefs and practices. The study collected data for each teacher using semi-structured interviews, notes taken while observing classes, and teachers' notes, journals, and lesson plans concerned with STS lessons. The data were analysed using the constant comparative method around common themes, which were identified as distinctive features of teachers' beliefs; these same themes were then compared with their practices. Results showed that a few of the in-service science teachers' pedagogical beliefs aligned with constructivist philosophy. Some of the teachers' beliefs were consistent with their practices, especially the traditional beliefs, while some of the teachers' practices conflicted with their beliefs in different contexts.

  8. Inconsistent handers show higher psychopathy than consistent handers.

    Science.gov (United States)

    Shobe, Elizabeth; Desimone, Kailey

    2016-01-01

    Three hundred and forty-two university students completed the Short Dark Triad (SD3) and the Edinburgh Handedness Inventory (EHI). Inconsistent handers showed higher psychopathy scores than consistent handers, and no handedness differences were observed for narcissism or Machiavellianism. Participants were further subdivided by quartile into low, moderately low, moderately high, and high psychopathy groups (non-clinical). Absolute EHI scores were equally distributed among low and moderate groups, but were significantly lower for the high psychopathy group. These findings suggest that inconsistent handedness is only associated with the upper quartile of psychopathy scores. Also, males showed significantly higher psychopathy scores than females, and the ratio of male to female inconsistent handers decreased as psychopathy score increased. No gender × handedness interaction indicated that both female and male inconsistent handers have higher psychopathy scores than consistent handers. Although significant, the effects were small and 99.6% of participants were not in the range of a potential clinical diagnosis. The reader, therefore, is strongly cautioned against equating inconsistent handedness with psychopathy.

  9. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera

    2014-12-31

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.

  10. Consistent linguistic fuzzy preference relations method with ranking fuzzy numbers

    Science.gov (United States)

    Ridzuan, Siti Amnah Mohd; Mohamad, Daud; Kamis, Nor Hanimah

    2014-12-01

    Multi-Criteria Decision Making (MCDM) methods have been developed to help decision makers in selecting the best criteria or alternatives from the options given. One of the well-known methods in MCDM is the Consistent Fuzzy Preference Relation (CFPR) method, which essentially utilizes a pairwise comparison approach. This method was later improved to cater for subjectivity in the data by using fuzzy sets, and is known as the Consistent Linguistic Fuzzy Preference Relations (CLFPR) method. The CLFPR method uses the additive transitivity property in the evaluation of pairwise comparison matrices. However, the calculation involved is lengthy and cumbersome. To overcome this problem, a method of defuzzification was introduced by researchers. Nevertheless, the defuzzification process has a major setback, in that some information may be lost due to the simplification process. In this paper, we propose a method of CLFPR that preserves the fuzzy-number form throughout the process. In obtaining the desired ordering result, a method of ranking fuzzy numbers is utilized in the procedure. The improved CLFPR procedure is applied to a case study to verify its effectiveness. This method is useful for solving decision making problems and can be applied to many areas of application.
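
    As a sketch of the additive-transitivity idea behind CFPR (shown here with crisp values for brevity, whereas the CLFPR method of this record keeps fuzzy numbers throughout), a full preference matrix can be completed from the comparisons of the first alternative against the others via p_ik = p_ij + p_jk − 0.5:

        import numpy as np

        def complete_preferences(first_row):
            # first_row: preferences p_0k of alternative 0 over each alternative k,
            # with p_00 = 0.5. Completed values may fall outside [0, 1], in which
            # case a linear rescaling is usually applied.
            p = np.asarray(first_row, dtype=float)
            n = len(p)
            P = np.empty((n, n))
            P[0, :] = p
            P[:, 0] = 1.0 - p                           # reciprocity: p_ij + p_ji = 1
            for i in range(1, n):
                for k in range(1, n):
                    P[i, k] = P[i, 0] + P[0, k] - 0.5   # additive transitivity
            return P

        # Hypothetical usage:
        # P = complete_preferences([0.5, 0.7, 0.4, 0.8])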

  11. A formulation of consistent particle hydrodynamics in strong form

    Science.gov (United States)

    Yamamoto, Satoko; Makino, Junichiro

    2017-04-01

    In fluid dynamical simulations in astrophysics, large deformations are common and surface tracking is sometimes necessary. The smoothed particle hydrodynamics (SPH) method has been used in many such simulations. Recently, however, it has been shown that SPH cannot handle contact discontinuities or free surfaces accurately. There are several reasons for this problem. The first one is that SPH requires that the density is continuous and differentiable. The second one is that SPH does not have consistency, and thus the accuracy is of the zeroth-order in space. In addition, we cannot express accurate boundary conditions with SPH. In this paper, we propose a novel, high-order scheme for particle-based hydrodynamics of compressible fluid. Our method is based on a kernel-weighted high-order fitting polynomial for intensive variables. With this approach, we can construct a scheme which solves all of the three problems described above. For shock capturing, we use a tensor form of von Neumann-Richtmyer artificial viscosity. We have applied our method to many test problems and obtained excellent results. Our method is not conservative, since particles do not have mass or energy, but only their densities. However, because of the Lagrangian nature of our scheme, the violation of the conservation laws turned out to be small. We name this method Consistent Particle Hydrodynamics in Strong Form (CPHSF).

  12. Exposing image forgery by detecting consistency of shadow.

    Science.gov (United States)

    Ke, Yongzhen; Qin, Fan; Min, Weidong; Zhang, Guiling

    2014-01-01

    We propose two tampered-image detection methods based on the consistency of shadows. The first method is based on texture consistency of a shadow, for the first kind of splicing image, in which the shadow as well as the main body is copied and pasted from another image. The suspicious region, including shadow and nonshadow parts, is first selected. Then texture features of the shadow region and the nonshadow region are extracted. Last, a correlation function is used to measure the similarity of the two texture features. By comparing the similarity, we can judge whether the image is tampered. Because this fails for the second kind of splicing image, in which the main body, its shadow, and the surrounding regions are copied and pasted from another image, another method based on the strength of the light source of shadows is proposed. The two suspicious shadow regions are first selected. Then an efficient method is used to estimate the strength of the light source of each shadow. Last, the similarity of the strength of the light source of the two shadows is measured by a correlation function. By combining the two methods, we can detect forged images with shadows. Experimental results demonstrate that the proposed methods are effective despite using a simplified model compared with the existing methods.
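
    A minimal sketch of the comparison step described above, assuming texture features (e.g. grey-level co-occurrence statistics) have already been extracted from the shadow and non-shadow regions; the normalized correlation used here is one common choice for the "correlation function", not necessarily the paper's exact formula.

        import numpy as np

        def texture_similarity(f_shadow, f_nonshadow):
            # Normalized correlation between two texture feature vectors;
            # a low value flags a possible splicing.
            a = np.asarray(f_shadow, dtype=float) - np.mean(f_shadow)
            b = np.asarray(f_nonshadow, dtype=float) - np.mean(f_nonshadow)
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))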

  13. Exposing Image Forgery by Detecting Consistency of Shadow

    Directory of Open Access Journals (Sweden)

    Yongzhen Ke

    2014-01-01

    Full Text Available We propose two tampered-image detection methods based on the consistency of shadows. The first method is based on texture consistency of a shadow, for the first kind of splicing image, in which the shadow as well as the main body is copied and pasted from another image. The suspicious region, including shadow and nonshadow parts, is first selected. Then texture features of the shadow region and the nonshadow region are extracted. Last, a correlation function is used to measure the similarity of the two texture features. By comparing the similarity, we can judge whether the image is tampered. Because this fails for the second kind of splicing image, in which the main body, its shadow, and the surrounding regions are copied and pasted from another image, another method based on the strength of the light source of shadows is proposed. The two suspicious shadow regions are first selected. Then an efficient method is used to estimate the strength of the light source of each shadow. Last, the similarity of the strength of the light source of the two shadows is measured by a correlation function. By combining the two methods, we can detect forged images with shadows. Experimental results demonstrate that the proposed methods are effective despite using a simplified model compared with the existing methods.

  14. Consistency of commercial devices for measuring elevation gain.

    Science.gov (United States)

    Menaspà, Paolo; Impellizzeri, Franco M; Haakonssen, Eric C; Martin, David T; Abbiss, Chris R

    2014-09-01

    To determine the consistency of commercially available devices used for measuring elevation gain in outdoor activities and sports. Two separate observational validation studies were conducted. Garmin (Forerunner 310XT, Edge 500, Edge 750, and Edge 800; with and without elevation correction) and SRM (Power Control 7) devices were used to measure total elevation gain (TEG) over a 15.7-km mountain climb performed on 6 separate occasions (6 devices; study 1) and during a 138-km cycling event (164 devices; study 2). TEG was significantly different between the Garmin and SRM devices (P < .05), and the variability in TEG between devices of the same brand was lower for the SRM than for the Garmin devices (study 1: 0.2% and 1.5%, respectively). The use of the Garmin elevation-correction option resulted in a 5-10% increase in the TEG. While measurements of TEG were relatively consistent within each brand, the measurements differed between the SRM and Garmin devices by as much as 3%. Caution should be taken when comparing elevation-gain data recorded with different settings or with devices of different brands.
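
    To illustrate why device settings matter, the short Python sketch below computes total elevation gain from a series of altitude samples with an optional jitter threshold; the numbers are made up and the accumulation rule is generic, not any vendor's algorithm.

      def total_elevation_gain(altitudes, threshold=0.0):
          # Sum all positive altitude increments; a threshold suppresses sensor jitter.
          gain, reference = 0.0, altitudes[0]
          for a in altitudes[1:]:
              if a - reference > threshold:
                  gain += a - reference
                  reference = a
              elif a < reference:
                  reference = a
          return gain

      # The same profile gives a different TEG depending on the jitter threshold.
      profile = [100, 101, 100.5, 103, 102, 107, 106, 110]
      print(total_elevation_gain(profile, threshold=0.0))   # counts every small rise
      print(total_elevation_gain(profile, threshold=1.0))   # ignores sub-metre jitter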

  15. Self-consistent viscous heating of rapidly compressed turbulence

    Science.gov (United States)

    Campos, Alejandro; Morgan, Brandon; Olson, Britton; Greenough, Jeffrey

    2016-11-01

    Given turbulence subjected to infinitely rapid deformations, linear terms representing interactions between the mean flow and the turbulence dictate the flow evolution, whereas non-linear terms corresponding to turbulence-turbulence interactions are safely ignored. For rapidly deformed flows where the turbulence Reynolds number is not sufficiently large, viscous effects cannot be neglected and tend to play a prominent role, as shown in Davidovits & Fisch (2016). For such a case, the rapid increase of viscosity in a plasma, as compared to the weaker scaling of viscosity in a fluid, leads to the sudden viscous dissipation of turbulent kinetic energy. As described in Davidovits & Fisch, increases in temperature caused by the direct compression of the plasma drive sufficiently large values of viscosity. We report on numerical simulations of turbulence where the increase in temperature is the result of both the direct compression (an inviscid mechanism) and the self-consistent viscous transfer of energy from the turbulent scales towards the thermal energy. A comparison of implicit large-eddy simulations against well-resolved direct numerical simulations is included to assess the effect of the numerical and subgrid-scale dissipation on the self-consistent viscous energy transfer. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  16. Consistency-based respiratory motion estimation in rotational angiography.

    Science.gov (United States)

    Unberath, Mathias; Aichert, André; Achenbach, Stephan; Maier, Andreas

    2017-09-01

    Rotational coronary angiography enables 3D reconstruction but suffers from intra-scan cardiac and respiratory motion. While gating handles cardiac motion, respiratory motion requires compensation. State-of-the-art algorithms rely on 3D-2D registration that depends on initial reconstructions of sufficient quality. We propose a compensation method that is applied directly in projection domain. It overcomes the need for reconstruction and thus complements the state-of-the-art. Virtual single-frame background subtraction based on vessel segmentation and spectral deconvolution yields non-truncated images of the contrasted lumen. This allows motion compensation based on data consistency conditions. We compensate craniocaudal shifts by optimizing epipolar consistency to (a) devise an image-based surrogate for cardiac motion and (b) compensate for respiratory motion. We validate our approach in two numerical phantom studies and three clinical cases. Correlation of the image-based surrogate for cardiac motion with the ECG-based ground truth was excellent yielding a Pearson correlation of 0.93 ± 0.04. Considering motion compensation, the target error measure decreased by 98% and 69%, respectively, for the phantom experiments while for the clinical cases the same figure of merit improved by 46 ± 21%. The proposed method is entirely image-based and accurately estimates craniocaudal shifts due to respiration and cardiac contraction. Future work will investigate experimental trajectories and possibilities for simplification of the single-frame subtraction pipeline. © 2016 American Association of Physicists in Medicine.

  17. Conservation in two-particle self-consistent extensions of dynamical mean-field theory

    Science.gov (United States)

    Krien, Friedrich; van Loon, Erik G. C. P.; Hafermann, Hartmut; Otsuki, Junya; Katsnelson, Mikhail I.; Lichtenstein, Alexander I.

    2017-08-01

    Extensions of dynamical mean-field theory (DMFT) make use of quantum impurity models as nonperturbative and exactly solvable reference systems which are essential to treat the strong electronic correlations. Through the introduction of retarded interactions on the impurity, these approximations can be made two-particle self-consistent. This is of interest for the Hubbard model because it makes it possible to suppress the antiferromagnetic phase transition in two dimensions, in accordance with the Mermin-Wagner theorem, and to include the effects of bosonic fluctuations. For a physically sound description of the latter, the approximation should be conserving. In this paper, we show that the mutual requirements of two-particle self-consistency and conservation lead to fundamental problems. For an approximation that is two-particle self-consistent in the charge and longitudinal spin channels, the double occupancy of the lattice and the impurity is no longer consistent when computed from single-particle properties. For the case of self-consistency in the charge and longitudinal as well as transversal spin channels, these requirements are even mutually exclusive so that no conserving approximation can exist. We illustrate these findings for a two-particle self-consistent and conserving DMFT approximation.

  18. Self-consistent radiative corrections to false vacuum decay

    Science.gov (United States)

    Garbrecht, B.; Millington, P.

    2017-07-01

    With the Higgs mass now measured at the sub-percent level, the potential metastability of the electroweak vacuum of the Standard Model (SM) motivates renewed study of false vacuum decay in quantum field theory. In this note, we describe an approach to calculating quantum corrections to the decay rate of false vacua that is able to account fully and self-consistently for the underlying inhomogeneity of the solitonic tunneling configuration. We show that this method can be applied both to theories in which the instability arises already at the level of the classical potential and those in which the instability arises entirely through radiative effects, as is the case for the SM Higgs vacuum. We analyse two simple models in the thin-wall regime, and we show that the modifications of the one-loop corrections from accounting fully for the inhomogeneity can compete at the same level as the two-loop homogeneous corrections.

  19. Formal verification of an oral messages algorithm for interactive consistency

    Science.gov (United States)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
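
    For readers unfamiliar with the underlying algorithm, the Python sketch below simulates the classic Lamport-Shostak-Pease Oral Messages algorithm OM(m) with binary values. It is a toy model of the algorithm being verified, not the formal specification discussed in the record, and traitor behaviour is simply randomized.

      import random

      def send(sender, value, traitors):
          # A traitorous sender may report an arbitrary value to each recipient.
          return random.choice([0, 1]) if sender in traitors else value

      def om(commander, lieutenants, value, m, traitors):
          # Returns the value each lieutenant settles on for this commander.
          received = {l: send(commander, value, traitors) for l in lieutenants}
          if m == 0:
              return received
          # Each lieutenant relays what it received via OM(m-1) ...
          relayed = {l: {} for l in lieutenants}
          for i in lieutenants:
              others = [l for l in lieutenants if l != i]
              sub = om(i, others, received[i], m - 1, traitors)
              for j in others:
                  relayed[j][i] = sub[j]
          # ... and takes the majority of its own value and the relayed ones.
          decided = {}
          for i in lieutenants:
              votes = [received[i]] + [relayed[i][j] for j in lieutenants if j != i]
              decided[i] = int(sum(votes) > len(votes) / 2)    # ties default to 0
          return decided

      # n = 4, m = 1, one traitorous lieutenant: the loyal lieutenants 1 and 2
      # still agree on the loyal commander's value.
      random.seed(0)
      print(om(commander=0, lieutenants=[1, 2, 3], value=1, m=1, traitors={3}))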

  20. Path lengths in tree-child time consistent hybridization networks

    CERN Document Server

    Cardona, Gabriel; Rossello, Francesc; Valiente, Gabriel

    2008-01-01

    Hybridization networks are representations of evolutionary histories that allow for the inclusion of reticulate events like recombinations, hybridizations, or lateral gene transfers. The recent growth in the number of hybridization network reconstruction algorithms has led to an increasing interest in the definition of metrics for their comparison that can be used to assess the accuracy or robustness of these methods. In this paper we establish some basic results that make it possible to generalize to tree-child time consistent (TCTC) hybridization networks some of the oldest known metrics for phylogenetic trees: those based on the comparison of the vectors of path lengths between leaves. More specifically, we associate to each hybridization network a suitably defined vector of `splitted' path lengths between its leaves, and we prove that if two TCTC hybridization networks have the same such vectors, then they must be isomorphic. Thus, comparing these vectors by means of a metric for real-valued vecto...
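
    The plain-tree version of the path-length vector idea can be sketched in a few lines of Python (networkx assumed); the "splitted" path lengths that extend this comparison to TCTC hybridization networks are defined in the paper and not reproduced here.

      import itertools
      import networkx as nx

      def leaf_path_length_vector(tree, leaves):
          # Pairwise path lengths between leaves, in a fixed leaf order.
          return [nx.shortest_path_length(tree, u, v)
                  for u, v in itertools.combinations(sorted(leaves), 2)]

      def path_length_distance(tree1, tree2, leaves):
          # Euclidean distance between the two path-length vectors.
          v1 = leaf_path_length_vector(tree1, leaves)
          v2 = leaf_path_length_vector(tree2, leaves)
          return sum((a - b) ** 2 for a, b in zip(v1, v2)) ** 0.5

      # Two small trees on the same leaf set {a, b, c, d}.
      t1 = nx.Graph([("r", "x"), ("r", "y"), ("x", "a"), ("x", "b"), ("y", "c"), ("y", "d")])
      t2 = nx.Graph([("r", "x"), ("r", "y"), ("x", "a"), ("x", "c"), ("y", "b"), ("y", "d")])
      print(path_length_distance(t1, t2, ["a", "b", "c", "d"]))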

  1. Kinematic Analysis of Continuum Robot Consisted of Driven Flexible Rods

    Directory of Open Access Journals (Sweden)

    Yingzhong Tian

    2016-01-01

    Full Text Available This paper presents the kinematic analysis of a continuum bionic robot with three flexible actuation rods. Since the motion of the end-effector is actuated by the deformation of the rods, the robot structure has high elasticity and good compliance, and the kinematic analysis of the robot requires special treatment. We propose a kinematic model based on constant-curvature geometry. The analysis consists of two independent mappings: a general mapping common to the kinematics of all such robots and a specific mapping for this kind of robot. Both mappings are developed for a single section and for multiple sections. Through this paper we aim to provide a guide for the kinematic analysis of similar manipulators.
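
    A sketch of the widely used constant-curvature single-section mapping is shown below in Python (NumPy assumed). The arc parameters (curvature, bending-plane angle, arc length) and the composition of sections follow the standard parameterization; the robot-specific mapping from the three rod lengths to those parameters is the paper's contribution and is not reproduced.

      import numpy as np

      def _rz(a):
          return np.array([[np.cos(a), -np.sin(a), 0.0],
                           [np.sin(a),  np.cos(a), 0.0],
                           [0.0, 0.0, 1.0]])

      def _ry(a):
          return np.array([[np.cos(a), 0.0, np.sin(a)],
                           [0.0, 1.0, 0.0],
                           [-np.sin(a), 0.0, np.cos(a)]])

      def constant_curvature_transform(kappa, phi, s):
          # Homogeneous transform of one section with curvature kappa, bending
          # plane angle phi and arc length s (standard arc parameterization).
          if abs(kappa) < 1e-12:                   # straight section
              R, p = np.eye(3), np.array([0.0, 0.0, s])
          else:
              theta = kappa * s
              p = np.array([np.cos(phi) * (1.0 - np.cos(theta)) / kappa,
                            np.sin(phi) * (1.0 - np.cos(theta)) / kappa,
                            np.sin(theta) / kappa])
              R = _rz(phi) @ _ry(theta) @ _rz(-phi)
          T = np.eye(4)
          T[:3, :3], T[:3, 3] = R, p
          return T

      # Multi-section kinematics: compose the per-section transforms.
      T = constant_curvature_transform(1.0, 0.0, 0.5) @ constant_curvature_transform(2.0, np.pi / 2, 0.3)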

  2. The Physical Squeezed Limit: Consistency Relations at Order q^2

    CERN Document Server

    Creminelli, Paolo; Senatore, Leonardo; Simonović, Marko; Trevisan, Gabriele

    2013-01-01

    In single-field models of inflation the effect of a long mode with momentum q reduces to a diffeomorphism at zeroth and first order in q. This gives the well-known consistency relations for the n-point functions. At order q^2 the long mode has a physical effect on the short ones, since it induces curvature, and we expect that this effect is the same as being in a curved FRW universe. In this paper we verify this intuition in various examples of the three-point function, whose behaviour at order q^2 can be written in terms of the power spectrum in a curved universe. This gives a simple alternative understanding of the level of non-Gaussianity in single-field models. Non-Gaussianity is always parametrically enhanced when modes freeze at a physical scale k_{ph, f} shorter than H: f_{NL} \sim (k_{ph, f}/H)^2.

  3. Changes in forest productivity across Alaska consistent with biome shift.

    Science.gov (United States)

    Beck, Pieter S A; Juday, Glenn P; Alix, Claire; Barber, Valerie A; Winslow, Stephen E; Sousa, Emily E; Heiser, Patricia; Herriges, James D; Goetz, Scott J

    2011-04-01

    Global vegetation models predict that boreal forests are particularly sensitive to a biome shift during the 21st century. This shift would manifest itself first at the biome's margins, with evergreen forest expanding into current tundra while being replaced by grasslands or temperate forest at the biome's southern edge. We evaluated changes in forest productivity since 1982 across boreal Alaska by linking satellite estimates of primary productivity and a large tree-ring data set. Trends in both records show consistent growth increases at the boreal-tundra ecotones that contrast with drought-induced productivity declines throughout interior Alaska. These patterns support the hypothesized effects of an initiating biome shift. Ultimately, tree dispersal rates, habitat availability and the rate of future climate change, and how it changes disturbance regimes, are expected to determine where the boreal biome will undergo a gradual geographic range shift, and where a more rapid decline. © 2011 Blackwell Publishing Ltd/CNRS.

  4. Adaptive image denoising using scale and space consistency.

    Science.gov (United States)

    Scharcanski, Jacob; Jung, Cláudio R; Clarke, Robin T

    2002-01-01

    This paper proposes a new method for image denoising with edge preservation, based on image multiresolution decomposition by a redundant wavelet transform. In our approach, edges are implicitly located and preserved in the wavelet domain, whilst image noise is filtered out. At each resolution level, the image edges are estimated by gradient magnitudes (obtained from the wavelet coefficients), which are modeled probabilistically, and a shrinkage function is assembled based on the model obtained. Joint use of space and scale consistency is applied for better preservation of edges. The shrinkage functions are combined to preserve edges that appear simultaneously at several resolutions, and geometric constraints are applied to preserve edges that are not isolated. The proposed technique produces a filtered version of the original image, where homogeneous regions appear separated by well-defined edges. Possible applications include image presegmentation, and image denoising.
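
    For orientation, the Python sketch below (assuming NumPy and PyWavelets) shows a generic decimated-wavelet soft-shrinkage denoiser with a universal threshold. The paper's method differs in using a redundant transform, a probabilistic model of gradient magnitudes, and scale/space consistency to preserve edges; none of that is reproduced here.

      import numpy as np
      import pywt

      def wavelet_soft_denoise(image, wavelet="db2", level=3):
          # Decompose, soft-threshold the detail coefficients, reconstruct.
          coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
          # Estimate the noise level from the finest diagonal detail band.
          sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
          thresh = sigma * np.sqrt(2.0 * np.log(image.size))
          shrunk = [coeffs[0]] + [
              tuple(pywt.threshold(d, thresh, mode="soft") for d in detail)
              for detail in coeffs[1:]
          ]
          return pywt.waverec2(shrunk, wavelet)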

  5. A consistency approach for evaluation of biosimilar products.

    Science.gov (United States)

    Tsou, Hsiao-Hui; Chang, Wan-Jung; Hwang, Wong-Shian; Lai, Yi-Hsuan

    2013-01-01

    Recently, biosimilars have attracted much attention from sponsors and regulatory authorities, while patents on early biological products will soon expire in the next few years. The European Medicines Agency (EMEA) of the European Union (EU) published a guideline on similar biological medicinal products for approval of these products in 2005. Based on the foundational principles of the EMEA guideline, biosimilars are expected to be similar, not identical, to the innovator biologics they seek to copy. In this article, we develop a consistency approach for assessment of similarity between a biosimilar product and the innovator biologic. A method for sample size determination for conducting a clinical trial to assess the biosimilar product is also proposed. A numerical example is given to illustrate applications of the proposed approach in different scenarios.

  6. A self-consistent spin-diffusion model for micromagnetics

    KAUST Repository

    Abert, Claas

    2016-12-17

    We propose a three-dimensional micromagnetic model that dynamically solves the Landau-Lifshitz-Gilbert equation coupled to the full spin-diffusion equation. In contrast to previous methods, we solve for the magnetization dynamics and the electric potential in a self-consistent fashion. This treatment allows for an accurate description of magnetization dependent resistance changes. Moreover, the presented algorithm describes both spin accumulation due to smooth magnetization transitions and due to material interfaces as in multilayer structures. The model and its finite-element implementation are validated by current driven motion of a magnetic vortex structure. In a second experiment, the resistivity of a magnetic multilayer structure in dependence of the tilting angle of the magnetization in the different layers is investigated. Both examples show good agreement with reference simulations and experiments respectively.

  7. Business architecture management architecting the business for consistency and alignment

    CERN Document Server

    Simon, Daniel

    2015-01-01

    This book presents a comprehensive overview of enterprise architecture management with a specific focus on the business aspects. While recent approaches to enterprise architecture management have dealt mainly with aspects of information technology, this book covers all areas of business architecture from business motivation and models to business execution. The book provides examples of how architectural thinking can be applied in these areas, thus combining different perspectives into a consistent whole. In-depth experiences from end-user organizations help readers to understand the abstract concepts of business architecture management and to form blueprints for their own professional approach. Business architecture professionals, researchers, and others working in the field of strategic business management will benefit from this comprehensive volume and its hands-on examples of successful business architecture management practices.

  8. Thermodynamically consistent model of brittle oil shales under overpressure

    Science.gov (United States)

    Izvekov, Oleg

    2016-04-01

    The concept of dual porosity is a common way to simulate oil shale production. In the frame of this concept, the porous fractured medium is considered as a superposition of two permeable continua with mass exchange. As a rule, the concept does not take into account such well-known phenomena as slip along natural fractures, overpressure in the low-permeability matrix, and so on. Overpressure can lead to the development of secondary fractures in the low-permeability matrix during drilling and during pressure reduction in production. In this work a new thermodynamically consistent model which generalizes the dual-porosity model is proposed. The particularities of the model are as follows. The set of natural fractures is considered as a permeable continuum. Damage mechanics is applied to the simulation of secondary-fracture development in the low-permeability matrix. Slip along natural fractures is simulated in the frame of plasticity theory with the Drucker-Prager criterion.

  9. Cross-situational consistency in recognition memory response bias.

    Science.gov (United States)

    Kantner, Justin; Lindsay, D Stephen

    2014-10-01

    Individuals taking an old-new recognition memory test differ widely in their bias to respond "old," ranging from strongly conservative to strongly liberal, even without any manipulation intended to affect bias. Kantner and Lindsay (2012) found stability of bias across study-test cycles, suggesting that bias is a cognitive trait. That consistency, however, could have arisen because participants perceived the two tests as being part of the same experiment in the same context. In the present study, we tested for stability across two recognition study-test procedures embedded in markedly different experiments, held weeks apart, that participants did not know were connected. Bias showed substantial cross-situational stability. Moreover, bias weakly predicted identifications on an eyewitness memory task and accuracy on a go-no-go task. Although we found little in the way of relationships between bias and five personality measures, these findings suggest that response bias is a stable and broadly influential characteristic of recognizers.

  10. Self-Consistent Green Function Method in Nuclear Matter

    Directory of Open Access Journals (Sweden)

    Khaled S. A. Hassaneen

    2013-01-01

    Full Text Available Symmetric nuclear matter is studied within the Brueckner-Hartree-Fock (BHF) approach and is extended to the self-consistent Green’s function (SCGF) approach. Both approximations are based on a realistic nucleon-nucleon interaction; that is, the CD-Bonn potential is chosen. The single-particle energy and the equation of state (EOS) are studied. The Fermi energy at the saturation point fulfills the Hugenholtz-Van Hove theorem. In comparison to the BHF approach, the binding energy is reduced and the EOS is stiffer. Both the SCGF and BHF approaches do not reproduce the correct saturation point. A simple contact interaction should be added to the SCGF and BHF approaches to reproduce the empirical saturation point.

  11. Will the consistent organic food consumer step forward?

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn; Fenger, Morten H. J.; Thøgersen, John

    2017-01-01

    The organic food market has reached a significant value in developed countries, but market shares vary substantively between product categories. This article investigates general patterns in the sequence of adoption of organic products based on a major Danish retailer’s panel scanner data. All...... and it captures the movements between states or segments. A pattern emerges which is consistent with the theory of behavioral spillover and inconsistent with the theory of moral licensing, including a tendency to buy organic products in an increasing number of product categories over time. The order in which...... organic products are adopted is inversely related to the behavioral costs of adopting them. The employed approach provides a firm basis for personalized communication aiming to increase cross-selling of organic products, increase the sale of less popular organic products, and to accelerate movements from...

  12. Making the Sustainable Development Goals Consistent with Sustainability

    Directory of Open Access Journals (Sweden)

    Mathis Wackernagel

    2017-07-01

    Full Text Available The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG Index to assess countries’ average performance on the SDGs. Ranking high on the SDG Index strongly correlates with high per-person demand on nature (or “Footprints”), and low ranking with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor, because the lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.

  13. Migraine patients consistently show abnormal vestibular bedside tests

    Directory of Open Access Journals (Sweden)

    Eliana Teixeira Maranhão

    2015-01-01

    Full Text Available Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and co-morbidity around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. Objective: To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses as measured by 14 bedside tests are abnormal in migraineurs without vertigo, as compared with controls. Method: Cross-sectional study including sixty individuals – thirty migraineurs, 25 women, 19-60 y-o; and 30 gender/age healthy paired controls. Results: Migraineurs showed a tendency to perform worse in almost all tests, albeit only the Romberg tandem test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Conclusion: Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.

  14. Multirobot FastSLAM Algorithm Based on Landmark Consistency Correction

    Directory of Open Access Journals (Sweden)

    Shi-Ming Chen

    2014-01-01

    Full Text Available Considering the influence of uncertain map information on the multirobot SLAM problem, a multirobot FastSLAM algorithm based on landmark consistency correction is proposed. Firstly, an electromagnetism-like mechanism is introduced into the resampling procedure of single-robot FastSLAM, where each sampling particle is regarded as a charged electron and the attraction-repulsion mechanism of an electromagnetic field is used to simulate the interactive forces between particles and thereby improve the distribution of particles. Secondly, when multiple robots observe the same landmarks, every robot is regarded as one node and a Kalman-Consensus Filter is proposed to update landmark information, which further improves the accuracy of localization and mapping. Finally, the simulation results show that the algorithm is suitable and effective.

  15. Self-Consistent Dynamical Model of the Broad Line Region

    Directory of Open Access Journals (Sweden)

    Bozena Czerny

    2017-06-01

    Full Text Available We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.

  16. A Secure Localization Approach against Wormhole Attacks Using Distance Consistency

    Directory of Open Access Journals (Sweden)

    Honglong Chen

    2010-01-01

    Full Text Available Wormhole attacks can negatively affect the localization in wireless sensor networks. A typical wormhole attack can be launched by two colluding attackers, one of which sniffs packets at one point in the network and tunnels them through a wired or wireless link to another point, and the other relays them within its vicinity. In this paper, we investigate the impact of the wormhole attack on the localization and propose a novel distance-consistency-based secure localization scheme against wormhole attacks, which includes three phases of wormhole attack detection, valid locators identification and self-localization. The theoretical model is further formulated to analyze the proposed secure localization scheme. The simulation results validate the theoretical results and also demonstrate the effectiveness of our proposed scheme.
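
    A crude, residual-based flavour of distance-consistency checking can be sketched as below in Python (NumPy and SciPy assumed). The leave-one-out disagreement score is only an illustration and does not reproduce the paper's three-phase wormhole detection, valid-locator identification, and self-localization scheme.

      import numpy as np
      from scipy.optimize import least_squares

      def locate(anchors, distances, x0):
          # Least-squares position estimate from range measurements.
          resid = lambda p: np.linalg.norm(anchors - p, axis=1) - distances
          return least_squares(resid, x0=x0).x

      def consistency_scores(anchors, distances):
          # Leave-one-out check: how badly does each locator's range disagree
          # with the position implied by the remaining locators?
          anchors = np.asarray(anchors, float)
          distances = np.asarray(distances, float)
          scores = []
          for i in range(len(anchors)):
              keep = np.arange(len(anchors)) != i
              p = locate(anchors[keep], distances[keep], anchors[keep].mean(axis=0))
              scores.append(abs(np.linalg.norm(anchors[i] - p) - distances[i]))
          return scores

      # Four locators; the last range is inflated as if tunneled through a wormhole.
      anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
      true_pos = np.array([3.0, 4.0])
      dists = [float(np.linalg.norm(np.asarray(a) - true_pos)) for a in anchors]
      dists[3] += 8.0
      print(consistency_scores(anchors, dists))   # the last score stands out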

  17. A New Bias Corrected Version of Heteroscedasticity Consistent Covariance Estimator

    Directory of Open Access Journals (Sweden)

    Munir Ahmed

    2016-06-01

    Full Text Available In the presence of heteroscedasticity, different available flavours of the heteroscedasticity consistent covariance estimator (HCCME) are used. However, the available literature shows that these estimators can be considerably biased in small samples. Cribari-Neto et al. (2000) introduce a bias adjustment mechanism and give the modified White estimator that becomes almost bias-free even in small samples. Extending these results, Cribari-Neto and Galvão (2003) present a similar bias adjustment mechanism that can be applied to a wide class of HCCMEs. In the present article, we follow the same mechanism as proposed by Cribari-Neto and Galvão to give a bias-corrected version of the HCCME, but we use an adaptive HCCME rather than the conventional HCCME. A Monte Carlo study is used to evaluate the performance of our proposed estimators.
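
    As background, the sandwich form of the White-type heteroscedasticity-consistent covariance estimator can be written in a few lines of NumPy, as sketched below; only the standard HC0 and HC3 flavours are shown, not the bias-corrected adaptive versions developed in the record.

      import numpy as np

      def hc_covariance(X, y, kind="HC3"):
          # Sandwich (White-type) covariance of the OLS coefficients:
          # (X'X)^-1 X' diag(omega) X (X'X)^-1.
          xtx_inv = np.linalg.inv(X.T @ X)
          beta = xtx_inv @ X.T @ y
          e = y - X @ beta                                   # OLS residuals
          h = np.einsum("ij,jk,ik->i", X, xtx_inv, X)        # leverages h_ii
          if kind == "HC0":
              omega = e ** 2
          elif kind == "HC3":
              omega = (e / (1.0 - h)) ** 2
          else:
              raise ValueError(kind)
          cov = xtx_inv @ (X.T @ (omega[:, None] * X)) @ xtx_inv
          return beta, cov

      # Robust standard errors are the square roots of the diagonal of cov:
      # beta, cov = hc_covariance(X, y); se = np.sqrt(np.diag(cov))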

  18. Individual consistencies in behavior: achievement persistence interactions as personality styles.

    Science.gov (United States)

    Ribes-Iñesta, Emilio; Contreras, Sagrario

    2007-10-01

    Two experiments were carried out to find within-subject consistencies as well as individual differences in a choice situation involving achievement persistence. Four volunteers, two men (21 and 23 years old) and two women (21 and 22 years old) were exposed twice, with a 1-mo. interval, to a filling-patterns task, to evaluate their choices for two options. For one option, task-time and outcomes for every response were constant. For the other option, task-time decreased in correlation with an increase in earnings for every response. Analyses showed reliable profiles for three of the four subjects when percent of responding to each option was compared in independent experiments. Results are discussed in terms of interactive styles.

  19. Metabolome Consistency: Additional Parazoanthines from the Mediterranean Zoanthid Parazoanthus Axinellae

    Directory of Open Access Journals (Sweden)

    Coralie Audoin

    2014-05-01

    Full Text Available Ultra-high pressure liquid chromatography coupled to high resolution mass spectrometry (UHPLC-MS/MS) analysis of the organic extract obtained from the Mediterranean zoanthid Parazoanthus axinellae led to the identification of five new parazoanthines, F-J. The structures were fully determined by comparison of fragmentation patterns with those of previously isolated parazoanthines and by MS/MS spectra simulation of in silico predicted compounds according to the metabolome consistency. The absolute configuration of the new compounds has been assigned using on-line electronic circular dichroism (UHPLC-ECD). We thus demonstrated the potential of highly sensitive hyphenated techniques to characterize the structures of a whole family of natural products within the metabolome of a marine species. Minor compounds can be characterized using these techniques, thus avoiding long isolation processes that may alter the structure of the natural products. These results are also of interest to identify putative bioactive compounds present at low concentration in a complex mixture.

  20. Multiple intelligence: ethical leadership feature consistent financial institutions.

    Directory of Open Access Journals (Sweden)

    Diamela Nava

    2015-03-01

    Full Text Available This study presents a theoretically grounded analysis of multiple intelligences as a consistent feature of ethical leadership in financial institutions. The research was conducted under a qualitative, descriptive approach using document analysis. The analysis suggests that multiple intelligences support the implementation of the capabilities needed to achieve organizational objectives and, from a rational point of view, offer a way to assess the cognitive abilities involved in integrating human talent in organizations. The role of the leader is therefore to guide and support the development of human potential in the group, as a community of interest, in order to achieve the aspirations of the organization, using intelligence as a strategic tool in different ways so as not to limit imagination, judgment, and cooperative action.

  1. Implicit attitude measures: consistency, stability, and convergent validity.

    Science.gov (United States)

    Cunningham, W A; Preacher, K J; Banaji, M R

    2001-03-01

    In recent years, several techniques have been developed to measure implicit social cognition. Despite their increased use, little attention has been devoted to their reliability and validity. This article undertakes a direct assessment of the interitem consistency, stability, and convergent validity of some implicit attitude measures. Attitudes toward blacks and whites were measured on four separate occasions, each 2 weeks apart, using three relatively implicit measures (response-window evaluative priming, the Implicit Association Test, and the response-window Implicit Association Test) and one explicit measure (Modern Racism Scale). After correcting for interitem inconsistency with latent variable analyses, we found that (a) stability indices improved and (b) implicit measures were substantially correlated with each other, forming a single latent factor. The psychometric properties of response-latency implicit measures have greater integrity than recently suggested.

  2. Consistency relations and conservation of ζ in holographic inflation

    Energy Technology Data Exchange (ETDEWEB)

    Garriga, Jaume [Departament de Física Fonamental i Institut de Ciències del Cosmos,Universitat de Barcelona,Martí i Franquès 1, 08028 Barcelona (Spain); Institute of Cosmology, Department of Physics and Astronomy, Tufts University,Medford, MA 02155 (United States); Urakawa, Yuko [Department of Physics and Astrophysics, Nagoya University,Chikusa, Nagoya 464-8602 (Japan)

    2016-10-18

    It is well known that, in single clock inflation, the curvature perturbation ζ is constant in time on superhorizon scales. In the standard bulk description this follows quite simply from the local conservation of the energy momentum tensor in the bulk. On the other hand, in a holographic description, the constancy of the curvature perturbation must be related to the properties of the RG flow in the boundary theory. Here, we show that, in single clock holographic inflation, the time independence of correlators of ζ follows from the absence of the anomalous dimension of the energy momentum tensor in the boundary theory, and from the so-called consistency relations for vertex functions with a soft leg.

  3. Sustaining biological welfare for our future through consistent science

    Directory of Open Access Journals (Sweden)

    Shimomura Yoshihiro

    2013-01-01

    Full Text Available Physiological anthropology presently covers a very broad range of human knowledge and engineering technologies. This study reviews scientific inconsistencies within a variety of areas: sitting posture; negative air ions; oxygen inhalation; alpha brain waves induced by music and ultrasound; 1/f fluctuations; the evaluation of feelings using surface electroencephalography; Kansei; universal design; and anti-stress issues. We found that the inconsistencies within these areas indicate the importance of integrative thinking and the need to maintain the perspective on the biological benefit to humanity. Analytical science divides human physiological functions into discrete details, although individuals comprise a unified collection of whole-body functions. Such disparate considerations contribute to the misunderstanding of physiological functions and the misevaluation of positive and negative values for humankind. Research related to human health will, in future, depend on the concept of maintaining physiological functions based on consistent science and on sustaining human health to maintain biological welfare in future generations.

  4. Fully consistent CFD methods for incompressible flow computations

    DEFF Research Database (Denmark)

    Kolmogorov, Dmitry; Shen, Wen Zhong; Sørensen, Niels N.

    2014-01-01

    -velocity coupling on collocated grids use the so-called momentum interpolation method of Rhie and Chow [1]. As known, the method and some of its widely spread modifications result in solutions, which are dependent of time step at convergence. In this paper the magnitude of the dependence is shown to contribute about...... 0.5% into the total error in a typical turbulent flow computation. Nevertheless if coarse grids are used, the standard interpolation methods result in much higher non-consistent behavior. To overcome the problem, a recently developed interpolation method, which is independent of time step, is used....... It is shown that in comparison to other time step independent method, the method may enhance the convergence rate of the SIMPLEC algorithm up to 25 %. The method is verified using turbulent flow computations around a NACA 64618 airfoil and the roll-up of a shear layer, which may appear in wind turbine wake.

  5. Ball Recovery Consistency as a Performance Indicator in Elite Soccer

    Directory of Open Access Journals (Sweden)

    Mohammad Maleki

    2016-03-01

    3=1.597, p=0.66. The results of the time-zone evaluation indicated that the Netherlands and Brazil teams did not have performance consistency across all field zones (χ2 15=31.29, p=0.008 and χ2 15=37.53, p=0.001, respectively). Most ball recoveries were made in the defensive and middle-defensive zones, in accordance with modern soccer. It was found that for a soccer team to be successful, it requires a spatial distribution of experienced players in the field, which leads to a power balance for redesigning a team to be offensive in all zones.

  6. Self-consistent dynamical model of the Broad Line Region

    Science.gov (United States)

    Czerny, Bozena; Li, Yan-Rong; Sredzinska, Justyna; Hryniewicz, Krzysztof; Panda, Swayam; Wildy, Conor; Karas, Vladimir

    2017-06-01

    We develop a self-consistent description of the Broad Line Region based on the concept of the failed wind powered by the radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back towards the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, just the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.

  7. Consistency of the counting talk test for exercise prescription.

    Science.gov (United States)

    Loose, Brant D; Christiansen, Ann M; Smolczyk, Jill E; Roberts, Kelsey L; Budziszewska, Anna; Hollatz, Crystal G; Norman, Joseph F

    2012-06-01

    The purpose of this study was to assess the consistency of the counting talk test (CTT) method for estimating exercise intensity across various modes of exercise in healthy young adults. Thirty-six individuals completed the study, which required participation in 3 separate sessions within a 2-week time period. During the first session, the individuals completed a maximal effort treadmill test from which each individual's heart rate reserve (HRR) was calculated. During the second and third sessions, the subjects participated in 2 modes of exercise in each session for a total of 4 different modes of exercise. The individuals exercised at 40% HRR, 50% HRR, 60% HRR, 75% HRR, and 85% HRR. The heart rate (HR), CTT, and rating of perceived exertion (RPE) were recorded at each workload. Based on the individual's resting CTT (CTT(rest)), the %CTT for each exercise stage was then calculated. Pearson correlations demonstrated moderate to good correlations between the CTT and HRR methods and the CTT and RPE methods for estimating exercise intensity. This study found that for the individuals with a CTT(rest) <25, exercise as recommended by the American College of Sports Medicine HRR guidelines could be achieved by exercising at a level of 40-50% CTT(rest). For individuals with a CTT(rest) ≥25, exercising at a level of 30-40% CTT(rest) would place them in the moderate to vigorous exercise intensity range. A high degree of reliability was demonstrated using the CTT method across the various modes of aerobic exercise. As such, independent of the exercise mode, the CTT was found to be an easy and consistent method for prescribing moderate to vigorous aerobic exercise intensity.
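
    The two quantities the study works with are simple to compute. The Python lines below show the Karvonen heart-rate-reserve target and the %CTT ratio with made-up numbers; the study's actual protocol values are in the paper.

      def karvonen_target_hr(hr_rest, hr_max, fraction):
          # Target heart rate at a given fraction of heart rate reserve (HRR).
          return hr_rest + fraction * (hr_max - hr_rest)

      def percent_ctt(count_during_exercise, ctt_rest):
          # Counting talk test expressed as a percentage of the resting count.
          return 100.0 * count_during_exercise / ctt_rest

      # Illustrative numbers only: resting HR 60, max HR 190, CTT_rest of 30.
      print(karvonen_target_hr(60, 190, 0.50))   # 125 bpm at 50% HRR
      print(percent_ctt(12, 30))                 # counting to 12 in one breath = 40% CTT_rest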

  8. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define their own ranking scales for the probability of severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and we identified the failure modes above the 90th percentile of RPN values as failure modes needing urgent corrective action; failure modes falling between the 75th and 90th percentile of RPN values were identified as failure modes needing necessary corrective action, respectively. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action respectively, with two being commonly identified. Of the failure modes needing necessary corrective actions, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.

  9. Bayesian detection of causal rare variants under posterior consistency.

    KAUST Repository

    Liang, Faming

    2013-07-26

    Identification of causal rare variants that are associated with complex traits poses a central challenge on genome-wide association studies. However, most current research focuses only on testing the global association whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, have tried to address this problem, it is unclear whether the causal rare variants can be consistently identified by them in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are there any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures the causal rare variants to be consistently identified in the small-n-large-P situation by imposing some appropriate prior distributions on the model and model specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than the existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  10. Example-Based Image Colorization Using Locality Consistent Sparse Representation.

    Science.gov (United States)

    Li, Bo; Zhao, Fuchen; Su, Zhuo; Liang, Xiangguo; Lai, Yu-Kun; Rosin, Paul L

    2017-11-01

    Image colorization aims to produce a natural looking color image from a given gray-scale image, which remains a challenging problem. In this paper, we propose a novel example-based image colorization method exploiting a new locality consistent sparse representation. Given a single reference color image, our method automatically colorizes the target gray-scale image by sparse pursuit. For efficiency and robustness, our method operates at the superpixel level. We extract low-level intensity features, mid-level texture features, and high-level semantic features for each superpixel, which are then concatenated to form its descriptor. The collection of feature vectors for all the superpixels from the reference image composes the dictionary. We formulate colorization of target superpixels as a dictionary-based sparse reconstruction problem. Inspired by the observation that superpixels with similar spatial location and/or feature representation are likely to match spatially close regions from the reference image, we further introduce a locality promoting regularization term into the energy formulation, which substantially improves the matching consistency and subsequent colorization results. Target superpixels are colorized based on the chrominance information from the dominant reference superpixels. Finally, to further improve coherence while preserving sharpness, we develop a new edge-preserving filter for chrominance channels with the guidance from the target gray-scale image. To the best of our knowledge, this is the first work on sparse pursuit image colorization from single reference images. Experimental results demonstrate that our colorization method outperforms the state-of-the-art methods, both visually and quantitatively using a user study.

  11. Coroner consistency - The 10-jurisdiction, 10-year, postcode lottery?

    Science.gov (United States)

    Mclean, Maxwell

    2015-04-01

    The investigation and classification of deaths in England and Wales relies upon the application by medical practitioners of diverse reporting standards set locally by coroners and thereafter upon the effectively unconstrained decision process of those same coroners. The author has conducted extensive comparative analysis of Ministry of Justice data on reports to the coroner and their inquest and verdict returns alongside Office of National Statistics data pertaining to the numbers of registered deaths in equivalent local jurisdictions. Here, he analyses 10 jurisdictions characterised by almost identical inquest return numbers in 2011. Substantial variation was found in reporting rates to the coroner and in the profile of inquest verdicts. The range of deaths reported varied from 34% to 62% of all registered deaths. Likewise only 2 of the 10 jurisdictions shared the same ranking of proportions in which the six common verdicts were reached. Individual jurisdictions tended to be consistent over time in their use of verdicts. In all cases, proportionately more male deaths were reported to the coroner. Coroners generally seemed prima facie to be 'gendered' in their approach to verdicts; that is, they were consistently more likely to favour a particular verdict when dealing with a death, according to the sex of the deceased. The extent to which coroners seemed gendered varied widely. While similar services such as the criminal courts or the Crown Prosecution Service are subject to extensive national guidance in an attempt to constrain idiosyncratic decision making, there seems no reason why this should apply less to the process of death investigation and classification. Further analysis of coroners' local practices and their determinants seems necessary. © International Headache Society 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  12. Study of consistency properties of instillation liniment-gel for therapy of pyoinflammatory diseases of maxillofacial region

    Directory of Open Access Journals (Sweden)

    A. V. Kurinnoy

    2012-12-01

    Full Text Available Using the rotary viscometer «Reotest 2», studies of the consistency properties of an instillation gel-liniment for antimicrobial therapy of pyoinflammatory diseases of the maxillofacial area were conducted. It was determined that the consistency properties of the gel-liniment for antimicrobial therapy of pyoinflammatory diseases of the maxillofacial area lie within the limits of the rheological optimum of consistency for ointments, and that the value of «mechanical stability» (1.33) characterizes the system as exceptionally thixotropic, providing recoverability of the system after loading and allowing the stability of the consistency properties of the gel-liniment during prolonged storage to be forecast.

  13. Autonomous Navigation Based on SEIF with Consistency Constraint for C-Ranger AUV

    Directory of Open Access Journals (Sweden)

    Yue Shen

    2015-01-01

    Full Text Available An autonomous underwater vehicle (AUV) has to solve two essential problems in the underwater environment, namely, localization and mapping. SLAM is one novel solution to estimate locations and maps simultaneously based on motion models and sensor measurements. The sparse extended information filter (SEIF) is an effective algorithm to reduce the storage and computational costs of large-scale maps in the SLAM problem. However, an inconsistency exists in the SEIF because the rank of the observability matrix of the linearized error-state model in the SLAM system is higher than that of the underlying nonlinear SLAM system. By analyzing the consistency of SEIF-based SLAM from the perspective of observability, a SLAM based on SEIF with consistency constraint (SEIF-CC SLAM) is developed to improve the estimator’s consistency. The proposed algorithm uses the first-ever available estimates to calculate the SEIF Jacobians for each of the state variables, called the First Estimates Jacobian (FEJ). Then, the linearized error-state model can keep the same observability as the underlying nonlinear SLAM system. The capability of autonomous navigation with the proposed algorithm is validated through simulation experiments and sea trials for a C-Ranger AUV. Experimental results show that the proposed SEIF-CC SLAM algorithm yields more consistent and accurate estimates compared with the SEIF-based SLAM.
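
    The First Estimates Jacobian idea can be illustrated with a small Python sketch (NumPy assumed) for a planar range-bearing landmark observation: the filter stores the first-ever landmark estimate and always linearizes there, instead of at the latest estimate. The class and function names are illustrative, and the full SEIF-CC update is not reproduced.

      import numpy as np

      def range_bearing_jacobian(robot_xy, landmark_xy):
          # Jacobian of the range-bearing measurement w.r.t. the landmark position.
          dx, dy = landmark_xy - robot_xy
          q = dx * dx + dy * dy
          r = np.sqrt(q)
          return np.array([[dx / r, dy / r],
                           [-dy / q, dx / q]])

      class FejLandmark:
          # Keep the first-ever estimate and always linearize there (FEJ).
          def __init__(self, first_estimate):
              self.first_estimate = np.asarray(first_estimate, float)

          def jacobian(self, robot_xy):
              # A standard EKF/SEIF would use the *latest* landmark estimate here;
              # FEJ deliberately reuses the first estimate so that the linearized
              # model keeps the observability of the underlying nonlinear system.
              return range_bearing_jacobian(np.asarray(robot_xy, float), self.first_estimate)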

  14. Using Word Embeddings to Enforce Document-Level Lexical Consistency in Machine Translation

    Directory of Open Access Journals (Sweden)

    Garcia Eva Martínez

    2017-06-01

    Full Text Available We integrate new mechanisms in a document-level machine translation decoder to improve the lexical consistency of document translations. First, we develop a document-level feature designed to score the lexical consistency of a translation. This feature, which applies to words that have been translated into different forms within the document, uses word embeddings to measure the adequacy of each word translation given its context. Second, we extend the decoder with a new stochastic mechanism that, at translation time, makes it possible to introduce changes in the translation aimed at improving its lexical consistency. We evaluate our system on English–Spanish document translation, and we conduct automatic and manual assessments of its quality. The automatic evaluation metrics, applied mainly at sentence level, do not reflect significant variations. On the contrary, the manual evaluation shows that the system dealing with lexical consistency is preferred over both a standard sentence-level and a standard document-level phrase-based MT system.
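
    A toy version of the embedding-based consistency score can be sketched as below in Python (NumPy assumed): for each occurrence of a repeated source word, the embedding of the chosen target word is compared with the mean embedding of its target-side context. The data structures are illustrative, and the decoder integration described in the record is not shown.

      import numpy as np

      def cosine(a, b):
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      def lexical_consistency_scores(translations, context_words, embeddings):
          # translations: sentence id -> chosen target word for the repeated source word.
          # context_words: sentence id -> surrounding target words.
          # embeddings: toy word -> vector lookup (pre-trained embeddings in the paper).
          scores = {}
          for sid, word in translations.items():
              ctx = [embeddings[w] for w in context_words[sid] if w in embeddings]
              if word not in embeddings or not ctx:
                  scores[sid] = 0.0
                  continue
              scores[sid] = cosine(embeddings[word], np.mean(ctx, axis=0))
          return scores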

  15. LA CONSISTENCIA EN EL LENGUAJE DE LAS NORMAS: HACIA UNA PROPUESTA CONSISTENTE Consistency in regulatory language: Towards a consistent proposal

    Directory of Open Access Journals (Sweden)

    Fernando Centenera Sánchez-Seco

    2012-01-01

    Full Text Available The recommendation that regulatory language should be consistent is common in guides or manuals for drafting as well as in several studies. Yet a review from the angle of practice demonstrates the desirability of a more functional recommendation. This study offers some ideas in this regard. It analyses which might be the optimal definition for consistency and its types, and it asks whether the recommendation should be applied in all cases. That discussion then serves as the basis for a new formulation of the recommendation.

  16. Consistent seasonal snow cover depth and duration variability over ...

    Indian Academy of Sciences (India)

    of fresh water for many perennial river systems. (Dettinger and Cayan 1995; ... management, planning and decision making for various socio-economic ... Methodology. Seasonal snow cover build-up for a winter season is broadly described by snow cover depth and its duration. Variability in winter weather patterns.

  17. Human T-cell memory consists mainly of unexpanded clones

    NARCIS (Netherlands)

    Klarenbeek, P.L.; Tak, P.P.; van Schaik, B.D.C.; Zwinderman, A.H.; Jakobs, M.E.; Zhang, Z.; van Kampen, A.H.C.; van Lier, R.A.W.; Baas, F.; de Vries, N.

    2010-01-01

    The immune system is able to respond to millions of antigens using adaptive receptors, including the αβ-T-cell receptor (TCR). Upon antigen encounter a T-cell may proliferate to produce a clone of TCR-identical cells, which develop a memory phenotype. Previous studies suggested that most memory

  18. Self-consistent dynamical and thermodynamical evolutions of protoplanetary disks.

    Science.gov (United States)

    Baillie, K.; Charnoz, S.; Taillifet, E.; Piau, L.

    2012-09-01

    Astronomical observations reveal the diversity of protoplanetary disk evolutions. In order to understand the global evolution of these disks from their birth, during the collapse of the molecular cloud, to their evaporation because of the stellar radiation, many processes with different timescales must be coupled: stellar evolution, thermodynamical evolution, photoevaporation, cloud collapse, viscous spreading... Simulating all these processes simultaneously is beyond the capacity of modern computers. However, by modeling the results of large-scale simulations and coupling them with models of viscous evolution, we have designed a one-dimensional full model of disk evolution. In order to generate the most realistic protoplanetary disk, we minimize the number of input parameters and try to calculate most of them from self-consistent processes, as early as possible in the history of the disk, starting with the collapse of the molecular cloud that feeds the disk in gas. We start from the Hueso and Guillot, 2005 [2] model of disk evolution and couple it with the radiative transfer description of Calvet et al., 1991 [1], allowing us to handle a non-isothermal disk whose midplane temperature is defined by an irradiation term from the central star and a viscous heating term depending on the optical depth of the disk. Our new model of the disk photosphere profile allows us to estimate self-consistent photosphere heights and midplane temperatures at the same time. We then follow the disk evolution using an upgrade of the viscous spreading equation from Lynden-Bell and Pringle, 1981 [3]. In particular, the molecular cloud collapse adds a time-varying term to the temporal variation of the surface mass density of the disk, in the same manner that photoevaporation introduces a density loss term. The central star itself is modeled using the recent stellar evolution code described in Piau et al., 2011 [4]. Using the same temperature model in the vertical direction, we estimate 2D thermal maps of

  19. A consistent model for tsunami actions on buildings

    Science.gov (United States)

    Foster, A.; Rossetto, T.; Eames, I.; Chandler, I.; Allsop, W.

    2016-12-01

    The Japan (2011) and Indian Ocean (2004) tsunamis resulted in significant loss of life, buildings, and critical infrastructure. The tsunami forces imposed upon structures in coastal regions are initially due to wave slamming, after which the quasi-steady flow of the sea water around buildings becomes important. An essential requirement in both design and loss assessment is a consistent model that can accurately predict these forces. A model suitable for predicting forces in the quasi-steady range has been established as part of a systematic programme of research by the UCL EPICentre to understand the fundamental physical processes of tsunami actions on buildings, and more generally their social and economic consequences. Using the pioneering tsunami generator at HR Wallingford, this study considers the influence of unsteady flow conditions on the forces acting upon a rectangular building occupying 10-80% of a channel for 20-240 second wave periods. A mathematical model based upon basic open-channel flow principles is proposed, which provides empirical estimates for drag and hydrostatic coefficients. A simple force prediction equation, requiring only basic flow velocity and wave height inputs, is then developed, providing good agreement with the experimental results. The results of this study demonstrate that the unsteady forces from the very long waves encountered during tsunami events can be predicted with a level of accuracy and simplicity suitable for design and risk assessment.

  20. Fully self-consistent GW calculations for molecules

    DEFF Research Database (Denmark)

    Rostgaard, Carsten; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer

    2010-01-01

    We calculate single-particle excitation energies for a series of 34 molecules using fully self-consistent GW, one-shot G0W0, Hartree-Fock (HF), and hybrid density-functional theory (DFT). All calculations are performed within the projector-augmented wave method using a basis set of Wannier...... functions augmented by numerical atomic orbitals. The GW self-energy is calculated on the real frequency axis including its full frequency dependence and off-diagonal matrix elements. The mean absolute error of the ionization potential (IP) with respect to experiment is found to be 4.4, 2.6, 0.8, 0.4, and 0...... leading to underestimation of the IPs. The best IPs are obtained from one-shot G0W0 calculations based on HF since this reduces the overscreening. Finally, we find that the inclusion of core-valence exchange is important and can affect the excitation energies by as much as 1 eV....