WorldWideScience

Sample records for surprisingly successful metamodels

  1. Reliability estimation using kriging metamodel

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Tae Min; Ju, Byeong Hyeon; Lee, Byung Chai [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Do Hyun [Korea Automotive Technology Institute, Chonan (Korea, Republic of)

    2006-08-15

In this study, a new method for reliability estimation using a kriging metamodel is proposed. Because there are no random errors in a Design and Analysis of Computer Experiments (DACE) model, the kriging metamodel is determined by an appropriate sampling range and number of sampling points. A first kriging metamodel is built from widely ranged sampling points, and the Advanced First Order Reliability Method (AFORM) is applied to it to estimate the reliability approximately. A second kriging metamodel is then constructed from additional sampling points over an updated sampling range, and Monte Carlo Simulation (MCS) is applied to it to evaluate the reliability. The proposed method is applied to numerical examples, and the results are almost equal to the reference reliability.
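The two-stage procedure can be sketched in a few lines of numpy. This is an illustrative toy limit-state function with hand-picked sampling ranges, not the paper's examples, and for brevity the AFORM step is replaced by simply refining the design near the estimated limit state g = 0:

```python
import numpy as np

def gaussian_corr(a, b, theta=10.0):
    # Gaussian (squared-exponential) correlation between two 1-D sample sets
    return np.exp(-theta * (a[:, None] - b[None, :])**2)

def fit_kriging(x, y, theta=10.0, nugget=1e-8):
    # DACE-style interpolating kriging with zero prior mean: solve R w = y once
    R = gaussian_corr(x, x, theta) + nugget * np.eye(len(x))
    w = np.linalg.solve(R, y)
    return lambda xq: gaussian_corr(xq, x, theta) @ w

g = lambda x: 2.5 - x**2              # toy limit-state function: g(x) < 0 is failure

# Stage 1: widely ranged sampling points
x1 = np.linspace(-4.0, 4.0, 9)
ghat = fit_kriging(x1, g(x1))

# Stage 2: refine near the estimated limit state and rebuild the metamodel
x2 = np.concatenate([x1, np.linspace(1.1, 2.1, 6), np.linspace(-2.1, -1.1, 6)])
ghat = fit_kriging(x2, g(x2))

# Monte-Carlo simulation on the cheap metamodel, with X ~ N(0, 1)
rng = np.random.default_rng(0)
xs = rng.normal(0.0, 1.0, 100_000)
pf = float(np.mean(ghat(xs) < 0.0))
```

On this toy problem `pf` lands near the exact failure probability P[2.5 - X² < 0] ≈ 0.114, while every expensive-model evaluation happens only at the 21 design points.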

  2. Unified metamodel of object system

    OpenAIRE

    Oleynik, P. P.

    2015-01-01

This article describes a unified metamodel of an object system which can be used for domain-driven design (DDD) of information systems. At the beginning of the work, an in-depth analysis of existing studies devoted to the organization of different metamodels was carried out. Fragments of the metamodel are presented in the figures as class diagrams of the Unified Modeling Language (UML). The beginning of the article provides a general diagram which displays the important associations. Next are separately sh...

  3. Schizoanalysis as Metamodeling

    Directory of Open Access Journals (Sweden)

    Janell Watson

    2008-01-01

Félix Guattari, writing both on his own and with philosopher Gilles Deleuze, developed the notion of schizoanalysis out of his frustration with what he saw as the shortcomings of Freudian and Lacanian psychoanalysis, namely the orientation toward neurosis, emphasis on language, and lack of socio-political engagement. Guattari was analyzed by Lacan, attended the seminars from the beginning, and remained a member of Lacan's school until his death in 1992. His unorthodox Lacanism grew out of his clinical work with schizophrenics and involvement in militant politics. Paradoxically, even as he rebelled theoretically and practically against Lacan's 'mathemes of the unconscious' and topology of knots, Guattari ceaselessly drew diagrams and models. Deleuze once said of him that 'His ideas are drawings, or even diagrams.' Guattari's single-authored books are filled with strange figures, which borrow from fields as diverse as linguistics, cultural anthropology, chaos theory, energetics, and non-equilibrium thermodynamics. Guattari himself declared schizoanalysis a 'metamodeling,' but at the same time insisted that his models were constructed aesthetically, not scientifically, despite his liberal borrowing of scientific terminology. The practice of schizoanalytic metamodeling is complicated by his and Deleuze's concept of the diagram, which they define as a way of thinking that bypasses language, as for example in musical notation or mathematical formulas. This article will explore Guattari's models, in relation to Freud, Lacan, C.S. Peirce, Louis Hjelmslev, Noam Chomsky, and Ilya Prigogine. I will also situate his drawings in relation to his work as a practicing clinician, political activist, and co-author of Anti-Oedipus and A Thousand Plateaus.

  4. Metamodel of the it governance framework COBIT

    National Research Council Canada - National Science Library

    Souza Neto, João; Ferreira Neto, Arthur Nunes

    2013-01-01

    ... of IT best practice frameworks. The MetaFrame methodology used for the construction of the COBIT metamodel is based on the discipline of conceptual metamodeling and on the extended Entity/Relationship methodology...

  5. Rewriting Constraint Models with Metamodels

    CERN Document Server

    Chenouard, Raphael; Soto, Ricardo

    2010-01-01

An important challenge in constraint programming is to rewrite constraint models into executable programs calculating the solutions. This phase of constraint processing may require translations between constraint programming languages, transformations of constraint representations, model optimizations, and tuning of solving strategies. In this paper, we introduce a pivot metamodel describing the common features of constraint models including different kinds of constraints, statements like conditionals and loops, and other first-class elements like object classes and predicates. This metamodel is general enough to cope with the constructions of many languages, from object-oriented modeling languages to logic languages, but it is independent from them. The rewriting operations manipulate metamodel instances apart from languages. As a consequence, the rewriting operations apply whatever languages are selected and they are able to manage model semantic information. A bridge is created between the metamode...

  6. Ontological Surprises

    DEFF Research Database (Denmark)

    Leahu, Lucian

    2016-01-01

    This paper investigates how we might rethink design as the technological crafting of human-machine relations in the context of a machine learning technique called neural networks. It analyzes Google’s Inceptionism project, which uses neural networks for image recognition. The surprising output of...... a hybrid approach where machine learning algorithms are used to identify objects as well as connections between them; finally, it argues for remaining open to ontological surprises in machine learning as they may enable the crafting of different relations with and through technologies....

  7. Surprise Trips

    DEFF Research Database (Denmark)

    Korn, Matthias; Kawash, Raghid; Andersen, Lisbet Møller

    We report on a platform that augments the natural experience of exploration in diverse indoor and outdoor environments. The system builds on the theme of surprises in terms of user expectations and finding points of interest. It utilizes physical icons as representations of users' interests and a...

  8. Metamodels: definitions of structures or ontological commitments?

    NARCIS (Netherlands)

    Kurtev, I.

    2007-01-01

    The concept of metamodel is central in Model Driven Engineering (MDE). It is used to define the conceptual foundation of modeling languages. There exist specialized languages for specifying metamodels known as metalanguages. The most popular of them are object-oriented and support defining structure

  9. Sensitivity validation technique for sequential kriging metamodel

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Seung Kyun; Lee, Jin Min; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of)

    2012-08-15

Metamodels have been developed with a variety of design optimization techniques in the field of structural engineering over the last decade because they are efficient, show excellent prediction performance, and provide easy interconnections into design frameworks. To construct a metamodel, a sequential procedure involving steps such as the design of experiments, metamodeling techniques, and validation techniques is performed. Because validation techniques can measure the accuracy of the metamodel, the number of presampled points needed for an accurate kriging metamodel is decided by the validation technique in the sequential kriging metamodel. Because an interpolation model such as the kriging metamodel based on computer experiments passes through the responses at the presampled points, additional analyses or reconstructions of the metamodel are required to measure its accuracy if existing validation techniques are applied. In this study, we suggest a sensitivity validation that does not require additional analyses or reconstructions of the metamodel. Fourteen two-dimensional mathematical problems and an engineering problem are used to illustrate the feasibility of the suggested method.
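The point about interpolation and validation cost can be seen directly in code: residuals at the presampled points vanish by construction, so a conventional check such as leave-one-out cross-validation must rebuild the metamodel once per point. A minimal numpy sketch with a toy 1-D response and simple zero-mean kriging (the paper's sensitivity validation avoids these refits):

```python
import numpy as np

def fit_kriging(x, y, theta=4.0, nugget=1e-10):
    # Interpolating (DACE-style) kriging with a Gaussian correlation, zero prior mean
    R = np.exp(-theta * (x[:, None] - x[None, :])**2) + nugget * np.eye(len(x))
    w = np.linalg.solve(R, y)
    return lambda xq: np.exp(-theta * (xq[:, None] - x[None, :])**2) @ w

f = lambda x: np.sin(3.0 * x)                  # toy computer experiment
x = np.linspace(0.0, 2.0, 8)
y = f(x)
model = fit_kriging(x, y)

# Residuals at the presampled points are ~0 by construction,
# so they say nothing about the metamodel's accuracy
resid = y - model(x)

# Leave-one-out validation therefore forces n reconstructions of the metamodel
loo_errors = []
for i in range(len(x)):
    keep = np.arange(len(x)) != i
    m_i = fit_kriging(x[keep], y[keep])        # rebuild without point i
    loo_errors.append(abs(y[i] - m_i(x[i:i + 1])[0]))
loo_rmse = float(np.sqrt(np.mean(np.square(loo_errors))))
```

The ordinary residuals are numerically zero while the leave-one-out errors are not, which is exactly why interpolating metamodels need a validation scheme that does not rely on residuals at the training points.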

  10. Charming surprise

    CERN Multimedia

    Antonella Del Rosso

    2011-01-01

    The CP violation in charm quarks has always been thought to be extremely small. So, looking at particle decays involving matter and antimatter, the LHCb experiment has recently been surprised to observe that things might be different. Theorists are on the case.   The study of the physics of the charm quark was not in the initial plans of the LHCb experiment, whose letter “b” stands for “beauty quark”. However, already one year ago, the Collaboration decided to look into a wider spectrum of processes that involve charm quarks among other things. The LHCb trigger allows a lot of these processes to be selected, and, among them, one has recently shown interesting features. Other experiments at b-factories have already performed the same measurement but this is the first time that it has been possible to achieve such high precision, thanks to the huge amount of data provided by the very high luminosity of the LHC. “We have observed the decay modes of t...

  11. Charming surprise

    CERN Multimedia

    Antonella Del Rosso

    2011-01-01

    The CP violation in charm quarks has always been thought to be extremely small. So, looking at particle decays involving matter and antimatter, the LHCb experiment has recently been surprised to observe that things might be different. Theorists are on the case. The study of the physics of the charm quark was not in the initial plans of the LHCb experiment, whose letter “b” stands for “beauty quark”. However, already one year ago, the Collaboration decided to look into a wider spectrum of processes that involve charm quarks among other things. The LHCb trigger allows a lot of these processes to be selected, and, among them, one has recently shown interesting features. Other experiments at b-factories have already performed the same measurement but this is the first time that it has been possible to achieve such high precision, thanks to the huge amount of data provided by the very high luminosity of the LHC. “We have observed the decay modes of the D0, a pa...

  12. Kriging Metamodeling in Simulation : A Review

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

This article reviews Kriging (also called spatial correlation modeling). It presents the basic Kriging assumptions and formulas, contrasting Kriging with classic linear regression metamodels. Furthermore, it extends Kriging to random simulation and discusses bootstrapping to estimate the variance of
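Two of the review's threads, classic linear-regression metamodels for random simulation and distribution-free bootstrapping of the replicated outputs to estimate a predictor's variance, can be sketched together. The first-order polynomial metamodel and the data below are invented for illustration; the bootstrap scheme follows common practice rather than the article's exact formulas:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random simulation: m replicated noisy outputs at each design point
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
m = 20
reps = 1.0 + 2.0 * x[:, None] + rng.normal(0.0, 0.5, (len(x), m))

def fit_predict(ybar, x0):
    # Classic first-order polynomial regression metamodel via least squares
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, ybar, rcond=None)
    return float(beta[0] + beta[1] * x0)

# Distribution-free bootstrap: resample the replicates at each design point,
# refit the metamodel, and collect the prediction at a new point x0
x0, boot = 1.25, []
for _ in range(500):
    idx = rng.integers(0, m, (len(x), m))
    ybar_star = np.take_along_axis(reps, idx, axis=1).mean(axis=1)
    boot.append(fit_predict(ybar_star, x0))

pred = fit_predict(reps.mean(axis=1), x0)      # point prediction
pred_se = float(np.std(boot))                  # bootstrap standard error
```

Unlike the interpolating Kriging metamodel, the regression metamodel smooths over the replication noise, and the bootstrap quantifies how much that noise propagates into the prediction.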

  13. Metamodeling of Semantic Web Enabled Multiagent Systems

    NARCIS (Netherlands)

    Kardas, G.; Göknil, Arda; Dikenelli, O.; Topaloglu, N.Y.; Weyns, D.; Holvoet, T.

    2006-01-01

Several agent researchers are currently studying agent modeling and they propose different architectural metamodels for developing Multiagent Systems (MAS) according to specific agent development methodologies. When support for Semantic Web technology and its related constructs are considered, agent

  14. Metamodel of the IT Governance Framework COBIT

    Directory of Open Access Journals (Sweden)

    João Souza Neto

    2013-10-01

This paper addresses the generation and analysis of the COBIT 4.1 ontological metamodel of the IT Governance framework. Ontological metamodels represent the logical structures and fundamental semantics of framework models and constitute adequate tools for the analysis, adaptation, comparison, and integration of IT best practice frameworks. The MetaFrame methodology used for the construction of the COBIT metamodel is based on the discipline of conceptual metamodeling and on the extended Entity/Relationship methodology. It has an iterative process for constructing the metamodel's components, using techniques of modeling and documentation of information systems. In the COBIT 4.1 metamodel, the central entity type is the IT Process. The IT Domain entity type represents the four domains that group one or more IT processes of COBIT 4.1. In turn, these domains are divided into one or more Activities that are carried out by one or more Roles, which are consulted, informed, accountable, or responsible for each Activity. The COBIT 4.1 metamodel may suggest the adaptation or implementation of a new process within the framework, or even contribute to the integration of frameworks when, after the processes of analysis and comparison, there are connection points between the components and the logical structures of their relationships.

  15. N-dimensional non uniform rational B-splines for metamodeling

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Cameron J [Los Alamos National Laboratory; Crawford, Richard H [UT - AUSTIN

    2008-01-01

Non Uniform Rational B-splines (NURBs) have unique properties that make them attractive for engineering metamodeling applications. NURBs are known to accurately model many different continuous curve and surface topologies in 1- and 2-variate spaces. However, engineering metamodels of the design space often require hypervariate representations of multidimensional outputs. In essence, design space metamodels are hyperdimensional constructs with a dimensionality determined by their input and output variables. To use NURBs as the basis for a metamodel in a hyperdimensional space, traditional geometric fitting techniques must be adapted to hypervariate and hyperdimensional spaces composed of both continuous and discontinuous variable types. In this paper, we describe the necessary adaptations for the development of a NURBs-based metamodel called a Hyperdimensional Performance Model or HyPerModel. HyPerModels are capable of accurately and reliably modeling nonlinear hyperdimensional objects defined by both continuous and discontinuous variables of a wide variety of topologies, such as those that define typical engineering design spaces. We demonstrate this ability by successfully generating accurate HyPerModels of 10 trial functions, laying the foundation for future work with N-dimensional NURBs in design space applications.
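The rational weights are what let NURBs capture curves that polynomial splines cannot represent exactly. A minimal numpy sketch of the simplest NURBS special case, a rational quadratic Bezier curve on a single clamped knot span, reproducing a quarter circle exactly (a standard textbook construction, not code from the paper):

```python
import numpy as np

def rational_bezier(P, w, t):
    # Rational quadratic Bezier (a NURBS curve on one clamped knot span):
    # C(t) = sum_i w_i B_i(t) P_i / sum_i w_i B_i(t)
    t = np.asarray(t)
    B = np.stack([(1.0 - t)**2, 2.0 * t * (1.0 - t), t**2], axis=1)
    return (B * w) @ P / (B @ w)[:, None]

# Quarter of the unit circle: exactly representable only with rational weights
P = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
w = np.array([1.0, np.sqrt(2.0) / 2.0, 1.0])
pts = rational_bezier(P, w, np.linspace(0.0, 1.0, 50))
radii = np.linalg.norm(pts, axis=1)   # all 1.0: the arc lies on the circle
```

Every evaluated point lies on the unit circle to machine precision; with all weights equal to 1 the same control points yield only a parabolic approximation of the arc.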

  16. n-dimensional non uniform rational b-splines for metamodeling

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Cameron J [Los Alamos National Laboratory; Crawford, Richard H [UT-AUSTIN

    2008-01-01

Non Uniform Rational B-splines (NURBs) have unique properties that make them attractive for engineering metamodeling applications. NURBs are known to accurately model many different continuous curve and surface topologies in 1- and 2-variate spaces. However, engineering metamodels of the design space often require hypervariate representations of multidimensional outputs. In essence, design space metamodels are hyperdimensional constructs with a dimensionality determined by their input and output variables. To use NURBs as the basis for a metamodel in a hyperdimensional space, traditional geometric fitting techniques must be adapted to hypervariate and hyperdimensional spaces composed of both continuous and discontinuous variable types. In this paper, they describe the necessary adaptations for the development of a NURBs-based metamodel called a Hyperdimensional Performance Model or HyPerModel. HyPerModels are capable of accurately and reliably modeling nonlinear hyperdimensional objects defined by both continuous and discontinuous variables of a wide variety of topologies, such as those that define typical engineering design spaces. They demonstrate this ability by successfully generating accurate HyPerModels of 10 trial functions, laying the foundation for future work with N-dimensional NURBs in design space applications.

  17. SPEM: Software Process Engineering Metamodel

    Directory of Open Access Journals (Sweden)

    Víctor Hugo Menéndez Domínguez

    2015-05-01

All organizations involved in software development need to establish, manage, and support development work. The term "software development process" tends to unify all the activities and practices that cover these needs. Modeling the software process is one way to improve development and the quality of the resulting applications. Among all existing process-modeling languages, those based on work products are the most suitable. One such language is SPEM (Software Process Engineering Metamodel). SPEM was created by the OMG (Object Management Group) as a high-level standard based on MOF (MetaObject Facility) and is a UML (Unified Modeling Language) metamodel. It constitutes a kind of ontology of software development processes. This article offers a general description of the SPEM standard. It also highlights the changes between version 1.1 and version 2.0, presenting both the advantages and the disadvantages found in each version.

  18. Stochastic Intrinsic Kriging for Simulation Metamodelling

    NARCIS (Netherlands)

    Mehdad, Ehsan; Kleijnen, J.P.C.

    2015-01-01

    Kriging provides metamodels for deterministic and random simulation models. Actually, there are several types of Kriging; the classic type is so-called universal Kriging, which includes ordinary Kriging. These classic types require estimation of the trend in the input-output data of the underlying

  19. Stochastic Intrinsic Kriging for Simulation Metamodelling

    NARCIS (Netherlands)

    Mehdad, Ehsan; Kleijnen, J.P.C.

    2015-01-01

    Kriging provides metamodels for deterministic and random simulation models. Actually, there are several types of Kriging; the classic type is so-called universal Kriging, which includes ordinary Kriging. These classic types require estimation of the trend in the input-output data of the underlying s

  20. Stochastic Intrinsic Kriging for Simulation Metamodelling

    NARCIS (Netherlands)

    Mehdad, E.; Kleijnen, Jack P.C.

    2014-01-01

We derive intrinsic Kriging, using Matheron's intrinsic random functions, which eliminate the trend in classic Kriging. We formulate this intrinsic Kriging as a metamodel in deterministic and random simulation models. For random simulation we derive an experimental design that also specifies the numbe

  1. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  2. Multi-objective reliability-based optimization with stochastic metamodels.

    Science.gov (United States)

    Coelho, Rajan Filomeno; Bouillard, Philippe

    2011-01-01

This paper addresses continuous optimization problems with multiple objectives and parameter uncertainty defined by probability distributions. First, a reliability-based formulation is proposed, defining the nondeterministic Pareto set as the minimal solutions such that user-defined probabilities of nondominance and constraint satisfaction are guaranteed. The formulation can be incorporated with minor modifications into a multiobjective evolutionary algorithm (here: the nondominated sorting genetic algorithm-II). Then, with a view to applying the method to large-scale structural engineering problems, for which the computational effort devoted to the optimization algorithm itself is negligible in comparison with the simulation, the second part of the study is concerned with the need to reduce the number of function evaluations while avoiding modification of the simulation code. Therefore, nonintrusive stochastic metamodels are developed in two steps. First, for a given sampling of the deterministic variables, a preliminary decomposition of the random responses (objectives and constraints) is performed through polynomial chaos expansion (PCE), allowing a representation of the responses by a limited set of coefficients. Then, a metamodel is built by kriging interpolation of the PCE coefficients with respect to the deterministic variables. The method has been tested successfully on seven analytical test cases and on the 10-bar truss benchmark, demonstrating the potential of the proposed approach to provide reliability-based Pareto solutions at a reasonable computational cost.
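The first of the two steps, nonintrusive PCE by least-squares regression, can be sketched for a single deterministic design point. The toy response Y = ξ² with standard-normal ξ is an assumption for illustration; in the full method a kriging model would then interpolate the resulting coefficients across the deterministic variables:

```python
import numpy as np

rng = np.random.default_rng(2)

def hermite_design(xi):
    # Probabilists' Hermite polynomials He_0..He_3 at standard-normal samples
    return np.column_stack([np.ones_like(xi), xi, xi**2 - 1.0, xi**3 - 3.0 * xi])

# Random response at one fixed deterministic design point (toy): Y = xi**2
xi = rng.normal(0.0, 1.0, 2000)
Y = xi**2

# Nonintrusive PCE: coefficients by least-squares regression on the samples
coeffs, *_ = np.linalg.lstsq(hermite_design(xi), Y, rcond=None)

# Orthogonality gives the moments directly: E[Y] = c_0, Var(Y) = sum_k c_k^2 * k!
mean_pce = float(coeffs[0])
var_pce = float(coeffs[1]**2 * 1 + coeffs[2]**2 * 2 + coeffs[3]**2 * 6)
```

Here Y = He₂(ξ) + 1 exactly, so the regression recovers the coefficients (1, 0, 1, 0), and the whole random response is summarized by that small coefficient vector, which is what makes the subsequent kriging step tractable.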

  3. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    Science.gov (United States)

    1994-08-10

[Garbled report documentation page; recoverable information:] Artificial Neural Network Metamodels of Stochastic Computer Simulations, by Robert Allen Kilmer (B.S. in Education Mathematics, Indiana...). Report accession number AD-A285 951.

  4. A UML-based metamodel for software evolution process

    Science.gov (United States)

    Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing

    2014-04-01

A software evolution process is a set of interrelated software processes under which the corresponding software evolves. An object-oriented software evolution process metamodel (OO-EPMM), its abstract syntax, and formal OCL constraints on the metamodel are presented in this paper. OO-EPMM can represent not only the software development process but also software evolution.

  5. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

In computer science, the metamodelling approach is becoming increasingly popular for software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. This analysis reveals invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel. It is a system of geometrical objects, allowing a spatial structure of physical models to be built and a distribution of physical properties to be set. Different mathematical methods can then be applied to such a geometry of distributed physical properties. To validate the proposed metamodelling approach, we consider developed prototypes of the software tools.

  6. Algebraic Semantics of OCL-Constrained Metamodel Specifications

    Science.gov (United States)

    Boronat, Artur; Meseguer, José

    In the definition of domain-specific modeling languages a MOF metamodel is used to define the main types of its abstract syntax, and OCL invariants are used to add static semantic constraints. The semantics of a metamodel definition can be given as a model type whose values are well-formed models. A model is said to conform to its metamodel when it is a value of the corresponding model type. However, when OCL invariants are involved, the concept of model conformance has not yet been formally defined in the MOF standard. In this work, the concept of OCL-constrained metamodel conformance is formally defined and used for defining style-preserving software architecture configurations. This concept is supported in MOMENT2, an algebraic framework for MOF metamodeling, where OCL constraints can be used for both static and dynamic analysis.

  7. Certified metamodels for sensitivity indices estimation

    Directory of Open Access Journals (Sweden)

    Prieur Clémentine

    2012-04-01

Global sensitivity analysis of a numerical code, more specifically estimation of the Sobol indices associated with input variables, generally requires a large number of model runs. When these demand too much computation time, it is necessary to use a reduced model (metamodel) to perform the sensitivity analysis, whose outputs are numerically close to those of the original model while being much faster to run. In this case, the estimated indices are subject to two kinds of errors: sampling error, caused by the computation of the integrals appearing in the definition of the Sobol indices by a Monte-Carlo method, and metamodel error, caused by the replacement of the original model by the metamodel. In cases where we have certified bounds for the metamodel error, we propose a method to quantify both types of error, and we compute confidence intervals for first-order Sobol indices.
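The sampling-error half of the story is the Monte-Carlo estimation itself. A minimal pick-freeze estimator of a first-order Sobol index, on a toy additive model with analytic S_1 = 0.2 (in the paper's setting, f would be the metamodel and the certified bounds would account for the additional metamodel error):

```python
import numpy as np

rng = np.random.default_rng(3)

def sobol_first_order(f, d, i, n=200_000):
    # Pick-freeze Monte-Carlo estimator of the first-order Sobol index S_i:
    # S_i = Cov(Y, Y_i) / Var(Y), where Y_i reuses X_i but redraws all other inputs
    X = rng.random((n, d))
    X_i = rng.random((n, d))
    X_i[:, i] = X[:, i]                        # freeze coordinate i
    Y, Y_i = f(X), f(X_i)
    cov = np.mean(Y * Y_i) - np.mean(Y) * np.mean(Y_i)
    return float(cov / np.var(Y))

f = lambda X: X[:, 0] + 2.0 * X[:, 1]          # toy model: analytic S_1 = 0.2
S1 = sobol_first_order(f, d=2, i=0)
```

Each index estimate costs 2n model runs, which is exactly why an expensive model is replaced by a metamodel before such Monte-Carlo loops are attempted.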

  8. Metamodel defined multidimensional embedded sequential sampling criteria.

    Energy Technology Data Exchange (ETDEWEB)

    Turner, C. J. (Cameron J.); Campbell, M. I. (Matthew I.); Crawford, R. H. (Richard H.)

    2004-01-01

Collecting data to characterize an unknown space presents a series of challenges. Where in the space should data be collected? What regions are more valuable than others to sample? When have sufficient samples been acquired to characterize the space with some level of confidence? Sequential sampling techniques offer an approach to answering these questions by intelligently sampling an unknown space. Sampling decisions are made with criteria intended to preferentially search the space for desirable features. However, N-dimensional applications need efficient and effective criteria. This paper discusses the evolution of several such criteria, based on an understanding of the behaviors of existing criteria and of desired criteria properties. The resulting criteria are evaluated on a variety of planar functions, and preliminary results for higher-dimensional applications are also presented. In addition, a set of convergence criteria, intended to evaluate the effectiveness of further sampling, is implemented. Using these sampling criteria, an effective metamodel representation of the unknown space can be generated at reasonable sampling cost. Furthermore, the use of convergence criteria allows conclusions to be drawn about the level of confidence in the metamodel and forms the basis for evaluating the adequacy of the original sampling budget.
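A minimal 1-D illustration of an embedded sequential criterion (pure exploration via maximum kriging predictive variance; the paper's criteria are richer and target N-dimensional spaces): each new sample goes where the metamodel is least certain, which progressively fills the unexplored gaps.

```python
import numpy as np

def corr(a, b, theta=20.0):
    # Gaussian correlation between two 1-D sample sets
    return np.exp(-theta * (a[:, None] - b[None, :])**2)

def kriging_variance(x, xq, theta=20.0, nugget=1e-10):
    # Simple-kriging predictive variance (process variance normalized to 1):
    # s2(xq) = 1 - r(xq)' R^{-1} r(xq); zero at samples, large in unexplored gaps
    R = corr(x, x, theta) + nugget * np.eye(len(x))
    r = corr(xq, x, theta)
    return 1.0 - np.sum(r * np.linalg.solve(R, r.T).T, axis=1)

# Embedded sequential sampling of [0, 1]: repeatedly place the next sample
# at the candidate with maximum predictive variance, then update the model
x = np.array([0.0, 1.0])
cand = np.linspace(0.0, 1.0, 201)
for _ in range(6):
    x = np.append(x, cand[np.argmax(kriging_variance(x, cand))])
```

The same loop structure accepts any other sampling criterion in place of the predictive variance, and a convergence criterion would simply watch how quickly the maximum variance shrinks.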

  9. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    Energy Technology Data Exchange (ETDEWEB)

    Horsey, Henry; Fleming, Katherine; Ball, Brian; Long, Nicholas

    2016-08-26

Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path to reducing the time and difficulty. Many use cases remain, however, where information regarding whole-building energy usage is valuable, but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option for accurately estimating building energy consumption in real time is to pre-compute large datasets of potential building energy models and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but can also be combined with public data to estimate the aggregate demand response potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of top-down actuarial evaluations. This paper ends by exploring the benefits of using metamodels and then examines the cost-effectiveness of this approach.

  10. Optimization Using Metamodeling in the Context of Integrated Computational Materials Engineering (ICME)

    Energy Technology Data Exchange (ETDEWEB)

    Hammi, Youssef; Horstemeyer, Mark F; Wang, Paul; David, Francis; Carino, Ricolindo

    2013-11-18

Predictive Design Technologies, LLC (PDT) proposed to employ Integrated Computational Materials Engineering (ICME) tools to help the manufacturing industry in the United States regain the competitive advantage in the global economy. ICME uses computational materials science tools within a holistic system in order to accelerate materials development, improve design optimization, and unify design and manufacturing. With the advent of accurate modeling and simulation along with significant increases in high performance computing (HPC) power, virtual design and manufacturing using ICME tools provide the means to reduce product development time and cost by alleviating costly trial-and-error physical design iterations while improving overall quality and manufacturing efficiency. To reduce the computational cost necessary for the large-scale HPC simulations and to make the methodology accessible for small and medium-sized manufacturers (SMMs), metamodels are employed. Metamodels are approximate models (functional relationships between input and output variables) that can reduce the simulation times by one to two orders of magnitude. In Phase I, PDT, partnered with Mississippi State University (MSU), demonstrated the feasibility of the proposed methodology by employing MSU's internal state variable (ISV) plasticity-damage model with the help of metamodels to optimize the microstructure-process-property-cost for tube manufacturing processes used by Plymouth Tube Company (PTC), which involve complicated temperature and mechanical loading histories. PDT quantified the microstructure-property relationships for PTC's SAE J525 electric resistance-welded cold drawn low carbon hydraulic 1010 steel tube manufacturing processes at seven different material states and calibrated the ISV plasticity material parameters to fit experimental tensile stress-strain curves. PDT successfully performed large scale finite element (FE) simulations in an HPC environment using the ISV plasticity
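As a toy illustration of what such a metamodel is (a hypothetical sketch, not PDT's actual tooling): a cheap quadratic surrogate is fitted by least squares to a handful of runs of an "expensive" simulation, after which only the surrogate is evaluated.

```python
# Toy metamodel sketch (hypothetical; not PDT's tooling): fit a quadratic
# surrogate y ~ c0 + c1*x + c2*x^2 to a few runs of an "expensive" simulation.

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small dense system
    n = len(A)
    M = [row[:] + [v] for row, v in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        M[i] = [v / M[i][i] for v in M[i]]
        for j in range(n):
            if j != i:
                M[j] = [a - M[j][i] * c for a, c in zip(M[j], M[i])]
    return [row[n] for row in M]

def expensive_simulation(x):
    # stand-in for a costly finite element run
    return (x - 1.5) ** 2 + 0.5 * x

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]   # design points: the only expensive runs
ys = [expensive_simulation(x) for x in xs]

# normal equations for the least-squares quadratic fit
n = len(xs)
Sx = sum(xs); Sx2 = sum(x ** 2 for x in xs)
Sx3 = sum(x ** 3 for x in xs); Sx4 = sum(x ** 4 for x in xs)
Sy = sum(ys); Sxy = sum(x * y for x, y in zip(xs, ys))
Sx2y = sum(x * x * y for x, y in zip(xs, ys))
c0, c1, c2 = solve([[n, Sx, Sx2], [Sx, Sx2, Sx3], [Sx2, Sx3, Sx4]],
                   [Sy, Sxy, Sx2y])

def metamodel(x):
    # evaluates in nanoseconds instead of CPU-hours
    return c0 + c1 * x + c2 * x * x
```

Because the toy response happens to be exactly quadratic, the surrogate reproduces it; in practice the design of experiments and the fit quality have to be checked against held-out runs.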

  11. Development and validation of a Database Forensic Metamodel (DBFM)

    Science.gov (United States)

    Al-dhaqm, Arafat; Razak, Shukor; Othman, Siti Hajar; Ngadi, Asri; Ahmed, Mohammed Nazir; Ali Mohammed, Abdulalem

    2017-01-01

Database Forensics (DBF) is a widespread area of knowledge. It has many complex features and is well known amongst database investigators and practitioners. Several models and frameworks have been created specifically to allow knowledge-sharing and effective DBF activities. However, these are often narrow in focus and address specified database incident types. We analysed 60 such models in an attempt to uncover how many DBF activities are common across them, even when the actions vary, and then generated a unified abstract view of DBF in the form of a metamodel. We identified and extracted common concepts, and reconciled their definitions, to propose the metamodel. We applied a metamodelling process to guarantee that this metamodel is comprehensive and consistent. PMID:28146585

  12. Breeding ecology of the southern shrike, Lanius meridionalis, in an agrosystem of south–eastern Spain: the surprisingly excellent breeding success in a declining population

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Rueda, G.; Abril-Colon, I.; Lopez-Orta, A.; Alvarez-Benito, I.; Castillo-Gomez, C.; Comas, M.; Rivas, J.M.

    2016-07-01

    The southern shrike, Lanius meridionalis, is declining at the Spanish and European level. One cause of this decline could be low reproductive success due to low availability of prey in agricultural environments. To investigate this possibility we analysed the breeding ecology of a population of southern shrike in an agrosystem in Lomas de Padul (SE Spain). Our results suggest the population is declining in this area. However, contrary to expectations, the population showed the highest reproductive success (% nests in which at least one egg produces a fledgling) reported for this species to date (83.3%), with a productivity of 4.04 fledglings per nest. Reproductive success varied throughout the years, ranging from 75% in the worst year to 92.9% in the best year. Similarly, productivity ranged from 3.25 to 5.0 fledglings per nest depending on the year. Other aspects of reproductive biology, such as clutch size, brood size, and nestling diet, were similar to those reported in other studies. Based on these results, we hypothesise that the determinant of population decline acts on the juvenile fraction, drastically reducing the recruitment rate, or affecting the dispersion of adults and recruits. Nevertheless, the exact factor or factors are unknown. This study shows that a high reproductive success does not guarantee good health status of the population. (Author)

  13. Numerical studies of the metamodel fitting and validation processes

    OpenAIRE

    Iooss, Bertrand; Boussouf, Loïc; Feuillard, Vincent; Marrel, Amandine

    2010-01-01

    Complex computer codes, for instance simulating physical phenomena, are often too time expensive to be directly used to perform uncertainty, sensitivity, optimization and robustness analyses. A widely accepted method to circumvent this problem consists in replacing cpu time expensive computer models by cpu inexpensive mathematical functions, called metamodels. In this paper, we focus on the Gaussian process metamodel and two essential steps of its definition phase. First, the initial design o...

  14. More Supernova Surprises

    Science.gov (United States)

    2010-09-24

PERSPECTIVES (Astronomy): More Supernova Surprises, J. Martin Laming. Spectroscopic observations of the supernova SN1987A are providing a new window into high... a core-collapse supernova) have stretched and motivated research that has expanded our knowledge of astrophysics. The brightest such event in

  15. An extensive catalog of operators for the coupled evolution of metamodels and models

    NARCIS (Netherlands)

    Herrmannnsdoerfer, M.; Vermolen, S.D.; Wachsmuth, G.

    2010-01-01

    Modeling languages and thus their metamodels are subject to change. When a metamodel is evolved, existing models may no longer conform to it. Manual migration of these models in response to metamodel evolution is tedious and error-prone. To significantly automate model migration, operator-based

  16. Mean-Variance-Validation Technique for Sequential Kriging Metamodels

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Tae Hee; Kim, Ho Sung [Hanyang University, Seoul (Korea, Republic of)

    2010-05-15

The rigorous validation of the accuracy of metamodels is an important topic in research on metamodel techniques. Although the leave-k-out cross-validation technique involves a considerably high computational cost, it cannot reliably measure the fidelity of metamodels. Recently, the mean_0 validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, the use of the mean_0 validation criterion may lead to premature termination of the sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response, evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique because, instead of performing numerical integration, the kriging model is integrated explicitly to evaluate the mean and variance of the response accurately. The error in the proposed validation technique resembles a root mean squared error, so it can be used as a stopping criterion for the sequential sampling of metamodels.
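The two quantities such a criterion is built from can be sketched in a few lines. Below is a minimal simple-kriging (Gaussian-process) predictor returning the predictive mean and variance at a query point; the kernel, length scale, and nugget are arbitrary assumptions for illustration, not the authors' implementation.

```python
import math

# Minimal simple-kriging predictor, pure Python, for illustration only.
# Returns the predictive mean and variance at a query point.

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small dense system
    n = len(A)
    M = [row[:] + [v] for row, v in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        M[i] = [v / M[i][i] for v in M[i]]
        for j in range(n):
            if j != i:
                M[j] = [a - M[j][i] * c for a, c in zip(M[j], M[i])]
    return [row[n] for row in M]

def kriging(xs, ys, x, length=0.7, nugget=1e-9):
    k = lambda a, b: math.exp(-(a - b) ** 2 / (2 * length ** 2))
    n = len(xs)
    K = [[k(xs[i], xs[j]) + (nugget if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    ks = [k(xi, x) for xi in xs]
    alpha = solve(K, ys)                       # K^{-1} y
    w = solve(K, ks)                           # K^{-1} k*
    mean = sum(a * b for a, b in zip(ks, alpha))
    var = k(x, x) - sum(a * b for a, b in zip(ks, w))
    return mean, var

xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.sin(v) for v in xs]
mean, var = kriging(xs, ys, 1.5)               # modest variance between samples
far_mean, far_var = kriging(xs, ys, 10.0)      # variance reverts to the prior far away
```

At a sampled point the predictive variance collapses to (almost) zero, while far from all samples it reverts to the prior variance; a sequential sampling stop criterion monitors exactly this mean/variance pair.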

  17. XLDM: AN XLINK-BASED MULTIDIMENSIONAL METAMODEL

    Directory of Open Access Journals (Sweden)

    Paulo Caetano da Silva

    2011-12-01

The growth of data available on the Internet and the improvement of ways to handle it are important issues when designing a data model. In this context, XML provides the necessary formalism to establish a standard to represent and exchange data. Since data warehouse technologies are often used for data analysis, it is necessary to define a cube model for XML data. However, data representation in XML may generate syntactic, semantic, and structural heterogeneity problems in XML documents, which are not considered by related approaches. To solve these problems, the definition of a data schema is required. This paper proposes a metamodel to specify XML document cubes, based on relationships between elements and XML documents. This approach solves the XML data heterogeneity problems by taking advantage of data schema definition and the relationships defined by XLink. The methodology used provides formal rules to define the proposed concepts. This formalism is then instantiated using XML Schema and XLink. A case study in the medical field is also presented, along with a comparison with XBRL Dimensions and a financial multidimensional data model that uses XLink.

  18. High School Students' Meta-Modeling Knowledge

    Science.gov (United States)

    Fortus, David; Shwartz, Yael; Rosenfeld, Sherman

    2016-12-01

    Modeling is a core scientific practice. This study probed the meta-modeling knowledge (MMK) of high school students who study science but had not had any explicit prior exposure to modeling as part of their formal schooling. Our goals were to (A) evaluate the degree to which MMK is dependent on content knowledge and (B) assess whether the upper levels of the modeling learning progression defined by Schwarz et al. (2009) are attainable by Israeli K-12 students. Nine Israeli high school students studying physics, chemistry, biology, or general science were interviewed individually, once using a context related to the science subject that they were learning and once using an unfamiliar context. All the interviewees displayed MMK superior to that of elementary and middle school students, despite the lack of formal instruction on the practice. Their MMK was independent of content area, but their ability to engage in the practice of modeling was content dependent. This study indicates that, given proper support, the upper levels of the learning progression described by Schwarz et al. (2009) may be attainable by K-12 science students. The value of explicitly focusing on MMK as a learning goal in science education is considered.

  19. Surprises with Nonrelativistic Naturalness

    CERN Document Server

    Horava, Petr

    2016-01-01

    We explore the landscape of technical naturalness for nonrelativistic systems, finding surprises which challenge and enrich our relativistic intuition already in the simplest case of a single scalar field. While the immediate applications are expected in condensed matter and perhaps in cosmology, the study is motivated by the leading puzzles of fundamental physics involving gravity: The cosmological constant problem and the Higgs mass hierarchy problem.

  20. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    Science.gov (United States)

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling method. In this paper, a new sequential optimization sampling method is proposed. With the new sampling method, metamodels are constructed repeatedly through the addition of sampling points, namely, extrema points of the metamodel and minimum points of a density function; more accurate metamodels are then obtained by repeating this procedure. The validity and effectiveness of the proposed sampling method are examined through typical numerical examples. PMID:25133206
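The core loop can be illustrated with a small sketch (assumed kernel, width, and toy objective; not the paper's algorithm): build a Gaussian RBF interpolant from a few samples of an expensive function, then add one new sample where the current metamodel is lowest and refit.

```python
import math

# Illustrative sketch (assumptions, not the paper's algorithm): a Gaussian RBF
# interpolant plus one sequential step that samples the metamodel's minimum.

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small dense system
    n = len(A)
    M = [row[:] + [v] for row, v in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        M[i] = [v / M[i][i] for v in M[i]]
        for j in range(n):
            if j != i:
                M[j] = [a - M[j][i] * c for a, c in zip(M[j], M[i])]
    return [row[n] for row in M]

def expensive(x):
    # stand-in for a costly simulation
    return (x - 0.3) ** 2

def rbf_fit(xs, ys, width=0.5):
    phi = lambda r: math.exp(-(r / width) ** 2)
    A = [[phi(abs(xs[i] - xs[j])) + (1e-8 if i == j else 0.0)
          for j in range(len(xs))] for i in range(len(xs))]
    w = solve(A, ys)
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [expensive(x) for x in xs]
s = rbf_fit(xs, ys)

# sequential step: run the expensive function at the metamodel's minimum
grid = [i / 200 for i in range(201)]
x_new = min(grid, key=s)
xs.append(x_new); ys.append(expensive(x_new))
s = rbf_fit(xs, ys)                            # refit with the added point
```

Each iteration concentrates samples near the metamodel's extrema, which is where the surrogate most needs correcting.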

  1. Metamodel for nonlinear dynamic response analysis of damaged laminated composites

    Directory of Open Access Journals (Sweden)

    Mahmoudi S.

    2016-01-01

Damage negatively affects the safety of a structure and can lead to failure. It is therefore recommended to use structural health monitoring techniques to detect, localize, and quantify damage. The main aim of the current work is the development of a numerical metamodel to investigate the dynamic behavior of damaged composite structures. Hence, a metamodel for damage prediction and dynamic behavior analysis of laminated composite structures is proposed, wherein the stress state in the structure is used as the indicative parameter and artificial neural networks as the learning tool.

  2. A development process meta-model for Web based expert systems: The Web engineering point of view

    DEFF Research Database (Denmark)

    Dokas, I.M.; Alapetite, Alexandre

    2006-01-01

Similar to many legacy computer systems, expert systems can be accessed via the Web, forming a set of Web applications known as Web based expert systems. The tough Web competition, the way people and organizations rely on Web applications, and the increasing user requirements for better services have... raised their complexity. Unfortunately, there is so far no clear answer to the question: How may the methods and experience of Web engineering and expert systems be combined and applied in order to develop effective and successful Web based expert systems? In an attempt to answer this question..., a development process meta-model for Web based expert systems will be presented. Based on this meta-model, a publicly available Web based expert system called Landfill Operation Management Advisor (LOMA) was developed. In addition, the results of an accessibility evaluation on LOMA – the first ever reported...

  3. Developing and applying metamodels of high resolution process-based simulations for high throughput exposure assessment of organic chemicals in riverine ecosystems

    Science.gov (United States)

    As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), “(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels.” The goals of metamodeling include, but are not limited to (1) developing func...

  4. SM4AM: A Semantic Metamodel for Analytical Metadata

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    . We present SM4AM, a Semantic Metamodel for Analytical Metadata created as an RDF formalization of the Analytical Metadata artifacts needed for user assistance exploitation purposes in next generation BI systems. We consider the Linked Data initiative and its relevance for user assistance...

  5. Accuracy vs. Robustness: Bi-criteria Optimized Ensemble of Metamodels

    Science.gov (United States)

    2014-12-01

Jeffery D. Weir; Xianghua Chu (Department of Operational Sciences). ...stochastic environment will be considered instead, e.g., stochastic Kriging. What's more, the relationship between the problem properties and metamodeling

  6. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  7. Surprises in astrophysical gasdynamics

    CERN Document Server

    Balbus, Steven A

    2016-01-01

Much of astrophysics consists of the study of ionised gas under the influence of gravitational and magnetic fields. Thus, it is not possible to understand the astrophysical universe without a detailed knowledge of the dynamics of magnetised fluids. Fluid dynamics is, however, a notoriously tricky subject, in which it is all too easy for one's a priori intuition to go astray. In this review, we seek to guide the reader through a series of illuminating yet deceptive problems, all with an enlightening twist. We cover a broad range of topics including the instabilities acting in accretion discs, the hydrodynamics governing the convective zone of the Sun, the magnetic shielding of a cooling galaxy cluster, and the behaviour of thermal instabilities and evaporating clouds. The aim of this review is to surprise and intrigue even veteran astrophysical theorists with an idiosyncratic choice of problems and counterintuitive results. At the same time, we endeavour to bring forth the fundamental ideas, to set out import...

  8. SWAT meta-modeling as support of the management scenario analysis in large watersheds.

    Science.gov (United States)

    Azzellino, A; Çevirgen, S; Giupponi, C; Parati, P; Ragusa, F; Salvetti, R

    2015-01-01

In the last two decades, numerous models and modeling techniques have been developed to simulate nonpoint source pollution effects. Most models simulate the hydrological, chemical, and physical processes involved in the entrainment and transport of sediment, nutrients, and pesticides. Very often these models require a distributed modeling approach and are limited in scope by the requirement of homogeneity and by the need to manipulate extensive data sets. Physically based models are extensively used in this field as decision support for managing nonpoint source emissions. A common characteristic of this type of model is its demanding input of many state variables, which makes calibration difficult and raises the effort and cost of implementing any simulation scenario. In this study the USDA Soil and Water Assessment Tool (SWAT) was used to model the Venice Lagoon Watershed (VLW), Northern Italy. A Multi-Layer Perceptron (MLP) network was trained on SWAT simulations and used as a meta-model for scenario analysis. The MLP meta-model was successfully trained and showed an overall accuracy higher than 70% on both the training and the evaluation set, allowing a significant simplification in conducting scenario analysis.
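The idea of training a small network as a stand-in for an expensive simulator can be sketched as follows (a hypothetical toy: a one-hidden-layer perceptron fitted by stochastic gradient descent to a synthetic response; not the study's SWAT-trained network or its data):

```python
import math
import random

# Hypothetical sketch: a tiny one-hidden-layer perceptron learns a cheap
# surrogate of a toy "simulator" response y = x^2; not the study's network.
random.seed(1)
xs = [i / 10 - 1.0 for i in range(21)]         # inputs in [-1, 1]
ys = [x * x for x in xs]                       # "expensive model" response

H = 6                                          # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)] # input -> hidden weights
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)] # hidden -> output weights
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, b2 + sum(w2[j] * h[j] for j in range(H))

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

before = mse()
lr = 0.05
for _ in range(2000):                          # plain SGD with backprop
    for x, y in zip(xs, ys):
        h, out = forward(x)
        err = out - y
        b2 -= lr * err
        for j in range(H):
            grad_h = err * w2[j] * (1 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h
after = mse()
```

Once trained, `forward(x)[1]` stands in for the expensive model during scenario sweeps, which is what makes large scenario analyses tractable.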

  9. Surprises in astrophysical gasdynamics

    Science.gov (United States)

    Balbus, Steven A.; Potter, William J.

    2016-06-01

    Much of astrophysics consists of the study of ionized gas under the influence of gravitational and magnetic fields. Thus, it is not possible to understand the astrophysical universe without a detailed knowledge of the dynamics of magnetized fluids. Fluid dynamics is, however, a notoriously tricky subject, in which it is all too easy for one’s a priori intuition to go astray. In this review, we seek to guide the reader through a series of illuminating yet deceptive problems, all with an enlightening twist. We cover a broad range of topics including the instabilities acting in accretion discs, the hydrodynamics governing the convective zone of the Sun, the magnetic shielding of a cooling galaxy cluster, and the behaviour of thermal instabilities and evaporating clouds. The aim of this review is to surprise and intrigue even veteran astrophysical theorists with an idiosyncratic choice of problems and counterintuitive results. At the same time, we endeavour to bring forth the fundamental ideas, to set out important assumptions, and to describe carefully whatever novel techniques may be appropriate to the problem at hand. By beginning at the beginning, and analysing a wide variety of astrophysical settings, we seek not only to make this review suitable for fluid dynamic veterans, but to engage novice recruits as well with what we hope will be an unusual and instructive introduction to the subject.

  10. Surprises in astrophysical gasdynamics.

    Science.gov (United States)

    Balbus, Steven A; Potter, William J

    2016-06-01

    Much of astrophysics consists of the study of ionized gas under the influence of gravitational and magnetic fields. Thus, it is not possible to understand the astrophysical universe without a detailed knowledge of the dynamics of magnetized fluids. Fluid dynamics is, however, a notoriously tricky subject, in which it is all too easy for one's a priori intuition to go astray. In this review, we seek to guide the reader through a series of illuminating yet deceptive problems, all with an enlightening twist. We cover a broad range of topics including the instabilities acting in accretion discs, the hydrodynamics governing the convective zone of the Sun, the magnetic shielding of a cooling galaxy cluster, and the behaviour of thermal instabilities and evaporating clouds. The aim of this review is to surprise and intrigue even veteran astrophysical theorists with an idiosyncratic choice of problems and counterintuitive results. At the same time, we endeavour to bring forth the fundamental ideas, to set out important assumptions, and to describe carefully whatever novel techniques may be appropriate to the problem at hand. By beginning at the beginning, and analysing a wide variety of astrophysical settings, we seek not only to make this review suitable for fluid dynamic veterans, but to engage novice recruits as well with what we hope will be an unusual and instructive introduction to the subject.

  11. Some Surprises in Relativistic Gravity

    CERN Document Server

    Santos, N O

    2016-01-01

    General Relativity has had tremendous success both on the theoretical and the experimental fronts for over a century now. However, the contents of the theory are far from exhausted. Only very recently, with the detection of gravitational waves from colliding black holes, we have started probing the behavior of gravity in the strongly non-linear regime. Even today, the studies of black holes keep revealing more and more paradoxes and bizarre results. In this paper, inspired by David Hilbert's startling observation, we show that, contrary to the conventional wisdom, a freely falling test particle feels gravitational repulsion by a black hole as seen by the asymptotic observer. We dig deeper into this surprising behavior of relativistic gravity and offer some explanations.

  12. A Meta-model Describing the Development Process of Mobile Learning

    Science.gov (United States)

    Wingkvist, Anna; Ericsson, Morgan

    This paper presents a meta-model to describe the development process of mobile learning initiatives. These initiatives are often small scale trials that are not integrated in the intended setting, but carried out outside of the setting. This results in sustainability issues, i.e., problems to integrate the results of the initiative as learning aids. In order to address the sustainability issues, and in turn help to understand the scaling process, a meta-model is introduced. This meta-model divides the development into four areas of concern, and the life cycle of any mobile learning initiative into four stages. The meta-model was developed by analyzing and describing how a podcasting initiative was developed, and is currently being evaluated as a tool to both describe and evaluate mobile learning initiatives. The meta-model was developed based on a mobile learning initiative, but the meta-model itself is extendible to other forms of technology-enhanced learning.

  13. Fast Quadratic Local Meta-Models for Evolutionary Optimization of Anguilliform Swimmers

    OpenAIRE

    Kern, Stefan; Hansen, Nikolaus; Koumoutsakos, Petros

    2004-01-01

We combine second order local regression meta-models with the Covariance Matrix Adaptation Evolution Strategy in order to enhance its efficiency in the optimization of computationally expensive problems. Computationally intensive direct numerical simulations of an anguilliform swimmer provide the testbed for the optimization. We propose two concepts to reduce the computational cost of the meta-model building. The novel versions of the local meta-model assisted Evolutio...

  14. Some meta-modeling and optimization techniques for helicopter pre-sizing.

    OpenAIRE

    Tremolet, A.; Basset, P.M.

    2012-01-01

    Optimization and meta-models are key elements of modern engineering techniques. The Multidisciplinary Design Optimization (MDO) allows solving strongly coupled physical problems aiming at the global system optimization. For these multidisciplinary optimizations, meta-models can be required as surrogates for complex and high computational cost codes. Meta-modeling is also used for catching general trends and underlying relationships between parameters within a database. The application of thes...

  15. Feature and Meta-Models in Clafer: Mixed, Specialized, and Coupled

    DEFF Research Database (Denmark)

    Bąk, Kacper; Czarnecki, Krzysztof; Wasowski, Andrzej

    2011-01-01

We present Clafer, a meta-modeling language with first-class support for feature modeling. We designed Clafer as a concise notation for meta-models, feature models, mixtures of meta- and feature models (such as components with options), and models that couple feature models and meta-models via co... models concisely and show that Clafer meets its design objectives using a sample product line. We evaluated Clafer and how it lends itself to analysis on sample feature models, meta-models, and model templates of an E-Commerce platform...

  16. ROLE OF META-MODEL IN ENGINEERING DATA WAREHOUSE

    Institute of Scientific and Technical Information of China (English)

SHEN Guo-hua; HUANG Zhi-qiu; WANG Chuan-dong

    2004-01-01

Engineering data are separately organized, and their schemas are increasingly complex and variable. Engineering data management systems need to manage unified data while being both customizable and extensible. The design of such systems depends heavily on the flexibility and self-description of the data model. The characteristics of engineering data and the facts of their management are analyzed. An engineering data warehouse (EDW) architecture and multi-layer metamodels are then presented, and an approach to managing and using engineering data through meta objects is proposed. Finally, an application, the flight test EDW system (FTEDWS), is described, in which meta-objects are used to manage engineering data in the data warehouse. This shows that adopting a meta-modeling approach provides support for interchangeability and a sufficiently flexible environment in which system evolution and reusability can be handled.

  17. Virtual tryout planning in automotive industry based on simulation metamodels

    Science.gov (United States)

    Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.

    2016-11-01

Deep drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations in process and material properties often lead to robustness problems, and numerical simulations are therefore used to detect the critical regions. To enhance agreement with real process conditions, the material data are acquired through a variety of experiments, and the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point and to adjust process settings if the process becomes unstable. Furthermore, tool tryout time can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, to save time, and to recognize complex relationships.

  18. Application of Metamodels to Identification of Metallic Materials Models

    Directory of Open Access Journals (Sweden)

    Maciej Pietrzyk

    2016-01-01

Improvement of the efficiency of the inverse analysis (IA) for various material tests was the objective of the paper. Flow stress models and microstructure evolution models of various complexity of mathematical formulation were considered. Different types of experiments were performed and the results were used for the identification of the models. Sensitivity analysis was performed for all the models and the importance of their parameters was evaluated. Metamodels based on artificial neural networks were proposed to simulate the experiments in the inverse solution. The analysis performed showed that a significant decrease in computing time can be achieved when metamodels substitute the finite element model in the inverse analysis, which is the case in the identification of flow stress models. Application of metamodels gave good results for flow stress models based on closed form equations accounting for the influence of temperature, strain, and strain rate (4 coefficients), and additionally for softening due to recrystallization (5 coefficients) and for softening and saturation (7 coefficients). Good accuracy and high efficiency of the IA were confirmed. On the contrary, identification of microstructure evolution models, including phase transformation models, did not give a noticeable reduction in computing time.

  19. Multi-Criterion Optimal Design of Automotive Door Based on Metamodeling Technique and Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

A method for optimizing automotive doors under multiple criteria involving side impact, stiffness, natural frequency, and structural weight is presented. A metamodeling technique is employed to construct approximations that replace the computationally expensive simulation models. The approximating functions for stiffness and natural frequency are constructed using Taylor series approximation. Three popular approximation techniques, i.e. polynomial response surface (PRS), stepwise regression (SR), and Kriging, are compared for their accuracy in constructing the side impact functions. Uniform design is employed to sample the design space of the door impact analysis. The optimization problem is solved by a multi-objective genetic algorithm. The SR technique is found to be superior to PRS and Kriging in terms of accuracy in this study. The numerical results demonstrate that the method successfully generates a well-spread Pareto optimal set, from which decision makers can select the most suitable design according to the vehicle program and its application.
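The selection step at the heart of such multi-objective optimization is Pareto dominance, which can be shown with a tiny filter (hypothetical design names and scores; both criteria are minimized, e.g. weight and intrusion):

```python
# Hypothetical sketch: extract the Pareto (non-dominated) set from candidate
# designs scored on two criteria to be minimized.
designs = {"A": (10.0, 3.0), "B": (12.0, 2.0), "C": (11.0, 3.5), "D": (9.0, 4.0)}

def dominates(p, q):
    # p dominates q if p is no worse in every criterion and better in at least one
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

pareto = [name for name, p in designs.items()
          if not any(dominates(q, p) for other, q in designs.items() if other != name)]
```

Here design C is dominated by A (worse on both criteria), so the Pareto set is A, B, D; a genetic algorithm such as NSGA-II applies this test generation after generation to evolve a well-spread front.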

  20. Surprising radiation detectors

    CERN Document Server

    Fleischer, Robert

    2003-01-01

Radiation doses received by the human body can be measured indirectly and retrospectively by counting the tracks left by particles in ordinary objects such as a pair of spectacles, glassware, or compact disks. This method has been successfully applied to determine neutron radiation doses received 50 years earlier at the Hiroshima site. Neutrons themselves do not leave tracks in bulk matter, but glass contains uranium atoms that may fission when hit by a neutron, and the recoil of the fission fragments generates a detectable track. The most difficult part is to find suitable glass items and to evaluate the radiation shielding they benefited from in their original location. The same method has been used to determine the radiation dose due to the pile-up of radon in houses. In that case, the tracks left by alpha particles from the radioactive decay of polonium-210 were counted on the superficial layer of the window panes. Other materials, such as polycarbonate plastics, have been used to determine the radiation dose due to heavy io...

  1. idSpace D2.3 – Semantic meta-model integration and transformations v2

    DEFF Research Database (Denmark)

    Dolog, Peter; Grube, Pascal; Schmid, Klaus;

    2009-01-01

This deliverable discusses an extended set of requirements for transformations and a metamodel for creativity techniques. Based on the requirements, the deliverable provides a refined meta-model. The metamodel allows for more advanced transformation concepts besides the previously delivered graph...

  2. A MOF Metamodel for the Development of Context-Aware Mobile Applications

    NARCIS (Netherlands)

    Guareis de Farias, C.R.; Leite, M.M.; Calvi, C.Z.; Mantovaneli Pessoa, R.; Pereira Filho, J.

    2007-01-01

    Context-aware mobile applications are increasingly attracting interest of the research community. To facilitate the development of this class of applications, it is necessary that both applications and support platforms share a common context metamodel. This paper presents a metamodel defined using

  3. MISTRAL: A language for model transformations in the MOF meta-modeling architecture

    NARCIS (Netherlands)

    Kurtev, I.; Berg, van den K.G.

    2005-01-01

    In the Meta Object Facility (MOF) meta-modeling architecture a number of model transformation scenarios can be identified. It could be expected that a meta-modeling architecture will be accompanied by a transformation technology supporting the model transformation scenarios in a uniform way. Despite

  4. A MOF Metamodel for the Development of Context-Aware Mobile Applications

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Leite, M.M.; Calvi, C.Z.; Mantovaneli Pessoa, Rodrigo; Pereira Filho, J.G.; Pereira Filho, J.

    Context-aware mobile applications are increasingly attracting the interest of the research community. To facilitate the development of this class of applications, it is necessary that both applications and support platforms share a common context metamodel. This paper presents a metamodel defined using

  5. Design and Use of CSP Meta-Model for Embedded Control Software Development

    NARCIS (Netherlands)

    Bezemer, Maarten M.; Wilterdink, Robert J.W.; Broenink, Jan F.; Welch, Peter H.; Barnes, Frederick R.M.; Chalmers, Kevin; Baekgaard Pedersen, Jan; Sampson, Adam T.

    2012-01-01

    Software that is used to control machines and robots must be predictable and reliable. Model-Driven Design (MDD) techniques are used to comply with both the technical and business needs. This paper introduces a CSP meta-model that is suitable for these MDD techniques. The meta-model describes the st

  6. Surprise... Surprise..., An Empirical Investigation on How Surprise is Connected to Customer Satisfaction

    NARCIS (Netherlands)

    J. Vanhamme (Joëlle)

    2003-01-01

    This research investigates the specific influence of the emotion of surprise on customer transaction-specific satisfaction. Four empirical studies, two field studies (a diary study and a cross-sectional survey) and two experiments, were conducted. The results show that surprise positively

  8. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Directory of Open Access Journals (Sweden)

    Yaping Ju

    2016-05-01

    Full Text Available The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels, that is, artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.
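
The surrogate-plus-Monte-Carlo pattern the abstract describes can be sketched in a few lines; a toy quadratic stands in for the CFD model and a simple interpolating polynomial for the GA-SVR metamodel (all functions, distributions and thresholds here are invented for illustration):

```python
import random

random.seed(0)

def expensive_model(x):
    # Hypothetical stand-in for a costly aerodynamic simulation.
    return 1.5 * x * x - 0.4 * x + 2.0

# Build a quadratic interpolating surrogate from three "expensive" runs.
pts = [(x, expensive_model(x)) for x in (0.0, 0.5, 1.0)]

def surrogate(x):
    # Lagrange interpolation through the three sampled points.
    (x0, y0), (x1, y1), (x2, y2) = pts
    return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
            + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
            + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

# Monte Carlo on the cheap surrogate: x uncertain, estimate P(response > 2.1).
samples = [random.gauss(0.5, 0.1) for _ in range(20000)]
p_exceed = sum(surrogate(x) > 2.1 for x in samples) / len(samples)
```

The 20 000 Monte Carlo evaluations hit only the surrogate; the expensive model is called just three times, which is the whole point of the technique.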

  9. Comparative study, based on metamodels, of methods for controlling performance

    Directory of Open Access Journals (Sweden)

    Aitouche Samia

    2012-05-01

    Full Text Available The continuing evolution of technology and human behavior places the company in an uncertain and changing environment. The company must be responsive and even proactive; therefore, controlling performance becomes increasingly difficult. Choosing the best method of ensuring control in line with the company's management policy and strategy is itself a decision problem. The aim of this paper is a comparative study of three methods, the Balanced Scorecard, GIMSI and SKANDIA's NAVIGATOR, in order to choose the method best suited to following the company's policy while maintaining its durability. Our work is divided into three parts. First, we propose original structural and kinetic metamodels for the three methods that allow an overall view of each method. Second, based on the three metamodels, we draw a generic comparison to analyze the completeness of each method. Third, we perform a restrictive comparison based on a limited set of criteria related to a single aspect, for example organizational learning, one of the building blocks of knowledge management for moving toward a proactive organization in a disturbed and uncertain environment. We note that the three methods were applied in our previous works [1].

  10. An Object-Oriented Metamodel for Bunge-Wand-Weber Ontology

    CERN Document Server

    Kiwelekar, Arvind W

    2010-01-01

    A UML-based metamodel for Bunge-Wand-Weber (BWW) ontology is presented. BWW ontology is a generic framework for the analysis and conceptualization of real-world objects. It includes categories that can be applied to analyze and classify objects found in an information system. In the context of BWW ontology, the metamodel is a representation of the ontological categories and the relationships among them. An objective behind developing an object-oriented metamodel has been to model BWW ontology in terms of widely used notions in software development. The main contributions of this paper are a classification for ontological categories, a description template, and representations through UML and type-based models.

  11. OPTIMIZATION METHOD FOR VIRTUAL PRODUCT DEVELOPMENT BASED ON SIMULATION METAMODEL AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Pan Jun; Fan Xiumin; Ma Dengzhe; Jin Ye

    2003-01-01

    Virtual product development (VPD) is essentially based on simulation. Due to computational inefficiency, traditional engineering simulation software and optimization methods are inadequate for analyzing optimization problems in VPD. An optimization method based on simulation metamodels for virtual product development is proposed to satisfy the needs of complex optimal designs driven by VPD. This method extends the current design of experiments (DOE) with various metamodeling technologies. Simulation metamodels are built to approximate detailed simulation codes, so as to provide a link between optimization and simulation, or to serve as a bridge for simulation software integration among different domains. An example of optimal design for a composite material structure is used to demonstrate the newly introduced method.

  12. Surprise as a design strategy

    NARCIS (Netherlands)

    Ludden, G.D.S.; Schifferstein, H.N.J.; Hekkert, P.P.M.

    2008-01-01

    Imagine yourself queuing for the cashier’s desk in a supermarket. Naturally, you have picked the wrong line, the one that does not seem to move at all. Soon, you get tired of waiting. Now, how would you feel if the cashier suddenly started to sing? Many of us would be surprised and, regardless of

  14. Towards a new metamodel for the Task Flow Model of the Discovery Method

    CERN Document Server

    Fernandez-y-Fernandez, Carlos Alberto

    2012-01-01

    This paper presents our proposal for the evolution of the metamodel for the Task Algebra in the Task Flow model for the Discovery Method. The original Task Algebra is based on simple and compound tasks structured using operators such as sequence, selection, and parallel composition. Recursion and encapsulation were also considered. We propose additional characteristics to improve the capabilities of the metamodel to represent accurately the Task Flow Model.

  15. A Hybrid Spline Metamodel for Photovoltaic/Wind/Battery Energy Systems

    OpenAIRE

    ZAIBI, Malek; LAYADI, Toufik Madani; Champenois, Gérard; Roboam, Xavier; Sareni, Bruno; Belhadj, Jamel

    2015-01-01

    This paper proposes a metamodel design for a photovoltaic/wind/battery energy system. The modeling of a hybrid PV/wind generator coupled with two kinds of storage, electric (battery) and hydraulic (tank) devices, is investigated. A metamodel is built by hybrid spline interpolation to capture the relationships between several design variables, i.e. the design parameters of the different subsystems, and their associated response variables, i.e. system performance indicators. The developed model...

  16. Metamodels for Optimum Design of Outer-Rotor Synchronous Reluctance Motor

    Directory of Open Access Journals (Sweden)

    Lavrinovicha Ludmila

    2014-05-01

    Full Text Available A new design of a synchronous reluctance motor with a segment-shaped outer rotor is presented and investigated in this paper. In order to obtain correct recommendations for the optimal design of the studied synchronous reluctance motor, analytical relations between the motor's electromagnetic parameters and its geometrical dimensions (also known as metamodels) have been synthesized. The electromagnetic parameters used for metamodel synthesis are obtained by means of magnetostatic field numerical calculations with the finite element method, using the software QuickField.

  17. Performance Comparison of Two Meta-Model for the Application to Finite Element Model Updating of Structures

    Institute of Scientific and Technical Information of China (English)

    Yang Liu; DeJun Wang; Jun Ma; Yang Li

    2014-01-01

    To investigate the application of meta-models to finite element (FE) model updating of structures, the performance of two popular meta-models, i.e., the Kriging model and the response surface model (RSM), was compared in detail. Firstly, the two kinds of meta-model were introduced briefly. Secondly, some key issues in the application of meta-models to FE model updating of structures were proposed and discussed, and some advice was presented on selecting a reasonable meta-model for the purpose of updating the FE model of a structure. Finally, the procedure of FE model updating based on a meta-model was implemented by updating the FE model of a truss bridge with the measured modal parameters. The results showed that the Kriging model was more suitable for FE model updating of complex structures.
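
The comparison protocol, fit competing meta-models on the same design sites and score them on held-out responses, can be sketched as follows; nearest-neighbour interpolation and a least-squares line stand in for the Kriging and response-surface models (the response function is invented):

```python
import math

def true_response(x):
    # Hypothetical stand-in for a measured modal-parameter response.
    return math.sin(3 * x) + 0.5 * x

train = [i / 9 for i in range(10)]        # design sites
hold = [0.05 + i / 8 for i in range(8)]   # holdout sites
ytr = [true_response(x) for x in train]

def nn_surrogate(x):
    # Nearest-neighbour interpolation, standing in for a Kriging predictor.
    i = min(range(len(train)), key=lambda j: abs(train[j] - x))
    return ytr[i]

# Linear least-squares fit y ~ a*x + b, standing in for a response surface.
n, sx, sy = len(train), sum(train), sum(ytr)
sxx = sum(x * x for x in train)
sxy = sum(x * y for x, y in zip(train, ytr))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def rmse(f):
    # Root-mean-square error over the holdout sites.
    return math.sqrt(sum((f(x) - true_response(x)) ** 2 for x in hold) / len(hold))

err_nn = rmse(nn_surrogate)
err_lin = rmse(lambda x: a * x + b)
best = "interpolating" if err_nn < err_lin else "regression"
```

Whichever family scores the lower holdout error is the one recommended for the updating run, which mirrors the selection advice the paper derives.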

  18. Meta-model of EPortfolio Usage in Different Environments

    Directory of Open Access Journals (Sweden)

    Igor Balaban

    2011-09-01

    Full Text Available EPortfolio offers a new philosophy of teaching and learning, giving learners an opportunity to express themselves and to show their past work and experience to all interested parties, ranging from teachers to potential employers. However, an integral model for ePortfolio implementation in academic institutions that would take into account three different levels of stakeholders: 1. individual (student and teacher); 2. institution; and 3. employer, currently does not exist. In this paper the role of ePortfolio in the academic environment, as well as the context in which ePortfolio operates, is analyzed in detail. As a result of a comprehensive analysis that takes into account the individual, the academic institution and the employer, a meta-model of ePortfolio usage in Lifelong Learning is proposed.

  19. Modeling Enterprise Authorization: A Unified Metamodel and Initial Validation

    Directory of Open Access Journals (Sweden)

    Matus Korman

    2016-07-01

    Full Text Available Authorization and its enforcement, access control, have stood at the beginning of the art and science of information security, and remain a crucial pillar of security in information technology (IT) and enterprise operations. Dozens of different models of access control have been proposed. Although Enterprise Architecture as a discipline strives to support the management of IT, support for modeling access policies in enterprises is often lacking, both in terms of supporting the variety of individual models of access control in use today, and in terms of providing a unified ontology capable of flexibly expressing access policies for all or most of those models. This study summarizes a number of existing models of access control, proposes a unified metamodel mapped to ArchiMate, and illustrates its use on a selection of example scenarios and two business cases.

  20. The Lagrangian Ensemble metamodel for simulating plankton ecosystems

    Science.gov (United States)

    Woods, J. D.

    2005-10-01

    This paper presents a detailed account of the Lagrangian Ensemble (LE) metamodel for simulating plankton ecosystems. It uses agent-based modelling to describe the life histories of many thousands of individual plankters. The demography of each plankton population is computed from those life histories. So too is bio-optical and biochemical feedback to the environment. The resulting “virtual ecosystem” is a comprehensive simulation of the plankton ecosystem. It is based on phenotypic equations for individual micro-organisms. LE modelling differs significantly from population-based modelling. The latter uses prognostic equations to compute demography and biofeedback directly. LE modelling diagnoses them from the properties of individual micro-organisms, whose behaviour is computed from prognostic equations. That indirect approach permits the ecosystem to adjust gracefully to changes in exogenous forcing. The paper starts with theory: it defines the Lagrangian Ensemble metamodel and explains how LE code performs a number of computations “behind the curtain”. They include budgeting chemicals, and deriving biofeedback and demography from individuals. The next section describes the practice of LE modelling. It starts with designing a model that complies with the LE metamodel. Then it describes the scenario for exogenous properties that provide the computation with initial and boundary conditions. These procedures differ significantly from those used in population-based modelling. The next section shows how LE modelling is used in research, teaching and planning. The practice depends largely on hindcasting to overcome the limits to predictability of weather forecasting. The scientific method explains observable ecosystem phenomena in terms of finer-grained processes that cannot be observed, but which are controlled by the basic laws of physics, chemistry and biology. What-If? Prediction (WIP), used for planning, extends hindcasting by adding events that describe

  1. A metamodeling approach to estimate N2O emissions from agricultural soils

    Science.gov (United States)

    Perlman, J.; Hijmans, R. J.; Horwath, W. R.

    2012-12-01

    Because of the complexity of process-based ecological models such as the "DeNitrification DeComposition" (DNDC) model, predictions made with such models can be difficult to explain, and it can take a very long time to run them. We developed metamodels of N2O emissions from maize and wheat by running DNDC for a diverse sample of global climate and soil types, and then fitting the model output as a function of model input using the Random Forest machine learning algorithm. Correlation coefficients between holdout data (DNDC output not used for the metamodel) and metamodel predictions were 0.97 and 0.95 for maize and wheat respectively. Making predictions with the metamodels is on the order of 4.5*10^4 times faster than running DNDC. For both maize and wheat, the metamodels show that DNDC predicts that N2O emissions are highly sensitive to soil organic carbon (SOC), somewhat sensitive to N-input, pH, clay fraction and temperature, and insensitive to bulk density, precipitation and irrigation. Wheat showed somewhat higher sensitivity than maize to most of the variables. We used the metamodels to estimate N2O emissions from maize and wheat crops at very high resolution and also present new global N2O emission estimates for these crops.
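
The sensitivity screening described above, probing a fitted metamodel precisely because it is cheap to evaluate, can be sketched with one-at-a-time perturbations; the surrogate below and its coefficients are invented for illustration, not the fitted DNDC metamodel:

```python
def n2o_surrogate(soc, n_input, ph, clay):
    # Hypothetical fitted response, strongly driven by SOC (coefficients invented).
    return 0.8 * soc ** 1.5 + 0.002 * n_input + 0.02 * ph + 0.03 * clay

# Baseline input values (also invented).
base = dict(soc=2.0, n_input=100.0, ph=6.5, clay=0.3)

def sensitivity(name, rel=0.1):
    # Response span when one input is perturbed +/-10% around its baseline.
    lo, hi = dict(base), dict(base)
    lo[name] *= 1 - rel
    hi[name] *= 1 + rel
    return abs(n2o_surrogate(**hi) - n2o_surrogate(**lo))

# Rank inputs from most to least influential.
ranked = sorted(base, key=sensitivity, reverse=True)
```

Running thousands of such probes against the real DNDC model would be prohibitive; against a fitted metamodel it is essentially free, which is how the SOC-dominance result above can be established.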

  2. Robust, Optimal Water Infrastructure Planning Under Deep Uncertainty Using Metamodels

    Science.gov (United States)

    Maier, H. R.; Beh, E. H. Y.; Zheng, F.; Dandy, G. C.; Kapelan, Z.

    2015-12-01

    Optimal long-term planning plays an important role in many water infrastructure problems. However, this task is complicated by deep uncertainty about future conditions, such as the impact of population dynamics and climate change. One way to deal with this uncertainty is by means of robustness, which aims to ensure that water infrastructure performs adequately under a range of plausible future conditions. However, as robustness calculations require computationally expensive system models to be run for a large number of scenarios, it is generally computationally intractable to include robustness as an objective in the development of optimal long-term infrastructure plans. In order to overcome this shortcoming, an approach is developed that uses metamodels instead of computationally expensive simulation models in robustness calculations. The approach is demonstrated for the optimal sequencing of water supply augmentation options for the southern portion of the water supply for Adelaide, South Australia. A 100-year planning horizon is subdivided into ten equal decision stages for the purpose of sequencing various water supply augmentation options, including desalination, stormwater harvesting and household rainwater tanks. The objectives include the minimization of average present value of supply augmentation costs, the minimization of average present value of greenhouse gas emissions and the maximization of supply robustness. The uncertain variables are rainfall, per capita water consumption and population. Decision variables are the implementation stages of the different water supply augmentation options. Artificial neural networks are used as metamodels to enable all objectives to be calculated in a computationally efficient manner at each of the decision stages. The results illustrate the importance of identifying optimal staged solutions to ensure robustness and sustainability of water supply into an uncertain long-term future.
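
The robustness calculation the abstract makes tractable, evaluating a cheap metamodel over many sampled futures and counting the fraction that meet a target, can be sketched as follows (the supply-balance surrogate, distributions and units are all invented):

```python
import random

random.seed(1)

def supply_metamodel(capacity, rainfall, demand_pc, population):
    # Hypothetical fast surrogate for the yearly supply balance.
    return capacity + 0.6 * rainfall - demand_pc * population

def robustness(capacity, n=10000):
    # Fraction of sampled futures in which supply meets demand.
    ok = 0
    for _ in range(n):
        rainfall = random.gauss(500.0, 80.0)     # uncertain rainfall
        demand_pc = random.gauss(0.25, 0.03)     # uncertain per-capita use
        population = random.gauss(1300.0, 150.0) # uncertain population
        if supply_metamodel(capacity, rainfall, demand_pc, population) >= 0.0:
            ok += 1
    return ok / n

r_small = robustness(20.0)   # small augmentation option
r_big = robustness(120.0)    # large augmentation option
```

With a simulation model each `robustness` call would mean thousands of expensive runs; with a metamodel it becomes cheap enough to sit inside an optimization objective, which is the approach the study takes.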

  3. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    Full Text Available We describe results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing the diverse bodies of expertise and information necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.

  4. Polynomial meta-models with canonical low-rank approximations: numerical insights and comparison to sparse polynomial chaos expansions

    OpenAIRE

    Konakli, Katerina; Sudret, Bruno

    2015-01-01

    The growing need for uncertainty analysis of complex computational models has led to an expanding use of meta-models across engineering and sciences. The efficiency of meta-modeling techniques relies on their ability to provide statistically-equivalent analytical representations based on relatively few evaluations of the original model. Polynomial chaos expansions (PCE) have proven a powerful tool for developing meta-models in a wide range of applications; the key idea thereof is to expand th...

  5. Brazilian rescue plan sparks surprise

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    According to the Financial Times, when Guido Mantega, Brazil's finance minister, suddenly proposed a “Bric” rescue package for the eurozone this week, he caught not only other world leaders by surprise but also many of his fellow countrymen. Even as officials from other members of the so-called Bric grouping, Russia, India and China, said it was the first they had heard of the idea, many ordinary Brazilians expressed shock at the notion of bailing out the world's richest trading bloc.

  6. Multi-objective optimization of gear forging process based on adaptive surrogate meta-models

    Science.gov (United States)

    Meng, Fanjuan; Labergere, Carl; Lafon, Pascal; Daniel, Laurent

    2013-05-01

    In the forging industry, net-shape or near-net-shape forging of gears has been the subject of considerable research effort in the last few decades. In this paper, a multi-objective optimization methodology for net-shape gear forging process design is discussed. The study consists of four main parts: building a parametric CAD geometry model, simulating the forging process, fitting surrogate meta-models, and optimizing the process using an advanced algorithm. In order to make the meta-models approximate the real response as closely as possible, an adaptive meta-model-based design strategy has been applied. This is a continuous process: first, build a preliminary version of the meta-models after the initial simulated calculations; second, improve the accuracy and update the meta-models by adding new representative samplings. By using this iterative strategy, the number of initial sample points for real numerical simulations is greatly decreased and the time for the forged gear design is significantly shortened. Finally, an optimal design for an industrial application of a 27-tooth gear forging process is introduced, which includes three optimization variables and two objective functions. A 3D FE numerical simulation model is used to realize the process and an advanced thermo-elasto-visco-plastic constitutive equation is considered to represent the material behavior. The meta-model applied in this example is kriging and the optimization algorithm is NSGA-II. In the end, a relatively good Pareto optimal front (POF) is obtained with the gradually improved surrogate meta-models.
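
The adaptive strategy, build a preliminary surrogate and then repeatedly add the sample where it currently errs most and refit, can be sketched in one dimension; piecewise-linear interpolation stands in for the kriging meta-model and a sine function for the FE simulation:

```python
import math

def expensive(x):
    # Stand-in for one costly forging FE simulation (hypothetical response).
    return math.sin(4 * x)

def surrogate(samples, x):
    # Piecewise-linear interpolation of the sampled points.
    pts = sorted(samples)
    if x <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x <= x1:
            return y0 + (x - x0) / (x1 - x0) * (y1 - y0)
    return pts[-1][1]

samples = [(x, expensive(x)) for x in (0.0, 0.5, 1.0)]  # preliminary design
cands = [i / 50 for i in range(51)]                     # candidate infill sites

def worst_error():
    return max(abs(surrogate(samples, x) - expensive(x)) for x in cands)

e_before = worst_error()
for _ in range(5):
    # Add the candidate where the current surrogate is worst, then refit.
    x_new = max(cands, key=lambda x: abs(surrogate(samples, x) - expensive(x)))
    samples.append((x_new, expensive(x_new)))
e_after = worst_error()
```

Each refinement costs exactly one "expensive" run, placed where it is most informative, which is why the adaptive loop needs far fewer initial samples than a fixed design.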

  7. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to utilized tools having no clear meta-model and semantics to communicate requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is then used to create an example model of an embedded system requirement specification.

  8. A hybrid model for mapping simplified seismic response via a GIS-metamodel approach

    Directory of Open Access Journals (Sweden)

    G. Grelle

    2014-02-01

    Full Text Available A hybrid model, consisting of GIS and metamodel (a model of a model) procedures, is introduced with the aim of estimating the 1-D spatial seismic site response. Inputs and outputs are provided and processed by means of an appropriate GIS model, named the GIS Cubic Model (GCM), which discretizes the seismic underground half-space in a pseudo-three-dimensional way. The GCM consists of a layered parametric structure aimed at resolving a predictive metamodel by means of pixel-to-pixel vertical computing. The metamodel, leading to the determination of a bilinear-polynomial function, is able to reproduce the classic shape of the spectral acceleration response in relation to the main physical parameters that characterize the spectrum itself. These parameters are (i) the average shear-wave velocity of the shallow layer, (ii) the fundamental period and (iii) the period at which the spatial spectral response is required. The metamodel is calibrated on theoretical spectral accelerations for the likely local Vs profiles, which are obtained using the Monte Carlo simulation technique on the basis of the GCM information. Therefore, via the GCM structure and the metamodel, the hybrid model provides maps of normalized acceleration response spectra. The hybrid model was applied and tested on the built-up area of the San Giorgio del Sannio village, located in a high-risk seismic zone of southern Italy.

  9. Structural optimization for a jaw using iterative Kriging metamodels

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Il Kwon; Han, Dong Seop; Han, Geun Jo; Lee, Kwon Hee [Dong-A University, Busan (Korea, Republic of)

    2008-09-15

    Rail clamps are mechanical components installed to fix the container crane to its lower members against wind blast or slip. Rail clamps should be designed to survive harsh wind loading conditions. In this study, a jaw structure, which is a part of a wedge-type rail clamp, is optimized with respect to its strength under a severe wind loading condition. According to the classification of structural optimization, the structural optimization of a jaw falls into the category of shape optimization. Conventional structural optimization methods have difficulties in defining complex shape design variables and preventing mesh distortions. To overcome these difficulties, a metamodel using the Kriging interpolation method is introduced to replace the true response by an approximate one. This research presents the shape optimization of a jaw using iterative Kriging interpolation models and a simulated annealing algorithm. The new Kriging models are iteratively constructed by refining the former Kriging models. This process is continued until the convergence criteria are satisfied. The optimum results obtained by the suggested method are compared with those obtained by the DOE (design of experiments) and VT (variation technology) methods built into ANSYS WORKBENCH.
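
The optimization loop, searching a cheap Kriging-like surrogate with simulated annealing, can be sketched as follows (the response function, bounds and cooling schedule are invented; the real study anneals over shape variables of the jaw):

```python
import math
import random

random.seed(7)

def surrogate_stress(x):
    # Hypothetical Kriging-like smooth response over a normalized shape variable.
    return (x - 0.3) ** 2 + 0.05 * math.cos(20 * x)

def anneal(f, lo=0.0, hi=1.0, steps=2000, t0=1.0):
    # Plain simulated annealing with a linear cooling schedule.
    x = random.uniform(lo, hi)
    best_x, best_f = x, f(x)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-3
        x_new = min(hi, max(lo, x + random.gauss(0.0, 0.1)))
        d = f(x_new) - f(x)
        if d < 0 or random.random() < math.exp(-d / t):
            x = x_new
            if f(x) < best_f:
                best_x, best_f = x, f(x)
    return best_x, best_f

x_star, f_star = anneal(surrogate_stress)
```

Because every annealing step hits only the surrogate, thousands of trial designs cost nothing; new FE runs are spent only when the Kriging model itself is refined.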

  10. General Meta-Models to Analysis of Software Architecture Definitions

    Directory of Open Access Journals (Sweden)

    GholamAli Nejad HajAli Irani

    2011-12-01

    Full Text Available An important step toward understanding software architecture is to provide it with a clear definition. More than 150 valid definitions have been presented for identifying software architecture, so a comparison among them is needed to give us a better understanding of the existing definitions. In this paper an analysis of different aspects of the current definitions is provided, based on their incorporated elements. To this end, the definitions are first collected and, after an analysis, broken into their constituent elements, which are shown in one table. Then selected parameters in the table are classified into groups for comparison purposes, and all parameters of each group are specified and compared with each other. This procedure is carried out for each group in turn. Finally, a meta-model is developed for each group. The aim is not to accept or reject a specific definition, but rather to contrast the definitions and their respective constituent elements in order to construct a background for gaining a better perception of software architecture, which in turn can benefit the introduction of an appropriate definition.

  11. Design Optimization of Centrifugal Pump Using Radial Basis Function Metamodels

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2014-05-01

    Full Text Available The optimization design of a centrifugal pump is a typical multiobjective optimization (MOO) problem. This paper presents an MOO design of a centrifugal pump with five decision variables and three objective functions, and a set of centrifugal pumps with various impeller shroud shapes is studied by CFD numerical simulations. The important performance indexes for a centrifugal pump, such as head, efficiency, and required net positive suction head (NPSHr), are investigated, and the results indicate that the geometric shape of the impeller shroud has a strong effect on the pump's performance indexes. Based on these, radial basis function (RBF) metamodels are constructed to approximate the functional relationship between the shape parameters of the impeller shroud and the performance indexes of the pump. To achieve the objectives of maximizing head and efficiency and minimizing NPSHr simultaneously, the multiobjective evolutionary algorithm based on decomposition (MOEA/D) is applied to solve the tri-objective optimization problem, and a final design point is selected from the Pareto solution set by means of robust design. Compared with the values of the prototype test and CFD simulation, the solution at the final design point exhibits good consistency.

  12. Efficient PCA-driven EAs and metamodel-assisted EAs, with applications in turbomachinery

    Science.gov (United States)

    Kyriacou, Stylianos A.; Asouti, Varvara G.; Giannakoglou, Kyriakos C.

    2014-07-01

    This article presents methods to enhance the efficiency of Evolutionary Algorithms (EAs), particularly those assisted by surrogate evaluation models or metamodels. The gain in efficiency becomes important in applications related to industrial optimization problems with a great number of design variables. The development is based on the principal components analysis of the elite members of the evolving EA population, the outcome of which is used to guide the application of evolution operators and/or train dependable metamodels/artificial neural networks by reducing the number of sensory units. Regarding the latter, the metamodels are trained with less computing cost and yield more relevant objective function predictions. The proposed methods are applied to constrained, single- and two-objective optimization of thermal and hydraulic turbomachines.
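
The reduction step at the heart of the method, principal component analysis of the elite members, can be illustrated in two dimensions, where the leading direction of the covariance matrix has a closed form (the elite points below are invented):

```python
import math

# Hypothetical elite design vectors scattered along the direction y = 2x.
elite = [(i / 10, 2.0 * (i / 10) + 0.05 * ((-1) ** i)) for i in range(10)]

# Sample means and covariance entries.
mx = sum(p[0] for p in elite) / len(elite)
my = sum(p[1] for p in elite) / len(elite)
cxx = sum((p[0] - mx) ** 2 for p in elite) / len(elite)
cyy = sum((p[1] - my) ** 2 for p in elite) / len(elite)
cxy = sum((p[0] - mx) * (p[1] - my) for p in elite) / len(elite)

# Leading eigenvector of the 2x2 covariance matrix in closed form.
theta = 0.5 * math.atan2(2.0 * cxy, cxx - cyy)
direction = (math.cos(theta), math.sin(theta))
```

In the article's setting this leading direction (in many dimensions) guides the evolution operators and prunes the metamodel's sensory units; the 2-D case simply makes the geometry visible.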

  13. Screening and metamodeling of computer experiments with functional outputs. Application to thermal-hydraulic computations

    CERN Document Server

    Auder, Benjamin; Iooss, Bertrand; Marques, Michel

    2010-01-01

To perform uncertainty, sensitivity or optimization analysis on scalar variables calculated by a CPU-time-expensive computer code, a widely accepted methodology consists in first identifying the most influential uncertain inputs (by screening techniques), and then replacing the expensive model by a cheap-to-evaluate mathematical function, called a metamodel. This paper extends this methodology to the functional output case, for instance when the model output variables are curves. The screening approach is based on analysis of variance and principal component analysis of the output curves. The functional metamodelling consists of a curve classification step, a dimension reduction step, and then a classical metamodelling step. An industrial nuclear reactor application (dealing with uncertainties in the pressurized thermal shock analysis) illustrates all these steps.
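
The dimension-reduction and per-component metamodelling steps described above can be sketched in miniature. This is a hedged illustration, not the authors' implementation: the toy curve family, all variable names, and the cubic polynomial fit (standing in for the paper's classification and kriging machinery) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: each row of Y is an output curve f(t; x) for a scalar input x.
t = np.linspace(0.0, 1.0, 50)
x = rng.uniform(0.5, 2.0, 30)
Y = np.outer(x, np.sin(np.pi * t)) + np.outer(x**2, np.cos(2 * np.pi * t))

# Dimension reduction: centre the curves and keep the leading principal components.
mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean, full_matrices=False)
k = 2                                    # two components suffice for this toy family
scores = U[:, :k] * s[:k]                # each curve is reduced to k scalars

# One scalar metamodel per retained component (a cubic polynomial in x here).
coefs = [np.polyfit(x, scores[:, j], 3) for j in range(k)]

def predict_curve(x_new):
    """Reconstruct the full output curve for a new input from the k scalar metamodels."""
    sc = np.array([np.polyval(c, x_new) for c in coefs])
    return mean + sc @ Vt[:k]

err = np.abs(predict_curve(1.0) - (np.sin(np.pi * t) + np.cos(2 * np.pi * t))).max()
```

Because the toy curves span a two-dimensional subspace, the reconstruction at an unseen input is essentially exact; for real simulator output the number of retained components and the per-component metamodel would have to be chosen by cross-validation.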

  14. Double-stage Metamodel and Its Application in Aerodynamic Design Optimization

    Institute of Scientific and Technical Information of China (English)

    ZHANG Dehu; GAO Zhenghong; HUANG Likeng; WANG Mingliang

    2011-01-01

Constructing metamodels with global high fidelity in the design space is significant in engineering design. In this paper, a double-stage metamodel (DSM) which integrates the advantages of both interpolation metamodels and regression metamodels is constructed. It takes a regression model as the first stage to fit the overall distribution of the original model, and then an interpolation model of the regression model's approximation error is used as the second stage to improve accuracy. Under the same conditions and with the same samples, DSM exhibits higher fidelity and better represents the physical characteristics of the original model. Besides, in order to validate the DSM characteristics, three examples including the Ackley function, airfoil aerodynamic analysis and wing aerodynamic analysis are investigated. In the end, airfoil and wing aerodynamic design optimizations using a genetic algorithm are presented to verify the engineering applicability of DSM.
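
A minimal one-dimensional sketch of the two-stage idea, assuming (as an illustration only) a quadratic regression for the first stage and Gaussian RBF interpolation of its residuals for the second; the function names, test function and shape parameter are invented for the example.

```python
import numpy as np

def fit_dsm(X, y, eps=2.0):
    """Double-stage metamodel: global regression plus residual interpolation."""
    # Stage 1: least-squares quadratic regression fits the overall trend.
    A = np.column_stack([np.ones(len(X)), X, X**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    # Stage 2: Gaussian RBF interpolation of the stage-1 approximation error.
    Phi = np.exp(-(eps * np.abs(X[:, None] - X[None, :]))**2)
    w = np.linalg.solve(Phi, y - A @ coef)

    def predict(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        trend = np.column_stack([np.ones(len(x)), x, x**2]) @ coef
        phi = np.exp(-(eps * np.abs(x[:, None] - X[None, :]))**2)
        return trend + phi @ w

    return predict

# Toy check: the second stage makes the DSM interpolate its samples,
# while the first stage carries the global quadratic trend.
X = np.linspace(-2.0, 2.0, 9)
y = X**2 + 0.3 * np.sin(5.0 * X)
model = fit_dsm(X, y)
```

At the training samples the prediction reproduces the data up to solver round-off, which is the interpolation property the second stage adds on top of the regression trend.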

  15. Multi-Objective Optimization Algorithms Design based on Support Vector Regression Metamodeling

    Directory of Open Access Journals (Sweden)

    Qi Zhang

    2013-11-01

Full Text Available In order to solve multi-objective optimization problems in complex engineering, this paper presents an NSGA-II multi-objective optimization algorithm based on support vector regression (SVR) metamodeling. Appropriate design parameter samples are selected by experimental design theories, and the response samples are obtained from experiments or numerical simulations. The SVR method is used to establish metamodels of the objective performance functions and constraints, and the original optimization problem is reconstructed. The reconstructed metamodels are then solved by the NSGA-II algorithm, and the structural optimization of a microwave power divider is taken as an example to illustrate the proposed methodology. The results show that this methodology is feasible and highly effective, and thus it can be used in optimum design in engineering fields.

  16. Evaluation of convergence behavior of metamodeling techniques for bridging scales in multi-scale multimaterial simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Oishik, E-mail: oishik-sen@uiowa.edu [Mechanical and Industrial Engineering, The University of Iowa, Iowa City, IA 52242 (United States); Davis, Sean, E-mail: sean.davis@mail.mcgill.ca [Aerospace Engineering, San Diego State University, San Diego, CA 92115 (United States); Jacobs, Gustaaf, E-mail: gjacobs@sdsu.edu [Aerospace Engineering, San Diego State University, San Diego, CA 92115 (United States); Udaykumar, H.S., E-mail: hs-kumar@uiowa.edu [Mechanical and Industrial Engineering, The University of Iowa, Iowa City, IA 52242 (United States)

    2015-08-01

The effectiveness of several metamodeling techniques, viz. the Polynomial Stochastic Collocation method, the Adaptive Stochastic Collocation method, a Radial Basis Function Neural Network, a Kriging method and a Dynamic Kriging (DKG) method, is evaluated. This is done with the express purpose of using metamodels to bridge scales between micro- and macro-scale models in a multi-scale multimaterial simulation. The rate of convergence of the error when used to reconstruct hypersurfaces of known functions is studied. For a sufficiently large number of training points, Stochastic Collocation methods generally converge faster than the other metamodeling techniques, while the DKG method converges faster when the number of input points is less than 100 in a two-dimensional parameter space. Because the input points correspond to computationally expensive micro/meso-scale computations, the DKG is favored for bridging scales in a multi-scale solver.
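
The kind of convergence study described above can be reproduced in miniature: reconstruct a known function from growing training sets and record how the reconstruction error falls. This sketch uses a Gaussian RBF interpolant (one of the surveyed techniques) on a 1-D function; the shape-parameter scaling, the test function and the budgets are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def rbf_reconstruct(Xtr, ytr, Xte, eps):
    """Train a Gaussian-RBF interpolant on (Xtr, ytr) and evaluate it at Xte."""
    K = np.exp(-(eps * np.abs(Xtr[:, None] - Xtr[None, :]))**2)
    w = np.linalg.solve(K + 1e-10 * np.eye(len(Xtr)), ytr)
    return np.exp(-(eps * np.abs(Xte[:, None] - Xtr[None, :]))**2) @ w

f = lambda x: np.sin(2 * np.pi * x)        # known "hypersurface" (1-D here)
Xte = np.linspace(0.0, 1.0, 200)

errors = []
for n in (5, 10, 20, 40):                  # growing training budgets
    Xtr = np.linspace(0.0, 1.0, n)
    # Scale the shape parameter with the sample spacing to keep K well conditioned.
    yhat = rbf_reconstruct(Xtr, f(Xtr), Xte, eps=float(n))
    errors.append(np.max(np.abs(yhat - f(Xte))))
```

On this smooth target the maximum error drops as the training budget grows, mirroring the error-versus-cost trade-off the paper measures for the competing metamodels.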

  17. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms.

    Science.gov (United States)

    Oltean, Gabriel; Ivanciu, Laura-Nicoleta

    2016-01-01

The design and verification of complex electronic systems, especially analog and mixed-signal ones, prove to be extremely time consuming tasks if only circuit-level simulations are involved. A significant amount of time can be saved if a cost effective solution is used for the extensive analysis of the system under all conceivable conditions. This paper proposes a data-driven method to build fast to evaluate, but also accurate, metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameter combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1x10-5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general purpose computer), and simplicity (less than 1 second for running the metamodel; the user only provides the parameter combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space. 
A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each parameter on the

  20. Den Dingen auf den Grund gehen. Meta-Modell Arbeit in der E-Mailberatung.

    Directory of Open Access Journals (Sweden)

    Stefan Schumacher

    2011-10-01

Full Text Available Starting from the question of how clarity of information and reliability of interpretation can be improved in written communication, the author describes the so-called Meta-Model of language, which was introduced by the founders of Neuro-Linguistic Programming (NLP) against the background of the work of the linguist Noam Chomsky. Bandler and Grinder describe various forms of linguistic deletion that can arise in communication. The author highlights the relevance of this for online counselling and uses practical examples to show how the Meta-Model of language can be applied in counselling work.

  1. Some Surprising Introductory Physics Facts and Numbers

    Science.gov (United States)

    Mallmann, A. James

    2016-01-01

    In the entertainment world, people usually like, and find memorable, novels, short stories, and movies with surprise endings. This suggests that classroom teachers might want to present to their students examples of surprising facts associated with principles of physics. Possible benefits of finding surprising facts about principles of physics are…

  2. Young Galaxy's Magnetism Surprises Astronomers

    Science.gov (United States)

    2008-10-01

Astronomers have made the first direct measurement of the magnetic field in a young, distant galaxy, and the result is a big surprise. Looking at a faraway protogalaxy seen as it was 6.5 billion years ago, the scientists measured a magnetic field at least 10 times stronger than that of our own Milky Way. They had expected just the opposite. The scientists made the discovery using the National Science Foundation's ultra-sensitive Robert C. Byrd Green Bank Telescope (GBT) in West Virginia. "This new measurement indicates that magnetic fields may play a more important role in the formation and evolution of galaxies than we have realized," said Arthur Wolfe, of the University of California-San Diego (UCSD). At its great distance, the protogalaxy is seen as it was when the Universe was about half its current age. According to the leading theory, cosmic magnetic fields are generated by the dynamos of rotating galaxies -- a process that would produce stronger fields with the passage of time. In this scenario, the magnetic fields should be weaker in the earlier Universe, not stronger. The new, direct magnetic-field measurement comes on the heels of a July report by Swiss and American astronomers who made indirect measurements that also implied strong magnetic fields in the early Universe. "Our results present a challenge to the dynamo model, but they do not rule it out," Wolfe said. There are other possible explanations for the strong magnetic field seen in the one protogalaxy Wolfe's team studied. "We may be seeing the field close to the central region of a massive galaxy, and we know such fields are stronger toward the centers of nearby galaxies. Also, the field we see may have been amplified by a shock wave caused by the collision of two galaxies," he said. The protogalaxy studied with the GBT, called DLA-3C286, consists of gas with little or no star formation occurring in it. 
The astronomers suspect that

  3. A surprising exception. Himachal's success in promoting female education.

    Science.gov (United States)

    Dreze, J

    1999-01-01

Gender inequalities in India are derived partly from the economic dependence of women on men. Low levels of formal education among women reinforce the asymmetry of power between the sexes. A general pattern of sharp gender bias in education levels is noted in most Indian states; however, in the small state of Himachal Pradesh, school participation rates are almost as high for girls as for boys. The rate of school participation for girls at the primary level is close to universal in this state, and while gender bias persists at higher levels of education, it is much lower than elsewhere in India and rapidly declining. This was not the case 50 years ago; educational levels in Himachal Pradesh were no higher than in Bihar or Uttar Pradesh. Today, the spectacular transition towards universal elementary education in Himachal Pradesh has contributed to the impressive reduction of poverty, mortality, illness, undernutrition, and related deprivations.

  4. Comparison of most adaptive meta model With newly created Quality Meta-Model using CART Algorithm

    Directory of Open Access Journals (Sweden)

    Jasbir Malik

    2012-09-01

Full Text Available To ensure that the software developed is of high quality, it is now widely accepted that the various artifacts generated during the development process should be rigorously evaluated using a domain-specific quality model. However, a domain-specific quality model should be derived from a generic quality model which is time-proven, well-validated and widely accepted. This thesis lays down a clear definition of a quality meta-model and then identifies the various quality meta-models existing in the research and practice domains. The thesis then compares the existing quality meta-models, using a set of criteria, to identify which model is the most adaptable to various domains. The comparison categories are specified with the CART algorithm, which has a pure tree architecture making binary (true/false) decisions on the meta-models: if an item is found in a given category, it falls under the true branch, otherwise under the false branch.
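
A compact sketch of the CART idea referenced above: recursive binary (true/false) splits chosen to minimize Gini impurity. This is a generic illustration of the algorithm, not the thesis's comparison procedure; all function names and the toy dataset are invented.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p**2)

def best_split(X, y):
    """Exhaustive search for the (feature, threshold) pair minimizing weighted Gini."""
    best_j, best_t, best_g = None, None, gini(y)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue                   # skip degenerate splits
            g = (left.sum() * gini(y[left]) + (~left).sum() * gini(y[~left])) / len(y)
            if g < best_g:
                best_j, best_t, best_g = j, t, g
    return best_j, best_t

def grow(X, y, depth=2):
    """Recursively grow a CART tree; a leaf stores the majority class."""
    j, t = best_split(X, y)
    if depth == 0 or j is None:            # no improving split, or depth exhausted
        vals, counts = np.unique(y, return_counts=True)
        return int(vals[counts.argmax()])
    left = X[:, j] <= t
    return (j, t, grow(X[left], y[left], depth - 1), grow(X[~left], y[~left], depth - 1))

def predict(node, x):
    """Walk the true/false branches down to a leaf."""
    while isinstance(node, tuple):
        j, t, lo, hi = node
        node = lo if x[j] <= t else hi
    return node

rng = np.random.default_rng(3)
X = rng.uniform(size=(200, 2))
y = (X[:, 0] > 0.5).astype(int)            # label depends on the first feature only
tree = grow(X, y, depth=2)
acc = np.mean([predict(tree, xi) == yi for xi, yi in zip(X, y)])
```

Since the labels are axis-aligned separable, the first split recovers the decision rule exactly and the tree classifies the training set perfectly.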

  5. Meta-modelling, visualization and emulation of multi-dimensional data for virtual production intelligence

    Science.gov (United States)

    Schulz, Wolfgang; Hermanns, Torsten; Al Khawli, Toufik

    2017-07-01

Decision making for competitive production in high-wage countries is a daily challenge where rational and irrational methods are used. The design of decision making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin's rule for decision making, formulated in London in 1772 and called by him "Prudential Algebra" in the sense of prudential reasons, one of the major ingredients of Meta-Modelling can be identified, finally leading to one algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes advances in Meta-Modelling techniques applied to multi-dimensional and multi-criterial optimization by identifying the persistence level of the corresponding Morse-Smale complex. Implementations for laser cutting and laser drilling are presented, including the generation of fast and frugal Meta-Models with controlled error based on mathematical model reduction; reduced models are derived to avoid any unnecessary complexity. Both model reduction and the analysis of the multi-dimensional parameter space are used to enable interactive communication between Discovery Finders and Invention Makers. Emulators and visualizations of a metamodel are introduced as components of Virtual Production Intelligence, making the methods of Scientific Design Thinking applicable and getting the developer as well as the operator more skilled.

  6. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML

    NARCIS (Netherlands)

    Koper, Rob

    2003-01-01

This text is a short summary of the work on pedagogical analysis carried out when EML (Educational Modelling Language) was being developed. Because we address pedagogical meta-models, the consequence is that I must justify the underlying pedagogical models they describe. I have included a (far from co

  7. MDA-Based 3G Service Creation Approach and Telecom Service Domain Meta-Model

    Institute of Scientific and Technical Information of China (English)

    QIAO Xiu-quan; LI Xiao-feng; LI Yan

    2006-01-01

This paper presents a model-driven 3G service creation approach based on model driven architecture technology. The focus of the paper is the methodology of designing telecommunication service-related meta-model and its profile implementation mechanism. This approach enhances the reusability of applications through separation of service logic models from concrete open application programming interface technologies and implementation technologies.

  8. Customized sequential designs for random simulation experiments: Kriging metamodeling and bootstrapping

    NARCIS (Netherlands)

    Beers, van W.C.M.; Kleijnen, J.P.C.

    2005-01-01

    This paper proposes a novel method to select an experimental design for interpolation in random simulation, especially discrete event simulation. (Though the paper focuses on Kriging, this design approach may also apply to other types of metamodels such as linear regression models.) Assuming that

  9. Evaluative Appraisals of Environmental Mystery and Surprise

    Science.gov (United States)

    Nasar, Jack L.; Cubukcu, Ebru

    2011-01-01

    This study used a desktop virtual environment (VE) of 15 large-scale residential streets to test the effects of environmental mystery and surprise on response. In theory, mystery and surprise should increase interest and visual appeal. For each VE, participants walked through an approach street and turned right onto a post-turn street. We designed…

  11. Analyst Information Precision and Small Earnings Surprises

    NARCIS (Netherlands)

    S. Bissessur; D. Veenman

    2014-01-01

    Prior research attributes zero and small positive earnings surprises to managers’ incentives for earnings management. In contrast, this study introduces and empirically tests an explanation for zero and small positive earnings surprises based on predictable variation in analyst forecast errors. We a

  12. Cognitive and Social Perspectives on Surprise

    Science.gov (United States)

    Adhami, Mundler

    2007-01-01

    Meanings of "surprise" are wide and include uplifting and engaging facets like wonder and amazement on the one hand as well as ones that may be of the opposite nature like interruption and disrupt on the other. Pedagogically, educators who use surprise in class activities are focusing on students being "taken aback" by a situation, hopefully…

  13. A new content framework and metamodel for Enterprise Architecture and IS Strategic Planning

    Directory of Open Access Journals (Sweden)

    Mouhsine Lakhdissi

    2012-03-01

Full Text Available IS Strategic Planning (ISSP) and Enterprise Architecture are two major disciplines in IT architecture and governance. They pursue the same objectives and have much in common. While ISSP has benefited from business strategic planning methods and techniques, it has not evolved much since the 90s and lacks a formal, tooled and standard methodology. On the other hand, Enterprise Architecture has progressed very quickly in recent years, helped by market needs and by research in the domain of Enterprise Modeling. The basic component underlying both fields is the content framework and metamodel necessary to describe the existing and future states. The aim of this paper is to present a new EA content framework and metamodel that takes ISSP concerns into consideration and bridges the gap between these two fields.

  14. Shape optimization of wire-wrapped fuel assembly using Kriging metamodeling technique

    Energy Technology Data Exchange (ETDEWEB)

    Raza, Wasim [Department of Mechanical Engineering, Inha University, 253 Yonghyun-Dong, Nam-Gu, Incheon 402-751 (Korea, Republic of); Kim, Kwang-Yong [Department of Mechanical Engineering, Inha University, 253 Yonghyun-Dong, Nam-Gu, Incheon 402-751 (Korea, Republic of)], E-mail: kykim@inha.ac.kr

    2008-06-15

In this work, shape optimization of a wire-wrapped fuel assembly in a liquid metal reactor has been carried out by combining a three-dimensional Reynolds-averaged Navier-Stokes analysis with the Kriging method, a well-known metamodeling technique for optimization. Sequential quadratic programming (SQP) is used to search for the optimal point on the constructed metamodel. Two geometric design variables are selected for the optimization and the design space is sampled using Latin Hypercube Sampling (LHS). The optimization problem has been defined as the maximization of an objective function which is a linear combination of heat-transfer and friction-loss related terms with a weighting factor. The objective function value is more sensitive to the ratio of the wire spacer diameter to the fuel rod diameter than to the ratio of the wire wrap pitch to the fuel rod diameter. The optimal values of the design variables are obtained by varying the weighting factor.
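
The sampling step above can be illustrated with a small Latin Hypercube Sampling sketch: one point lands in each of n equal-probability bins per dimension, and a weighted-sum objective is then evaluated on the samples. The objective here is a made-up stand-in for the paper's heat-transfer/friction combination, and all names are invented.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0, 1]^d with exactly one point in each of n bins per dimension."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # jitter inside stratified bins
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])                 # decouple the dimensions
    return u

rng = np.random.default_rng(1)
S = latin_hypercube(20, 2, rng)         # 20 samples of the two design variables

# Hypothetical weighted-sum objective to maximize:
# heat-transfer-like term minus friction-loss-like term.
w = 0.5
vals = w * np.sin(np.pi * S[:, 0]) - (1 - w) * S[:, 1] ** 2
best = S[np.argmax(vals)]
```

Varying `w` and re-maximizing traces out different compromise designs, which is the role the weighting factor plays in the paper.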

  15. Multiple-optima search method based on a metamodel and mathematical morphology

    Science.gov (United States)

    Li, Yulin; Liu, Li; Long, Teng; Chen, Xin

    2016-03-01

    This article investigates a non-population-based optimization method using mathematical morphology and the radial basis function (RBF) for multimodal computationally intensive functions. To obtain several feasible solutions, mathematical morphology is employed to search promising regions. Sequential quadratic programming is used to exploit the possible areas to determine the exact positions of the potential optima. To relieve the computational burden, metamodelling techniques are employed. The RBF metamodel in different iterations varies considerably so that the positions of potential optima are moving during optimization. To find the pair of correlative potential optima between the latest two iterations, a tolerance is presented. Furthermore, to ensure that all the output minima are the global or local optima, an optimality judgement criterion is introduced.
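
The morphology-based search for promising regions can be illustrated on a sampled grid: a cell is a candidate optimum if it lies strictly below all of its neighbours, which is what a grey-scale erosion comparison detects. This is a simplified sketch of that idea, with an invented bimodal test function rather than the article's RBF metamodel.

```python
import numpy as np

def local_minima(Z):
    """Boolean mask of interior grid cells strictly below all 8 neighbours
    (equivalent to comparing Z with its morphological erosion by a 3x3 window)."""
    core = Z[1:-1, 1:-1]
    shifts = [Z[1 + di:Z.shape[0] - 1 + di, 1 + dj:Z.shape[1] - 1 + dj]
              for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    mask = np.zeros(Z.shape, dtype=bool)
    mask[1:-1, 1:-1] = core < np.min(shifts, axis=0)
    return mask

# Bimodal test function with minima at (x, y) = (+/-1, 0).
x = np.linspace(-2.0, 2.0, 81)
X, Y = np.meshgrid(x, x, indexing="ij")
Z = (X**2 - 1)**2 + Y**2
mask = local_minima(Z)
minima = np.argwhere(mask)
```

In the article's setting the grid values would come from the cheap metamodel rather than the true function, and each detected cell would seed a local sequential-quadratic-programming search for the exact optimum.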

  16. Application of experimental design techniques to structural simulation meta-model building using neural network

    Institute of Scientific and Technical Information of China (English)

    费庆国; 张令弥

    2004-01-01

Neural networks are being used to construct meta-models in the numerical simulation of structures. In addition to network structures and training algorithms, training samples also greatly affect the accuracy of neural network models. In this paper, the main existing sampling techniques are evaluated, including techniques based on experimental design theory, random selection, and rotating sampling. First, the advantages and disadvantages of each technique are reviewed. Then, seven techniques are used to generate samples for training radial neural network models for two benchmarks: an antenna model and an aircraft model. Results show that uniform design, which considers both the number of samples and the mean square error of the network models, is the best sampling technique for neural-network-based meta-model building.

  17. Metamodels for New Designs of Outer-Rotor Brushless Synchronous Electric Motors

    Science.gov (United States)

    Dirba, J.; Lavrinovicha, L.

    2014-04-01

The authors consider the possibilities of synthesising metamodels for the analysis and optimisation of brushless synchronous motors. The metamodels are presented for new designs of the outer-rotor permanent magnet synchronous motor and the outer-rotor reluctance motor. The metamodels are synthesised based on the results obtained by numerical calculations of the magnetic field, taking magnetic saturation into account. Analysis of the results for the motor magnetic field and tests of the metamodels at the selected and intermediate points show that these can be synthesised with acceptable accuracy using numerical calculations instead of expensive real experiments. The paper discusses how metamodels can be obtained for use in the analysis and optimisation of brushless synchronous motors. Metamodels are obtained for a synchronous motor with permanent magnets and for a reluctance motor with an outer rotor. The synthesised metamodels are based on the results of numerical calculations of the electric motors' magnetic field, taking the saturation of the magnetic circuit into account. Verification of the metamodels at the calculation points and at intermediate points showed that the results of magnetic-field calculations can be used to obtain them instead of expensive real experiments.

  18. Accelerated optimizations of an electromagnetic acoustic transducer with artificial neural networks as metamodels

    Directory of Open Access Journals (Sweden)

    S. Wang

    2017-08-01

Full Text Available Electromagnetic acoustic transducers (EMATs) are noncontact transducers generating ultrasonic waves directly in the conductive sample. Despite the advantages, their transduction efficiencies are relatively low, so it is imperative to build accurate multiphysics models of EMATs and optimize the structural parameters accordingly, using a suitable optimization algorithm. The optimizing process often involves a large number of runs of the computationally expensive numerical models, so metamodels as substitutes for the real numerical models are helpful for the optimizations. In this work the focus is on artificial neural networks as the metamodels of an omnidirectional EMAT, including multilayer feedforward networks trained with the basic and improved back propagation algorithms and radial basis function networks with exact and nonexact interpolations. The developed neural-network programs are tested on an example problem. Then the model of an omnidirectional EMAT generating Lamb waves in a linearized steel plate is introduced, and various approaches to calculate the amplitudes of the displacement component waveforms are discussed. The neural-network metamodels are then built for the EMAT model and compared to the displacement component amplitude (or ratio of amplitudes) surface data on a discrete grid of the design variables as the reference, applying a multifrequency model with FFT (fast Fourier transform)/IFFT (inverse FFT) processing. Finally the two-objective optimization problem is formulated, with one objective function minimizing the ratio of the amplitude of the S0-mode Lamb wave to that of the A0 mode, and the other minimizing the negative of the amplitude of the A0 mode. Pareto fronts in the criterion space are solved with the neural-network models and the total time consumption is greatly decreased. From the study it could be observed that the radial basis function network with exact interpolation has the best
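
The two-objective selection step can be illustrated with a plain non-dominated filter over sampled criterion-space points. This is a generic sketch, not the authors' EMAT-specific code; the sample points are invented stand-ins for metamodel outputs.

```python
import numpy as np

def pareto_front(F):
    """Indices of the non-dominated rows of F, both objectives to be minimized."""
    keep = []
    for i, f in enumerate(F):
        # f is dominated if some point is <= f in every objective and < f in at least one.
        dominated = np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

# Sampled criterion-space points (objective 1, objective 2).
F = np.array([[1.0, 5.0], [2.0, 4.0], [3.0, 3.0], [2.5, 4.5],
              [4.0, 2.0], [5.0, 1.0], [3.0, 4.0]])
front = pareto_front(F)
```

Evaluating the cheap metamodels instead of the full numerical model when filling `F` is what produces the large time savings reported above.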

  19. Metamodel-based Editor for Service Oriented Architecture(MED4SOA)

    OpenAIRE

    Gjataj, Rudin

    2007-01-01

    In software development tool support is essential. Since the standardization of UML and Model Driven Architecture (MDA), new approaches in the design and implementation of software systems have flourished. These approaches are specific architectures like Service Oriented Architecture (SOA) or specialised MDA flavours, like Model Driven Software Development (MDSD). In this thesis we provide a Metamodel-based Editor for Service Oriented Architecture(MED4SOA). It is a graphical modeling e...

  20. Network Metamodeling: The Effect of Correlation Metric Choice on Phylogenomic and Transcriptomic Network Topology

    Energy Technology Data Exchange (ETDEWEB)

    Weighill, Deborah A [ORNL; Jacobson, Daniel A [ORNL

    2017-01-01

    We explore the use of a network meta-modeling approach to compare the effects of similarity metrics used to construct biological networks on the topology of the resulting networks. This work reviews various similarity metrics for the construction of networks and various topology measures for the characterization of resulting network topology, demonstrating the use of these metrics in the construction and comparison of phylogenomic and transcriptomic networks.
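
A small sketch of the meta-modeling comparison: build networks from the same data with two similarity metrics and compare the resulting topologies. The data and the threshold are invented for illustration, and Spearman correlation is computed simply as Pearson correlation of the ranks.

```python
import numpy as np

def corr_network(C, tau=0.8):
    """Threshold a correlation matrix into an unweighted network (no self-loops)."""
    A = np.abs(C) >= tau
    np.fill_diagonal(A, False)
    return A

def spearman(X):
    """Spearman correlation = Pearson correlation of the row-wise ranks (no ties here)."""
    ranks = np.argsort(np.argsort(X, axis=1), axis=1).astype(float)
    return np.corrcoef(ranks)

rng = np.random.default_rng(2)
t = rng.normal(size=100)
# Rows 0 and 1 are perfectly monotonically (but very nonlinearly) related; row 2 is noise.
X = np.vstack([t, np.exp(3 * t), rng.normal(size=100)])

pearson_net = corr_network(np.corrcoef(X))
spearman_net = corr_network(spearman(X))
```

The Spearman network always links rows 0 and 1 (their ranks are identical), while the strong nonlinearity keeps their Pearson correlation far below the threshold, so the two metrics yield different topologies from identical data, which is the effect the paper quantifies.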

  1. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    Full Text Available The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  2. Network Metamodeling: Effect of Correlation Metric Choice on Phylogenomic and Transcriptomic Network Topology.

    Science.gov (United States)

    Weighill, Deborah A; Jacobson, Daniel

    2017-01-10

    We explore the use of a network meta-modeling approach to compare the effects of similarity metrics used to construct biological networks on the topology of the resulting networks. This work reviews various similarity metrics for the construction of networks and various topology measures for the characterization of resulting network topology, demonstrating the use of these metrics in the construction and comparison of phylogenomic and transcriptomic networks.

  3. Deciphering network community structure by surprise

    National Research Council Canada - National Science Library

    Aldecoa, Rodrigo; Marín, Ignacio

    2011-01-01

    .... A fundamental, unsolved problem is how to characterize the community structure of a network. Here, using both standard and novel benchmarks, we show that maximization of a simple global parameter, which we call Surprise...

  4. A Surprising Culprit Behind Celiac Disease?

    Science.gov (United States)

    ... news/fullstory_164503.html A Surprising Culprit Behind Celiac Disease? Study suggests harmless viruses may set stage ... typically harmless type of virus might sometimes trigger celiac disease, a new study suggests. Celiac disease is ...

  5. An adaptive metamodel-based global optimization algorithm for black-box type problems

    Science.gov (United States)

    Jie, Haoxiang; Wu, Yizhong; Ding, Jianwan

    2015-11-01

    In this article, an adaptive metamodel-based global optimization (AMGO) algorithm is presented to solve unconstrained black-box problems. In the AMGO algorithm, a type of hybrid model composed of kriging and augmented radial basis function (RBF) is used as the surrogate model. The weight factors of hybrid model are adaptively selected in the optimization process. To balance the local and global search, a sub-optimization problem is constructed during each iteration to determine the new iterative points. As numerical experiments, six standard two-dimensional test functions are selected to show the distributions of iterative points. The AMGO algorithm is also tested on seven well-known benchmark optimization problems and contrasted with three representative metamodel-based optimization methods: efficient global optimization (EGO), Gutmann-RBF and hybrid and adaptive metamodel (HAM). The test results demonstrate the efficiency and robustness of the proposed method. The AMGO algorithm is finally applied to the structural design of the import and export chamber of a cycloid gear pump, achieving satisfactory results.
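    As a rough illustration of the metamodel-based loop (not the AMGO implementation itself), the sketch below fits a radial basis function interpolant to a handful of samples of a toy black-box function and repeatedly adds the surrogate minimizer as the next sample point. The Gaussian kernel width, the toy objective, and the grid search are assumptions, and the kriging half of AMGO's hybrid model is omitted.

```python
import math

def solve(A, b):
    """Tiny dense linear solver: Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, eps=4.0):
    """Gaussian RBF interpolant through the sampled points."""
    pts = list(xs)                      # snapshot, so later appends don't alias
    phi = lambda r: math.exp(-(eps * r) ** 2)
    A = [[phi(abs(a - b)) for b in pts] for a in pts]
    w = solve(A, list(ys))
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, pts))

f = lambda x: (x - 0.6) ** 2            # toy "expensive" black-box objective
xs = [0.0, 0.25, 0.5, 0.75, 1.0]        # initial design
grid = [i / 200 for i in range(201)]
for _ in range(3):                      # adaptive infill iterations
    model = rbf_fit(xs, [f(x) for x in xs])
    x_new = min(grid, key=model)        # surrogate minimizer = next sample
    if min(abs(x_new - x) for x in xs) < 1e-3:
        break                           # surrogate minimum already sampled
    xs.append(x_new)

best = min(xs, key=f)                   # best point found so far
```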

  6. Metamodel for Efficient Estimation of Capacity-Fade Uncertainty in Li-Ion Batteries for Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Jaewook Lee

    2015-06-01

    Full Text Available This paper presents an efficient method for estimating capacity-fade uncertainty in lithium-ion batteries (LIBs) in order to integrate them into the battery-management system (BMS) of electric vehicles, which requires simple and inexpensive computation for successful application. The study uses the pseudo-two-dimensional (P2D) electrochemical model, which simulates the battery state by solving a system of coupled nonlinear partial differential equations (PDEs). The model parameters that are responsible for electrode degradation are identified and estimated, based on battery data obtained from the charge cycles. The Bayesian approach, with parameters estimated by probability distributions, is employed to account for uncertainties arising in the model and battery data. The Markov Chain Monte Carlo (MCMC) technique is used to draw samples from the distributions. The complex computations that solve a PDE system for each sample are avoided by employing a polynomial-based metamodel. As a result, the computational cost is reduced from 5.5 h to a few seconds, enabling the integration of the method into the vehicle BMS. Using this approach, the conservative bound of capacity fade can be determined for the vehicle in service, which represents the safety margin reflecting the uncertainty.
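    The cost reduction reported above comes from swapping the expensive PDE solve for a cheap polynomial inside the sampler. A minimal sketch, assuming a toy exponential fade law in place of the P2D model and a plain Metropolis sampler in place of the paper's MCMC setup: the polynomial metamodel is built once from three "expensive" runs, then evaluated thousands of times inside the chain.

```python
import math
import random

def capacity_fade(k, cycles=500):
    """Stand-in for the expensive P2D simulation: remaining capacity fraction."""
    return math.exp(-k * cycles)

# Polynomial metamodel: quadratic (Lagrange) interpolation of three expensive runs
ks = [1e-4, 5e-4, 1e-3]
cs = [capacity_fade(k) for k in ks]

def meta(k):
    s = 0.0
    for i, ki in enumerate(ks):
        L = 1.0
        for j, kj in enumerate(ks):
            if i != j:
                L *= (k - kj) / (ki - kj)
        s += cs[i] * L
    return s

# Metropolis sampling of the degradation parameter k, given one noisy
# capacity observation; the likelihood calls only the cheap metamodel.
obs, sigma = 0.78, 0.02

def log_post(k):
    if not 1e-4 <= k <= 1e-3:           # uniform prior support
        return -math.inf
    return -0.5 * ((meta(k) - obs) / sigma) ** 2

rng = random.Random(4)
k, samples = 5e-4, []
for _ in range(20000):
    prop = k + rng.gauss(0, 5e-5)
    if rng.random() < math.exp(min(0.0, log_post(prop) - log_post(k))):
        k = prop
    samples.append(k)
k_mean = sum(samples) / len(samples)    # posterior mean of the fade parameter
```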

  7. Surprises in numerical expressions of physical constants

    CERN Document Server

    Amir, Ariel; Tokieda, Tadashi

    2016-01-01

    In science, as in life, `surprises' can be adequately appreciated only in the presence of a null model, what we expect a priori. In physics, theories sometimes express the values of dimensionless physical constants as combinations of mathematical constants like pi or e. The inverse problem also arises, whereby the measured value of a physical constant admits a `surprisingly' simple approximation in terms of well-known mathematical constants. Can we estimate the probability for this to be a mere coincidence, rather than an inkling of some theory? We answer the question in the most naive form.
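    The paper's question ("how likely is a coincidental match?") can be illustrated with a naive null model: draw a "measured constant" uniformly and check whether it falls within a small relative tolerance of any simple combination of pi and e. The candidate family, range, and tolerance below are arbitrary assumptions, chosen only to show the mechanics.

```python
import math
import random

# Simple "mathematical constant" candidates: r * pi^a * e^b with small integers
candidates = sorted({round(r * math.pi ** a * math.e ** b, 12)
                     for r in range(1, 10)
                     for a in range(-2, 3)
                     for b in range(-2, 3)})

def looks_surprising(x, rel_tol=1e-3):
    """Is x within a 0.1% relative tolerance of some candidate expression?"""
    return any(abs(x - c) / c <= rel_tol for c in candidates)

random.seed(1)
trials = 20000
hits = sum(looks_surprising(random.uniform(1, 10)) for _ in range(trials))
p_coincidence = hits / trials   # chance a random value matches by accident
```

    A non-negligible `p_coincidence` is the null model's warning: "surprisingly simple" approximations arise by chance more often than intuition suggests.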

  8. Surprising Connections between Partitions and Divisors

    Science.gov (United States)

    Osler, Thomas J.; Hassen, Abdulkadir; Chandrupatla, Tirupathi R.

    2007-01-01

    The sum of the divisors of a positive integer is one of the most interesting concepts in multiplicative number theory, while the number of ways of expressing a number as a sum is a primary topic in additive number theory. In this article, we describe some of the surprising connections between and similarities of these two concepts.

  9. Surprises from extragalactic propagation of UHECRs

    CERN Document Server

    Boncioli, Denise; Grillo, Aurelio

    2015-01-01

    Ultra-high energy cosmic ray experimental data are now of very good statistical significance even in the region of the expected GZK feature. The identification of their sources requires sophisticated analysis of their propagation in the extragalactic space. When looking at the details of this propagation, some unforeseen features emerge. We will discuss some of these "surprises".

  10. Metamodel-assisted evolutionary algorithms for the unit commitment problem with probabilistic outages

    Energy Technology Data Exchange (ETDEWEB)

    Georgopoulou, Chariklia A.; Giannakoglou, Kyriakos C. [National Technical University of Athens, School of Mechanical Engineering, Lab. of Thermal Turbomachines, Parallel CFD and Optimization Unit, P.O. Box 64069, Athens 157 10 (Greece)

    2010-05-15

    An efficient method for solving power generating unit commitment (UC) problems with probabilistic unit outages is proposed. It is based on a two-level evolutionary algorithm (EA) minimizing the expected total operating cost (TOC) of a system of power generating units over a scheduling period, with known failure and repair rates of each unit. To compute the cost function value of each EA population member, namely a candidate UC schedule, a Monte Carlo simulation must be carried out. Some thousands of replicates are generated according to the units' outage and repair rates and the corresponding probabilities. Each replicate is represented by a series of randomly generated availability and unavailability periods for each unit, and the UC schedule under consideration is adjusted accordingly. The expected TOC is the average of the TOCs of all Monte Carlo replicates. Therefore, the CPU cost per Monte Carlo evaluation increases noticeably and so does the CPU cost of running the EA. To reduce it, the use of a metamodel-assisted EA (MAEA) with on-line trained surrogate evaluation models or metamodels (namely, radial-basis function networks) is proposed. A novelty of this method is that the metamodels are trained on a few "representative" unit outage scenarios selected among the Monte Carlo replicates generated once during the optimization and, then, used to predict the expected TOC. Based on this low cost, approximate pre-evaluation, only a few top individuals within each generation undergo Monte Carlo simulations. The proposed MAEA is demonstrated on test problems and shown to drastically reduce the CPU cost, compared to EAs which are exclusively based on Monte Carlo simulations. (author)
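    The Monte Carlo replicates described above can be sketched as alternating exponential up/down periods per unit, with the expected TOC taken as an average over replicates. The cost model below (a running cost while available, a penalty while down) is a deliberately crude stand-in for the paper's unit commitment schedule, and all rates and prices are invented.

```python
import random

def replicate(horizon, fail_rate, repair_rate, rng):
    """One outage replicate: fraction of the horizon the unit is available,
    from alternating exponentially distributed up/down periods."""
    t, up, avail = 0.0, True, 0.0
    while t < horizon:
        dur = rng.expovariate(fail_rate if up else repair_rate)
        dur = min(dur, horizon - t)
        if up:
            avail += dur
        t += dur
        up = not up
    return avail / horizon

def expected_cost(run_cost_per_hour, horizon, fail_rate, repair_rate,
                  penalty_per_hour, n_rep=2000, seed=0):
    """Expected TOC as the average over Monte Carlo outage replicates."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_rep):
        a = replicate(horizon, fail_rate, repair_rate, rng)
        # running cost while available, penalty (e.g. replacement power) while down
        total += horizon * (a * run_cost_per_hour + (1 - a) * penalty_per_hour)
    return total / n_rep

# One week, MTTF 100 h, MTTR 10 h, invented cost figures
toc = expected_cost(50.0, 168.0, fail_rate=1 / 100, repair_rate=1 / 10,
                    penalty_per_hour=200.0)
```

    It is exactly this inner averaging loop, repeated for every candidate schedule, that the paper's RBF-network metamodels are trained to short-circuit.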

  11. Metamodeling-driven IP reuse for SoC integration and microprocessor design

    CERN Document Server

    Mathaikutty, Deepak A

    2009-01-01

    This cutting-edge resource offers you an in-depth understanding of metamodeling approaches for the reuse of intellectual properties (IPs) in the form of reusable design or verification components. The book covers the essential issues associated with fast and effective integration of reusable design components into a system-on-a-chip (SoC) to achieve faster design turn-around time. Moreover, it addresses key factors related to the use of reusable verification IPs for a "write once, use many times" verification strategy - another effective approach that can attain a faster product design cycle.

  12. Moving Object Trajectories Meta-Model And Spatio-Temporal Queries

    CERN Document Server

    Boulmakoul, Azedine; Lbath, Ahmed

    2012-01-01

    In this paper, a general moving object trajectories framework is put forward to allow independent applications processing trajectory data to benefit from a high level of interoperability, information sharing, as well as efficient answers to a wide range of complex trajectory queries. Our proposed meta-model is based on an ontology and event approach, incorporates existing representations of trajectory, and integrates new patterns like the space-time path to describe activities in geographical space-time. We introduce recursive Region of Interest concepts and deal with mobile object trajectories having diverse spatio-temporal sampling protocols and different available sensors, for which traditional data models alone are inadequate.

  13. Radar Design to Protect Against Surprise

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Technological and doctrinal surprise is about rendering preparations for conflict irrelevant or ineffective. For a sensor, this means essentially rendering the sensor irrelevant or ineffective in its ability to help determine truth. Recovery from this sort of surprise is facilitated by flexibility in our own technology and doctrine. For a sensor, this means flexibility in its architecture, design, tactics, and the designing organizations' processes. This report is the result of an unfunded research and development activity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  14. Surprise Leads to Noisier Perceptual Decisions

    Directory of Open Access Journals (Sweden)

    Marta I Garrido

    2011-02-01

    Full Text Available Surprising events in the environment can impair task performance. This might be due to complete distraction, leading to lapses during which performance is reduced to guessing. Alternatively, unpredictability might cause a graded withdrawal of perceptual resources from the task at hand and thereby reduce sensitivity. Here we attempt to distinguish between these two mechanisms. Listeners performed a novel auditory pitch—duration discrimination, where stimulus loudness changed occasionally and incidentally to the task. Responses were slower and less accurate in the surprising condition, where loudness changed unpredictably, than in the predictable condition, where the loudness was held constant. By explicitly modelling both lapses and changes in sensitivity, we found that unpredictable changes diminished sensitivity but did not increase the rate of lapses. These findings suggest that background environmental uncertainty can disrupt goal-directed behaviour. This graded processing strategy might be adaptive in potentially threatening contexts, and reflect a flexible system for automatic allocation of perceptual resources.

  15. Radar Design to Protect Against Surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W.

    2015-02-01

    Technological and doctrinal surprise is about rendering preparations for conflict irrelevant or ineffective. For a sensor, this means essentially rendering the sensor irrelevant or ineffective in its ability to help determine truth. Recovery from this sort of surprise is facilitated by flexibility in our own technology and doctrine. For a sensor, this means flexibility in its architecture, design, tactics, and the designing organizations' processes. This report is the result of an unfunded research and development activity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  16. Surprise-Based Learning for Autonomous Systems

    Science.gov (United States)

    2009-02-28

    for scientific theories containing recursive theoretical terms". British Journal of Philosophy of Science, 44, 641-652, 1993. Piaget, J., "The Origins...paradigm stems from Piaget's theory of Developmental Psychology [5], Herbert Simon's theory on dual-space search for knowledge and problem solving [6]...", Twenty-First Conference on Uncertainty in Artificial Intelligence, Edinburgh, Scotland, July 2005. [34] Itti, L., Baldi, P., "A Surprising Theory of

  17. Performance improvement of a moment method for reliability analysis using kriging metamodels

    Energy Technology Data Exchange (ETDEWEB)

    Ju, Byeong Hyeon; Cho, Tae Min; Lee, Byung Chai [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Do Hyun [Korea Automotive Technology Institute, Chonan (Korea, Republic of)

    2006-08-15

    Many methods for reliability analysis have been studied and one of them, a moment method, has the advantage that it doesn't require sensitivities of performance functions. The moment method for reliability analysis requires the first four moments of a performance function, and then the Pearson system is used to compute the probability of failure, where the accuracy of the probability of failure greatly depends on that of the first four moments. However, it is generally impossible to assess them analytically for multidimensional functions, and numerical integration is mainly used to estimate the moments. Numerical integration requires many function evaluations, and in cases involving finite element analyses, the calculation of the first four moments is very time-consuming. To solve the problem, this research proposes a new method of approximating the first four moments based on a kriging metamodel. The proposed method substitutes the kriging metamodel for the performance function and can also evaluate the accuracy of the calculated moments by adjusting the approximation range. Numerical examples show the proposed method can approximate the moments accurately with fewer function evaluations and can evaluate the accuracy of the calculated moments.
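    A minimal sketch of the moment-method idea, assuming a toy cubic performance function and an interpolating polynomial in place of the kriging metamodel: the surrogate is built from four "expensive" evaluations, and the first four moments (mean, standard deviation, skewness, kurtosis) are then estimated by cheap Monte Carlo sampling of the surrogate alone.

```python
import math
import random
import statistics

def g(x):
    """Toy 'expensive' performance function."""
    return x ** 3 + 2 * x + 1

# Metamodel stand-in: Lagrange interpolant through four design points
nodes = [-3, -1, 1, 3]
vals = [g(x) for x in nodes]

def surrogate(x):
    s = 0.0
    for i, xi in enumerate(nodes):
        L = 1.0
        for j, xj in enumerate(nodes):
            if i != j:
                L *= (x - xj) / (xi - xj)
        s += vals[i] * L
    return s

# First four moments of g(X), X ~ N(0,1), estimated on the cheap surrogate
random.seed(2)
samples = [surrogate(random.gauss(0, 1)) for _ in range(50000)]
mean = statistics.fmean(samples)
std = statistics.pstdev(samples)
skew = statistics.fmean(((s - mean) / std) ** 3 for s in samples)
kurt = statistics.fmean(((s - mean) / std) ** 4 for s in samples)
```

    These four moments are what the Pearson system would then consume to approximate the tail probability of failure; that fitting step is beyond this sketch.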

  18. Evolutionary Tuning Method for PID Controller Parameters of a Cruise Control System Using Metamodeling

    Directory of Open Access Journals (Sweden)

    M. N. Ab Malek

    2009-01-01

    Full Text Available For a long time, the optimization of controller parameters has used well-known classical methods such as the Ziegler-Nichols and Cohen-Coon tuning techniques. Despite their effectiveness, these off-line tuning techniques can be time-consuming, especially for complex nonlinear systems. This paper shows how Metamodeling techniques can be utilized to tune the PID controller parameters quickly. Note that the plant used in this study is the cruise control system with 2 different models, a linear model and a nonlinear model. The difference between the models is that disturbances were taken into consideration in the nonlinear model, whereas in the linear model the disturbances were assumed to be zero. The Radial Basis Function Neural Network Metamodel is able to minimize the time spent in the tuning process, as it gives a good approximation to the optimum controller parameters in both models of this system.

  19. Pupil size tracks perceptual content and surprise.

    Science.gov (United States)

    Kloosterman, Niels A; Meindertsma, Thomas; van Loon, Anouk M; Lamme, Victor A F; Bonneh, Yoram S; Donner, Tobias H

    2015-04-01

    Changes in pupil size at constant light levels reflect the activity of neuromodulatory brainstem centers that control global brain state. These endogenously driven pupil dynamics can be synchronized with cognitive acts. For example, the pupil dilates during the spontaneous switches of perception of a constant sensory input in bistable perceptual illusions. It is unknown whether this pupil dilation only indicates the occurrence of perceptual switches, or also their content. Here, we measured pupil diameter in human subjects reporting the subjective disappearance and re-appearance of a physically constant visual target surrounded by a moving pattern ('motion-induced blindness' illusion). We show that the pupil dilates during the perceptual switches in the illusion and a stimulus-evoked 'replay' of that illusion. Critically, the switch-related pupil dilation encodes perceptual content, with larger amplitude for disappearance than re-appearance. This difference in pupil response amplitude enables prediction of the type of report (disappearance vs. re-appearance) on individual switches (receiver-operating characteristic: 61%). The amplitude difference is independent of the relative durations of target-visible and target-invisible intervals and subjects' overt behavioral report of the perceptual switches. Further, we show that pupil dilation during the replay also scales with the level of surprise about the timing of switches, but there is no evidence for an interaction between the effects of surprise and perceptual content on the pupil response. Taken together, our results suggest that pupil-linked brain systems track both the content of, and surprise about, perceptual events.

  20. Some surprising facts about (the problem of) surprising facts (from the Dusseldorf Conference, February 2011).

    Science.gov (United States)

    Mayo, D

    2014-03-01

    A common intuition about evidence is that if data x have been used to construct a hypothesis H, then x should not be used again in support of H. It is no surprise that x fits H, if H was deliberately constructed to accord with x. The question of when and why we should avoid such "double-counting" continues to be debated in philosophy and statistics. It arises as a prohibition against data mining, hunting for significance, tuning on the signal, and ad hoc hypotheses, and as a preference for predesignated hypotheses and "surprising" predictions. I have argued that it is the severity or probativeness of the test--or lack of it--that should determine whether a double-use of data is admissible. I examine a number of surprising ambiguities and unexpected facts that continue to bedevil this debate.

  1. Reliability-based design optimization of reinforced concrete structures including soil-structure interaction using a discrete gravitational search algorithm and a proposed metamodel

    Science.gov (United States)

    Khatibinia, M.; Salajegheh, E.; Salajegheh, J.; Fadaee, M. J.

    2013-10-01

    A new discrete gravitational search algorithm (DGSA) and a metamodelling framework are introduced for reliability-based design optimization (RBDO) of reinforced concrete structures. The RBDO of structures with soil-structure interaction (SSI) effects is investigated in accordance with performance-based design. The proposed DGSA is based on the standard gravitational search algorithm (GSA) to optimize the structural cost under deterministic and probabilistic constraints. The Monte-Carlo simulation (MCS) method is considered as the most reliable method for estimating the probabilities of reliability. In order to reduce the computational time of MCS, the proposed metamodelling framework is employed to predict the responses of the SSI system in the RBDO procedure. The metamodel consists of a weighted least squares support vector machine (WLS-SVM) and a wavelet kernel function, which is called WWLS-SVM. Numerical results demonstrate the efficiency and computational advantages of DGSA and the proposed metamodel for RBDO of reinforced concrete structures.

  2. Stroke Recovery: Surprising Influences and Residual Consequences

    Directory of Open Access Journals (Sweden)

    Argye E. Hillis

    2014-01-01

    Full Text Available There is startling individual variability in the degree to which people recover from stroke and the duration of time over which recovery of some symptoms occurs. There are a variety of mechanisms of recovery from stroke which take place at distinct time points after stroke and are influenced by different variables. We review recent studies from our laboratory that unveil some surprising findings, such as the role of education in chronic recovery. We also report data showing that the consequences that most plague survivors of stroke and their caregivers are loss of high level cortical functions, such as empathy or written language. These results have implications for rehabilitation and management of stroke.

  3. Surprises and mysteries in urban soils

    Science.gov (United States)

    Groffman, P. M.

    2015-12-01

    In the Baltimore Ecosystem Study, one of two urban long-term ecological research (LTER) projects funded by the U.S. National Science Foundation, we are using "the watershed approach" to integrate ecological, physical and social sciences. Urban and suburban watershed input/output budgets for nitrogen have shown surprisingly high retention, which has led to detailed analysis of sources and sinks in the soils of these watersheds. Home lawns, thought to be major sources of reactive nitrogen in suburban watersheds, have more complex coupled carbon and nitrogen dynamics than previously thought, and are likely the site of much nitrogen retention. Riparian zones, thought to be an important sink for reactive nitrogen in many watersheds, have turned out to be nitrogen sources in urban watersheds due to hydrologic changes that disconnect streams from their surrounding landscape. Urban effects on atmospheric carbon dioxide levels and nitrogen deposition have strong effects on soil nitrogen cycling processes and soil:atmosphere fluxes of nitrous oxide, carbon dioxide and methane. Efforts to manage urban soils and watersheds through geomorphic stream restoration, creation of stormwater management features and changes in lawn and forest management can have significant effects on watershed carbon and nitrogen dynamics. Urban soils present a basic and applied science frontier that challenges our understanding of biological, physical, chemical and social science processes. The watershed approach provides an effective platform for integrating these disciplines and for articulating critical questions that arise from surprising results. This approach can help us to meet the challenge of urban soils, which is critical to achieving sustainability goals in cities across the world.

  4. Using Evolution Strategy with Meta-models for Well Placement Optimization

    CERN Document Server

    Bouzarkouna, Zyed; Auger, Anne

    2010-01-01

    Optimum implementation of non-conventional wells allows us to increase considerably hydrocarbon recovery. By considering the high drilling cost and the potential improvement in well productivity, well placement decision is an important issue in field development. Considering complex reservoir geology and high reservoir heterogeneities, stochastic optimization methods are the most suitable approaches for optimum well placement. This paper proposes an optimization methodology to determine optimal well location and trajectory based upon the Covariance Matrix Adaptation - Evolution Strategy (CMA-ES) which is a variant of Evolution Strategies recognized as one of the most powerful derivative-free optimizers for continuous optimization. To improve the optimization procedure, two new techniques are investigated: (1). Adaptive penalization with rejection is developed to handle well placement constraints. (2). A meta-model, based on locally weighted regression, is incorporated into CMA-ES using an approximate ranking ...

  5. Difference mapping method using least square support vector regression for variable-fidelity metamodelling

    Science.gov (United States)

    Zheng, Jun; Shao, Xinyu; Gao, Liang; Jiang, Ping; Qiu, Haobo

    2015-06-01

    Engineering design, especially for complex engineering systems, is usually a time-consuming process involving computation-intensive computer-based simulation and analysis methods. A difference mapping method using least square support vector regression is developed in this work, as a special metamodelling methodology that includes variable-fidelity data, to replace the computationally expensive computer codes. A general difference mapping framework is proposed where a surrogate base is first created, then the approximation is gained by mapping the difference between the base and the real high-fidelity response surface. The least square support vector regression is adopted to accomplish the mapping. Two different sampling strategies, nested and non-nested design of experiments, are conducted to explore their respective effects on modelling accuracy. Different sample sizes and three approximation performance measures of accuracy are considered.
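    The difference-mapping framework can be sketched as: keep a cheap low-fidelity base, spend the few affordable high-fidelity runs on the high-low *difference*, and add the mapped difference back. Piecewise-linear interpolation below stands in for the paper's least square support vector regression, and the two toy fidelity models are assumptions.

```python
import math

def f_hi(x):
    """'Expensive' high-fidelity response (toy)."""
    return math.sin(8 * x) + x

def f_lo(x):
    """Cheap low-fidelity base capturing only the coarse trend."""
    return x

# A few high-fidelity samples are spent on learning the hi-lo difference
xs = [i / 10 for i in range(11)]
diff = [f_hi(x) - f_lo(x) for x in xs]

def surrogate(x):
    """Variable-fidelity surrogate: base plus interpolated difference."""
    i = min(int(x * 10), 9)
    t = x * 10 - i
    return f_lo(x) + (1 - t) * diff[i] + t * diff[i + 1]

# Accuracy check on a fine grid over [0, 1]
max_err = max(abs(surrogate(x) - f_hi(x)) for x in [i / 200 for i in range(201)])
```

    The point of the construction is that the difference is smoother than the high-fidelity response itself, so a small sample budget suffices to map it.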

  6. Reliability-based design optimization using a moment method and a kriging metamodel

    Science.gov (United States)

    Ju, Byeong Hyeon; Chai Lee, Byung

    2008-05-01

    Reliability-based design optimization (RBDO) has been used for optimizing engineering systems with uncertainties in design variables and system parameters. RBDO involves reliability analysis, which requires a large amount of computational effort, so it is important to select an efficient method for reliability analysis. Of the many methods for reliability analysis, a moment method, which is called the fourth moment method, is known to be less expensive for moderate size problems and requires neither iteration nor the computation of derivatives. Despite these advantages, previous research on RBDO has been mainly based on the first-order reliability method and relatively little attention has been paid to moment-based RBDO. This article considers difficulties in implementing the moment method into RBDO; they are solved using a kriging metamodel with an active constraint strategy. Three numerical examples are tested and the results show that the proposed method is efficient and accurate.

  7. A Meta-Model of Inter-Organisational Cooperation for the Transition to a Circular Economy

    Directory of Open Access Journals (Sweden)

    Alessandro Ruggieri

    2016-11-01

    Full Text Available The transition to a circular economy bodes well for a future of environmentally sustainable growth and economic development. The implications and advantages of a shift to a circular economy have been extensively demonstrated by the literature on the subject. What has not been sufficiently investigated is how this paradigm can be enabled through the inter-organisational cooperation among different business enterprises. In order to illustrate this point, in this paper we aim to contribute to the circular economy debate by describing and discussing such a meta-model of inter-organisational cooperation. The present study is therefore based on the analysis of three cases from an equal number of industries, from which we identified factors of potential impact for the stimulation of cooperation in a circular economy perspective. Last, but not least, we discuss the relations between the case studies and try to formulate all possible implications for both managers and research.

  8. Meta-model Based Model Organization and Transformation of Design Pattern Units in MDA

    Institute of Scientific and Technical Information of China (English)

    Chang-chun YANG; Zi-yi ZHAO; Jing Sun

    2010-01-01

    To achieve the purpose of applying design patterns, which are various in kind and constantly changing, in MDA from idea to application, one way is used to solve the problem of pattern disappearance which occurs in the process of pattern instantiation, to guarantee the independence of patterns, and at the same time, to apply this process to multiple design patterns. To solve these two problems, the modeling method of design pattern units based on meta-models is adopted, i.e., to divide the basic operations into atoms in the metamodel tier and then combine the atoms to complete design pattern unit meta-models without business logic. After one process of conversion, the purpose of making up various pattern unit meta-models and dividing business logic from pattern logic is achieved.

  9. The conceptualization model problem—surprise

    Science.gov (United States)

    Bredehoeft, John

    2005-03-01

    The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.

  10. Dynamic sensitivity analysis of long running landslide models through basis set expansion and meta-modelling

    Science.gov (United States)

    Rohmer, Jeremy

    2016-04-01

    Predicting the temporal evolution of landslides is typically supported by numerical modelling. Dynamic sensitivity analysis aims at assessing the influence of the landslide properties on the time-dependent predictions (e.g., time series of landslide displacements). Yet two major difficulties arise: 1. Global sensitivity analysis requires running the landslide model a large number of times (> 1000), which may become impracticable when the landslide model has a high computation time cost (> several hours); 2. Landslide model outputs are not scalar, but functions of time, i.e. they are n-dimensional vectors with n usually ranging from 100 to 1000. In this article, I explore the use of a basis set expansion, such as principal component analysis, to reduce the output dimensionality to a few components, each of them being interpreted as a dominant mode of variation in the overall structure of the temporal evolution. The computationally intensive calculation of the Sobol' indices for each of these components is then achieved through meta-modelling, i.e. by replacing the landslide model with a "costless-to-evaluate" approximation (e.g., a projection pursuit regression model). The methodology combining "basis set expansion - meta-model - Sobol' indices" is then applied to the La Frasse landslide to investigate the dynamic sensitivity of the surface horizontal displacements to the slip surface properties during pore pressure changes. I show how to extract information on the sensitivity of each main mode of temporal behaviour using a limited number (a few tens) of long-running simulations. In particular, I identify the parameters which trigger the occurrence of a turning point marking a shift between a regime of low values of landslide displacements and one of high values.
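
    The basis-set-expansion step described above can be sketched in a few lines; the toy displacement model, its parameter names, and the 99% variance threshold are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a long-running landslide simulator: a displacement
# time series driven by three parameters (all names are illustrative).
def model(theta, t):
    friction, stiffness, pore = theta
    return pore * np.tanh(t / (1.0 + 5.0 * friction)) + 0.1 * stiffness * t

t = np.linspace(0.0, 1.0, 300)            # n = 300 output dimensions
X = rng.uniform(0.0, 1.0, size=(200, 3))  # 200 model runs
Y = np.array([model(x, t) for x in X])    # (200, 300) matrix of time series

# Basis set expansion via PCA: centre the outputs, then take the SVD.
mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean, full_matrices=False)
explained = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(explained), 0.99)) + 1  # modes for 99% variance

# Each column of `scores` is now a scalar quantity of interest on which a
# meta-model can be fitted and Sobol' indices computed.
scores = U[:, :k] * s[:k]
print(k, scores.shape)
```

    The rows of `Vt[:k]` are the dominant modes of temporal variation; `mean + scores @ Vt[:k]` reconstructs the time series up to the discarded variance.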

  11. Metamodeling abduction

    Directory of Open Access Journals (Sweden)

    Ángel Nepomuceno

    2009-12-01

    A general trend is to consider abduction as a backward deduction with some additional conditions, but there can be more than one kind of deduction. By adopting Makinson's method of defining deductive consequence relations, abduction is settled as a reverse relation corresponding to each such deductive relation.

  12. Surprising characteristics of visual systems of invertebrates.

    Science.gov (United States)

    González-Martín-Moro, J; Hernández-Verdejo, J L; Jiménez-Gahete, A E

    2017-01-01

    To communicate relevant and striking aspects of the visual systems of some familiar invertebrates. Review of the related literature. The capacity of snails to regenerate a complete eye, the benefit of the oval shape of the compound eye of many flying insects as a way of stabilising the image during flight, the potential advantages related to the extreme refractive error that characterises the ocelli of many insects, as well as the ability to detect polarised light as a navigation system, are some of the surprising capabilities present in the small invertebrate eyes described in this work. Invertebrate eyes have capabilities and sensory modalities that are not present in the human eye. The study of the eyes of these animals can help us improve our understanding of our own visual system, and inspire the development of optical devices. Copyright © 2016 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  13. Surprises from Saturn: Implications for Other Environments

    Science.gov (United States)

    Coates, A. J.

    2014-05-01

    The exploration of Saturn by Cassini has provided many surprises regarding: Saturn's rapidly rotating magnetosphere, interactions with its diverse moons, and interactions with the solar wind. Enceladus, orbiting at 4 Saturn radii (RS), was found to have plumes of water vapour and ice which are the dominant source for the inner magnetosphere. Charged water clusters, charged dust and photoelectrons provide key populations in the 'dusty plasma' observed. Direct pickup is seen near Enceladus and field-aligned currents create a spot in Saturn's aurora. At Titan, orbiting at 20 RS, unexpected heavy negative and positive ions are seen in the ionosphere, which provide the source for Titan's haze. Ionospheric plasma is seen in Titan's tail, enabling ion escape to be estimated at 7 tonnes per day. Saturn's ring ionosphere was seen early in the mission and a return will be made in 2017. In addition, highly accelerated electrons are seen at Saturn's high Mach number (M_A ≈ 100) quasi-parallel bow shock. Here we review some of these key new results, and discuss the implications for other solar system objects.

  14. idSpace D2.2 – Semantic meta-model, integration and transformations v1

    DEFF Research Database (Denmark)

    Dolog, Peter; Lin, Yujian; Dols, Roger;

    2009-01-01

    This report introduces a topic-maps-based meta-model for creativity techniques, the creativity process, and idea maps as results of the creativity process. It proposes a graph-based and hierarchical-graph-based transformation of idea maps for combination and integration of results of different creativity...... sessions. It further suggests a service composition model as an integration model based on service-oriented architecture which integrates various creativity-process-supporting tools as services....

  15. Optimization of Process Parameters of Hybrid Laser-Arc Welding onto 316L Using Ensemble of Metamodels

    Science.gov (United States)

    Zhou, Qi; Jiang, Ping; Shao, Xinyu; Gao, Zhongmei; Cao, Longchao; Yue, Chen; Li, Xiongbin

    2016-08-01

    Hybrid laser-arc welding (LAW) provides an effective way to overcome problems commonly encountered during either laser or arc welding such as brittle phase formation, cracking, and porosity. The process parameters of LAW have significant effects on the bead profile and hence the quality of the joint. This paper proposes an optimization methodology combining the non-dominated sorting genetic algorithm (NSGA-II) and an ensemble of metamodels (EMs) to address multi-objective process parameter optimization in LAW onto 316L. Firstly, Taguchi experimental design is adopted to generate the experimental samples. Secondly, the relationships between process parameters (i.e., laser power (P), welding current (A), distance between laser and arc (D), and welding speed (V)) and the bead geometries are fitted using EMs. The comparative results show that the EMs can take advantage of the prediction ability of each stand-alone metamodel and thus decrease the risk of adopting inappropriate metamodels. Then, the NSGA-II is used to facilitate design space exploration. Besides, the main effects and contribution rates of process parameters on bead profile are analyzed. Eventually, verification experiments of the obtained optima are carried out and compared with the un-optimized weld seam for bead geometries, weld appearances, and welding defects. Results illustrate that the proposed hybrid approach exhibits great capability of improving welding quality in LAW.
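
    The ensemble-of-metamodels idea can be sketched with inverse-error weighting; the response function, design points, and weighting scheme below are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy response standing in for a bead-geometry measurement (illustrative only).
def weld_response(x):            # x in [0, 1]
    return np.sin(3 * x) + 0.5 * x**2

X = rng.uniform(0, 1, 20)
y = weld_response(X) + rng.normal(0, 0.02, X.size)

# Three stand-alone metamodels: polynomial fits of increasing degree.
degrees = [1, 2, 4]
models = [np.polyfit(X, y, d) for d in degrees]

# Leave-one-out error of each metamodel -> inverse-error ensemble weights.
def loo_rmse(d):
    errs = []
    for i in range(X.size):
        m = np.polyfit(np.delete(X, i), np.delete(y, i), d)
        errs.append((np.polyval(m, X[i]) - y[i])**2)
    return np.sqrt(np.mean(errs))

rmse = np.array([loo_rmse(d) for d in degrees])
w = (1 / rmse) / np.sum(1 / rmse)

# Ensemble prediction: weighted sum of the stand-alone predictions.
def ensemble(x):
    return sum(wi * np.polyval(m, x) for wi, m in zip(w, models))

print(ensemble(0.5), weld_response(0.5))
```

    Stand-alone metamodels with small leave-one-out error dominate the weighted prediction, so a single poorly chosen surrogate cannot degrade the ensemble by much.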

  16. A Shocking Surprise in Stephan's Quintet

    Science.gov (United States)

    2006-01-01

    This false-color composite image of the Stephan's Quintet galaxy cluster clearly shows one of the largest shock waves ever seen (green arc). The wave was produced by one galaxy falling toward another at speeds of more than one million miles per hour. The image is made up of data from NASA's Spitzer Space Telescope and a ground-based telescope in Spain. Four of the five galaxies in this picture are involved in a violent collision, which has already stripped most of the hydrogen gas from the interiors of the galaxies. The centers of the galaxies appear as bright yellow-pink knots inside a blue haze of stars, and the galaxy producing all the turmoil, NGC7318b, is the left of two small bright regions in the middle right of the image. One galaxy, the large spiral at the bottom left of the image, is a foreground object and is not associated with the cluster. The titanic shock wave, larger than our own Milky Way galaxy, was detected by the ground-based telescope using visible-light wavelengths. It consists of hot hydrogen gas. As NGC7318b collides with gas spread throughout the cluster, atoms of hydrogen are heated in the shock wave, producing the green glow. Spitzer pointed its infrared spectrograph at the peak of this shock wave (middle of green glow) to learn more about its inner workings. This instrument breaks light apart into its basic components. Data from the instrument are referred to as spectra and are displayed as curving lines that indicate the amount of light coming at each specific wavelength. The Spitzer spectrum showed a strong infrared signature for incredibly turbulent gas made up of hydrogen molecules. This gas is caused when atoms of hydrogen rapidly pair-up to form molecules in the wake of the shock wave. Molecular hydrogen, unlike atomic hydrogen, gives off most of its energy through vibrations that emit in the infrared. This highly disturbed gas is the most turbulent molecular hydrogen ever seen. 
Astronomers were surprised not only by the turbulence

  17. THE 7 STAGE MODEL FOR FACILITATING MORAL CASE DELIBERATION IN HEALTH-CARE INSTITUTIONS: A PRACTICAL ILLUSTRATION OF A META-MODEL.

    Science.gov (United States)

    de Bree, Menno; Veening, Eite

    2016-01-01

    During a moral case deliberation session, health care professionals come together in order to reflect on a moral issue they have to deal with. Since the whole process of sorting out facts, identifying moral issues, formulating and weighing arguments, et cetera, can be quite complex, there should always be a well-trained facilitator present who safeguards the flow of the session. In order to train and to assess the quality of these facilitators, we developed the so-called 7 stage model of moral case deliberation. This model is a meta-model, describing all the stages and all the activities that take place during each MCD session, regardless of variables like the type of case that is being discussed, the number of participants, or the reflection method that is being applied. These 7 stages are: introduction, case selection, method selection, factual exploration, analysis, conclusion, and rounding off. The model makes it possible to describe all the tasks facilitators at least have to perform in order to steer their group through these stages in a fruitful and efficient way. It also makes it possible to identify the minimum generic competencies each facilitator should master in order to perform these tasks successfully. In this paper, we introduce the model (to our knowledge, the first of its kind), discuss some of the most important theoretical backgrounds, provide a theoretical justification, and above all, give a practical illustration of how the model can be applied when facilitating an MCD session.

  18. The Energetic Universe: a Nobel Surprise

    Science.gov (United States)

    Kirshner, Robert P.

    2015-01-01

    The history of cosmic expansion can be accurately traced using Type Ia supernovae (SN Ia) as standard candles. Over the past 40 years, this effort has improved its precision and extended its reach in redshift. Recently, the distances to SN Ia have been measured to a precision of ~5% using luminosity information that is encoded in the shape of the supernova's rest frame optical light curve. By combining observations of supernova distances as measured from their light curves and redshifts measured from spectra, we can detect changes in the cosmic expansion rate. This empirical approach was successfully exploited by the High-Z Supernova Team and by the Supernova Cosmology Project to detect cosmic acceleration and to infer the presence of dark energy. The 2011 Nobel Prize in Physics was awarded to Perlmutter, Schmidt and Riess for this discovery. The world's sample of well-observed SN Ia light curves at high redshift and low, approaching 1000 objects, is now large enough to make statistical errors due to sample size a thing of the past. Systematic errors are now the challenge. To learn the properties of dark energy and determine, for example, whether it has an equation-of-state that is different from the cosmological constant demands higher precision and better accuracy. The largest systematic uncertainties come from light curve fitters, photometric calibration errors, and from uncertain knowledge of the scattering properties of dust along the line of sight. Efforts to use SN Ia spectra as luminosity indicators have had some success, but have not yet produced a big step forward. Fortunately, observations of SN Ia in the near infrared (NIR), from 1 to 2 microns, offer a very promising path to better knowledge of the Hubble constant and to improved constraints on dark energy. In the NIR, SN Ia are better standard candles and the effects of dust absorption are smaller. 
We have begun an HST program dubbed RAISIN (SN IA in the IR) to tighten our grip on dark energy properties

  19. The Influence of Negative Surprise on Hedonic Adaptation

    Directory of Open Access Journals (Sweden)

    Ana Paula Kieling

    2016-01-01

    After some time using a product or service, the consumer tends to feel less pleasure with consumption. This reduction of pleasure is known as hedonic adaptation. One of the emotions that interferes in this process is surprise. Based on two experiments, we suggest that negative surprise, unlike positive surprise, influences the level of pleasure foreseen and experienced by the consumer. Study 1 analyzes the influence of negative (vs. positive) surprise on the consumer's post-purchase hedonic adaptation expectation. Results showed that negative surprise influences the intensity of adaptation, augmenting its strength. Study 2 verifies the influence of negative (vs. positive) surprise on hedonic adaptation. The findings suggested that negative surprise makes adaptation happen more intensely and faster as time goes by, which brings consequences for companies and consumers in the post-purchase process, such as satisfaction and loyalty.

  20. Parameter identification and global sensitivity analysis of Xinanjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng SONG

    2013-01-01

    Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification, which can identify the dominant parameters, reduce the model calibration uncertainty, and enhance the model optimization efficiency. There are, however, some shortcomings in classical approaches, including the long run times and high computational cost required to quantitatively assess the sensitivity of a multiple-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and then ten parameters were selected to quantify the sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity, but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
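
    The Morris screening step can be sketched as follows, with an assumed toy response standing in for the Xinanjiang model (the parameter count, step size, and number of trajectories are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for a hydrological model response (illustrative only).
def f(x):
    return 2.0 * x[0] + x[1]**2 + 0.01 * x[2]   # third input is nearly inactive

k, r, delta = 3, 30, 0.2   # dimensions, trajectories, step size

mu_star = np.zeros(k)      # mean absolute elementary effect per parameter
for _ in range(r):
    x = rng.uniform(0, 1 - delta, k)
    # One-at-a-time perturbations in random order: elementary effects.
    for i in rng.permutation(k):
        x_pert = x.copy()
        x_pert[i] += delta
        ee = (f(x_pert) - f(x)) / delta
        mu_star[i] += abs(ee) / r
        x = x_pert          # walk along the trajectory
print(mu_star)
```

    Parameters with large `mu_star` (here the first two) are the ones carried forward to the quantitative variance-based stage.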

  1. Robust optimization of aircraft weapon delivery trajectory using probability collectives and meta-modeling

    Institute of Scientific and Technical Information of China (English)

    Wang Nan; Shen Lincheng; Liu Hongfu; Chen Jing; Hu Tianjiang

    2013-01-01

    Conventional trajectory optimization techniques have been challenged by their inability to handle threats with irregular shapes and their tendency to be sensitive to control variations of the aircraft. Aiming to overcome these difficulties, this paper presents an alternative approach for trajectory optimization, where the problem is formulated into a parametric optimization of the maneuver variables under a tactics template framework. To reduce the size of the problem, global sensitivity analysis (GSA) is performed to identify the less-influential maneuver variables. The probability collectives (PC) algorithm, which is well-suited to discrete and discontinuous optimization, is applied to solve the trajectory optimization problem. The robustness of the trajectory is assessed through multiple sampling around the chosen values of the maneuver variables. Meta-models based on radial basis functions (RBF) are created for evaluations of the means and deviations of the problem objectives and constraints. To guarantee the approximation accuracy, the meta-models are adaptively updated during optimization. The proposed approach is demonstrated on a typical air-ground attack mission scenario. Results reveal that the proposed approach is capable of generating robust and optimal trajectories with both accuracy and efficiency.

  2. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    Science.gov (United States)

    Xie, Yanmin

    2011-08-01

    Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, complex shape forming, and so on). Optimization methods have also been widely applied in sheet metal forming. Therefore, proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, the existence of variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software (LS_DYNA) is used to simulate the complex sheet metal stamping processes. The Kriging metamodel is adopted to map the relation between input process parameters and part quality. Robust design models for the sheet metal forming process integrate adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to provide the direction in which additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the feasibility of applying the proposed method to multi-response robust design.
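
    A bare-bones sketch of the kriging ingredient (a zero-mean, noise-free interpolator with a Gaussian correlation model; the response function and the hyperparameter `theta` are illustrative assumptions, not from the paper):

```python
import numpy as np

# Hypothetical stamping response (illustrative): quality vs. one process parameter.
def quality(x):
    return np.sin(2 * np.pi * x) * np.exp(-x)

X = np.linspace(0, 1, 8)          # training designs (e.g. from a small DOE)
y = quality(X)

# Simple-kriging-style predictor with a Gaussian correlation model.
theta, nugget = 10.0, 1e-10
def corr(a, b):
    return np.exp(-theta * (a[:, None] - b[None, :])**2)

R = corr(X, X) + nugget * np.eye(X.size)
alpha = np.linalg.solve(R, y)     # kriging weights from the correlation matrix

def predict(x_new):
    return corr(np.atleast_1d(x_new), X) @ alpha

x0 = 0.37
print(float(predict(x0)[0]), quality(x0))
```

    At the training designs the predictor interpolates exactly (up to the tiny nugget); between them it returns a smooth, kernel-weighted estimate.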

  3. Effects of Surprisal and Locality on Danish Sentence Processing

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Kizach, Johannes

    2017-01-01

    An eye-tracking experiment in Danish investigates two dominant accounts of sentence processing: locality-based theories that predict a processing advantage for sentences where the distance between the major syntactic heads is minimized, and the surprisal theory which predicts that processing time...... constructions with two postverbal NP-objects. An eye-tracking experiment showed a clear advantage for local syntactic relations, with only a marginal effect of lexicalised surprisal and no effect of syntactic surprisal. We conclude that surprisal has a relatively marginal effect, which may be clearest for verbs...

  4. Surprise and Sense Making: Undergraduate Placement Experiences in SMEs

    Science.gov (United States)

    Walmsley, Andreas; Thomas, Rhodri; Jameson, Stephanie

    2006-01-01

    Purpose: This paper seeks to explore undergraduate placement experiences in tourism and hospitality SMEs, focusing on the notions of surprise and sense making. It aims to argue that surprises and sense making are important elements not only of the adjustment process when entering new work environments, but also of the learning experience that…

  5. Neural Responses to Rapid Facial Expressions of Fear and Surprise

    Directory of Open Access Journals (Sweden)

    Ke Zhao

    2017-05-01

    Facial expression recognition is mediated by a distributed neural system in humans that involves multiple, bilateral regions. There are six basic facial expressions that may be recognized in humans (fear, sadness, surprise, happiness, anger, and disgust); however, fearful faces and surprised faces are easily confused in rapid presentation. The functional organization of the facial expression recognition system embodies a distinction between these two emotions, which is investigated in the present study. A core system that includes the right parahippocampal gyrus (BA 30), fusiform gyrus, and amygdala mediates the visual recognition of fear and surprise. We found that fearful faces evoked greater activity in the left precuneus, middle temporal gyrus (MTG), middle frontal gyrus, and right lingual gyrus, whereas surprised faces were associated with greater activity in the right postcentral gyrus and left posterior insula. These findings indicate the importance of common and separate mechanisms of the neural activation that underlies the recognition of fearful and surprised faces.

  6. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    During the development of software, one of the most visible risks, and perhaps the biggest implementation obstacle, relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
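
    A toy illustration of metamodel-driven code generation (the metamodel format and the template here are invented for this sketch; the paper's metamodel is considerably richer):

```python
# A class description (the "metamodel" instance) is turned into Python source
# by a template, so the developer never hand-writes the boilerplate.
CLASS_TEMPLATE = '''class {name}:
    def __init__(self, {args}):
{assignments}
'''

def generate_class(meta):
    args = ", ".join(meta["attributes"])
    assignments = "\n".join(
        f"        self.{a} = {a}" for a in meta["attributes"]
    )
    return CLASS_TEMPLATE.format(name=meta["name"], args=args,
                                 assignments=assignments)

customer_meta = {"name": "Customer", "attributes": ["id", "email"]}
source = generate_class(customer_meta)
print(source)

# The generated artifact is ordinary source code and can be executed directly.
namespace = {}
exec(source, namespace)
c = namespace["Customer"](1, "a@b.c")
print(c.email)
```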

  7. Polynomial meta-models with canonical low-rank approximations: Numerical insights and comparison to sparse polynomial chaos expansions

    Science.gov (United States)

    Konakli, Katerina; Sudret, Bruno

    2016-09-01

    The growing need for uncertainty analysis of complex computational models has led to an expanding use of meta-models across engineering and sciences. The efficiency of meta-modeling techniques relies on their ability to provide statistically-equivalent analytical representations based on relatively few evaluations of the original model. Polynomial chaos expansions (PCE) have proven a powerful tool for developing meta-models in a wide range of applications; the key idea thereof is to expand the model response onto a basis made of multivariate polynomials obtained as tensor products of appropriate univariate polynomials. The classical PCE approach nevertheless faces the "curse of dimensionality", namely the exponential increase of the basis size with increasing input dimension. To address this limitation, the sparse PCE technique has been proposed, in which the expansion is carried out on only a few relevant basis terms that are automatically selected by a suitable algorithm. An alternative for developing meta-models with polynomial functions in high-dimensional problems is offered by the newly emerged low-rank approximations (LRA) approach. By exploiting the tensor-product structure of the multivariate basis, LRA can provide polynomial representations in highly compressed formats. Through extensive numerical investigations, we herein first shed light on issues relating to the construction of canonical LRA with a particular greedy algorithm involving a sequential updating of the polynomial coefficients along separate dimensions. Specifically, we examine the selection of optimal rank, stopping criteria in the updating of the polynomial coefficients and error estimation. In the sequel, we confront canonical LRA to sparse PCE in structural-mechanics and heat-conduction applications based on finite-element solutions. 
Canonical LRA exhibit smaller errors than sparse PCE in cases when the number of available model evaluations is small with respect to the input dimension, a

  8. KRIGING-HDMR METAMODELING TECHNIQUE FOR NONLINEAR PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    汤龙; 李光耀; 王琥

    2011-01-01

    Some large-scale structural engineering problems need to be solved by metamodels. With increasing complexity and dimensionality, metamodeling techniques confront two major challenges. First, the number of sample points must increase exponentially as the number of design variables increases. Second, it is difficult for popular metamodeling techniques to give explicit correlation relationships amongst design variables. Therefore, a new high-dimensional model representation (HDMR) based on Kriging interpolation, Kriging-HDMR, is suggested in this paper. The most remarkable advantage of this method is its capacity to exploit relationships among variables of the underlying function. Furthermore, Kriging-HDMR can reduce the corresponding computational cost from exponential growth to polynomial level. Thus, the essence of the assigned problem can be captured efficiently. To prove the feasibility of this method, several high-dimensional and nonlinear functions are tested. The algorithm is also applied to a simple engineering problem. Compared with classical metamodeling techniques, both efficiency and accuracy are improved.

  9. Calibration of forcefields for molecular simulation: sequential design of computer experiments for building cost-efficient kriging metamodels.

    Science.gov (United States)

    Cailliez, Fabien; Bourasseau, Arnaud; Pernot, Pascal

    2014-01-15

    We present a global strategy for molecular simulation forcefield optimization, using recent advances in Efficient Global Optimization algorithms. During the course of the optimization process, probabilistic kriging metamodels are used that predict molecular simulation results for a given set of forcefield parameter values. This enables a thorough investigation of parameter space, and a global search for the minimum of a score function by properly integrating relevant uncertainty sources. Additional information about the forcefield parameters is obtained that is inaccessible with standard optimization strategies. In particular, uncertainty on the optimal forcefield parameters can be estimated, and transferred to simulation predictions. This global optimization strategy is benchmarked on the TIP4P water model.
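
    The Efficient Global Optimization machinery referenced above revolves around the Expected Improvement criterion; a self-contained sketch in the minimisation convention (the numeric inputs are illustrative):

```python
import math

# Expected Improvement for minimisation, as used by EGO-type algorithms:
# EI(x) = (y_best - mu) * Phi(z) + sigma * phi(z),  z = (y_best - mu) / sigma,
# where mu and sigma come from the kriging prediction at x.
def expected_improvement(mu, sigma, y_best):
    if sigma <= 0.0:
        return max(y_best - mu, 0.0)
    z = (y_best - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # std normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))          # std normal cdf
    return (y_best - mu) * Phi + sigma * phi

# A candidate predicted slightly worse than the incumbent but very uncertain
# can outscore one predicted marginally better but essentially certain.
print(expected_improvement(mu=1.1, sigma=0.5, y_best=1.0))
print(expected_improvement(mu=0.99, sigma=1e-6, y_best=1.0))
```

    EI trades off exploitation (low predicted mean) against exploration (high kriging variance), which is what lets the strategy search parameter space globally rather than refine a single basin.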

  11. A meta-modelling strategy to identify the critical offshore conditions for coastal flooding

    Directory of Open Access Journals (Sweden)

    J. Rohmer

    2012-09-01

    High water level at the coast may be the result of different combinations of offshore hydrodynamic conditions (e.g. wave characteristics, offshore water level, etc.). Providing the contour of the "critical" set of offshore conditions leading to high water level is of primary importance, either to constrain early warning networks based on hydro-meteorological forecasts or observations, or for the assessment of the coastal flood hazard return period. The challenge arises from the use of computationally intensive simulators, which prevents the application of a grid approach consisting in extracting the contour through systematic evaluation of the simulator on a fine design grid defined in the offshore conditions domain. To overcome this computational difficulty, we propose a strategy based on the kriging meta-modelling technique combined with an adaptive sampling procedure aiming at improving the local accuracy in the regions of the offshore conditions that contribute the most to the estimate of the targeted contour. This methodology is applied to two cases using an idealized site on the Mediterranean coast (southern France): (1) a two-dimensional case, to ease the visual analysis, aiming at identifying the critical combinations of offshore water level and significant wave height; (2) a more complex case aiming at identifying four offshore conditions (offshore water level and offshore wave height, direction and period). By using a simulator of moderate computation time cost (a few tens of minutes), the targeted contour can be estimated using a cluster composed of a moderate number of computer units. This reference contour is then compared with the results of the meta-model-based strategy. In both cases, we show that the critical offshore conditions can be estimated with a good level of accuracy using a very limited number (a few tens) of computationally intensive hydrodynamic simulations.

  12. FLCNDEMF: An Event Metamodel for Flood Process Information Management under the Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Nengcheng Chen

    2015-06-01

    Full Text Available Significant economic losses, large affected populations, and serious environmental damage caused by recurrent natural disaster events (NDE) worldwide indicate insufficient emergency preparedness and response. The barrier of full life cycle data preparation and information support is one of the main reasons. This paper adopts the method of integrated environmental modeling, incorporates information from existing event protocols, languages, and models, analyzes observation demands from different event stages, and forms the abstract full life cycle natural disaster event metamodel (FLCNDEM) based on the meta-object facility. A task library and knowledge base for floods are then built to instantiate FLCNDEM, forming the FLCNDEM for floods (FLCNDEMF). FLCNDEMF is formalized according to the Event Pattern Markup Language, and a prototype system, Natural Disaster Event Manager, is developed to assist in template-based modeling and management. The flood in Liangzi (LZ) Lake of Hubei, China on 16 July 2010 is adopted to illustrate how to apply FLCNDEM in real scenarios. FLCNDEM-based modeling is realized, and candidate remote sensing (RS) datasets for different observing missions are provided for the LZ Lake flood. Taking the mission of flood area extraction as an example, the appropriate RS data are selected via the simplified general perturbation version 4 model, and the flood areas in different phases are calculated and displayed on the map. The phase-based modeling and visualization intuitively display the spatial-temporal distribution and the evolution process of the LZ Lake flood, which is of great significance for flood response. In addition, through its extension mechanism, FLCNDEM can also be applied in other environmental applications, providing important support for full life cycle information sharing and rapid response.

  13. "EII META-MODEL" ON INTEGRATION FRAMEWORK FOR VIABLE ENTERPRISE SYSTEMS - CITY PLANNING METAPHOR BASED ON STRUCTURAL SIMILARITY

    Institute of Scientific and Technical Information of China (English)

    Yukio NAMBA; Junichi IIJIMA

    2003-01-01

    Enterprise systems must have the structure to adapt to changes in the business environment. When rebuilding an enterprise system to meet extended operational boundaries, the concept of IT city planning is applicable and effective. The aim of this paper is to describe the architectural approach from the integrated information infrastructure (In3) standpoint and to propose applying the "City Planning" concept to rebuilding "inter-application spaghetti" enterprise systems. This is mainly because the portion of infrastructure has increased with the change of information systems from centralized systems to distributed and open systems. As enterprise systems have involved heterogeneity or architectural black boxes, an integration framework (meta-architecture) may be required as a discipline based on heterogeneity that can provide a comprehensive view of the enterprise systems. This paper proposes the "EII Meta-model" as an integration framework that can optimize the overall enterprise systems from the IT city planning point of view. The EII Meta-model consists of the "Integrated Information Infrastructure Map (In3-Map)", "Service Framework" and "IT Scenario". It would be applicable and effective for the viable enterprise, because it has the mechanism to adapt to change. Finally, we illustrate a case of an information system in an online securities company and demonstrate the applicability and effectiveness of the EII Meta-model in meeting their business goals.

  14. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.

    Directory of Open Access Journals (Sweden)

    J A Rincon

    Full Text Available This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a coupled real and virtual world, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system.

  15. The tetrahedron of knowledge acquisition : A meta-model of the relations among observation, conceptualization, evaluation and action in the research on socio-ecological systems

    NARCIS (Netherlands)

    Nijland, G.O.

    2002-01-01

    This paper presents a meta-model which integrates different approaches in the research on socio-ecological systems. The relations between hypothetical-deductive, empirical-inductive and interpretive-phenomenological holistic research approaches are visualized schematically together with their interrelations.

  16. Defense Science Board (DSB) Summer Study Report on Strategic Surprise

    Science.gov (United States)

    2015-07-01

    DSB Summer Study Report on Strategic Surprise, July 2015. Performing organization: Defense Science Board (DSB), The Pentagon, OUSD(AT&L).

  17. Providing the meta-model of development of competency using the meta-ethnography approach: Part 2. Synthesis of the available competency development models

    Directory of Open Access Journals (Sweden)

    Somayeh Akbari Farmad

    2016-06-01

    Full Text Available Background and Purpose: Considering the importance and necessity of competency-based education at a global level, and with respect to globalization and the requirement of minimum competencies in medical fields, medical education communities and organizations worldwide have tried to determine competencies and to present frameworks and education models that ensure the ability of all graduates. In the literature, we observed numerous competency development models that refer to the same issues with different terminologies. It seems that evaluation and synthesis of all these models can finally result in designing a comprehensive meta-model for competency development. Methods: Meta-ethnography is a useful method for the synthesis of qualitative research that is used to develop models interpreting the results of several studies. Since the aim of this study is to ultimately provide a competency development meta-model, in the previous part of the study a literature review was conducted to identify competency development models. The models obtained through the search were studied in detail; in this part, the key concepts of the models and the overarching concepts were extracted, the models' concepts were reciprocally translated, and the available competency development models were synthesized. Results: A competency development meta-model is presented, together with a redefinition of the Dreyfus brothers' model. Conclusions: Given the importance of competency-based education at a global level and the need to review curricula and competency-based curriculum design, competency development models and a meta-model are required as a basis for curriculum development. As there is a variety of competency development models available, this study attempted to synthesize them. Keywords: Meta-ethnography, Competency development, Meta-model, Qualitative synthesis

  18. Meta-modeling of methylprednisolone effects on glucose regulation in rats.

    Directory of Open Access Journals (Sweden)

    Jing Fang

    Full Text Available A retrospective meta-modeling analysis was performed to integrate previously reported data of glucocorticoid (GC) effects on glucose regulation following a single intramuscular dose (50 mg/kg), single intravenous doses (10, 50 mg/kg), and intravenous infusions (0.1, 0.2, 0.3 and 0.4 mg/kg/h) of methylprednisolone (MPL) in normal and adrenalectomized (ADX) male Wistar rats. A mechanistic pharmacodynamic (PD) model was developed based on the receptor/gene/protein-mediated GC effects on glucose regulation. Three major target organs (liver, white adipose tissue and skeletal muscle) together with some selected intermediate controlling factors were designated as important regulators involved in the pathogenesis of GC-induced glucose dysregulation. Assessed were dynamic changes of food intake and systemic factors (plasma glucose, insulin, free fatty acids (FFA) and leptin) and tissue-specific biomarkers (cAMP, phosphoenolpyruvate carboxykinase (PEPCK) mRNA and enzyme activity, leptin mRNA, interleukin 6 receptor type 1 (IL6R1) mRNA and insulin receptor substrate-1 (IRS-1) mRNA) after acute and chronic dosing with MPL, along with the GC receptor (GR) dynamics in each target organ. Upon binding to GR in liver, MPL dosing caused increased glucose production by stimulating hepatic cAMP and PEPCK activity. In adipose tissue, the rise in leptin mRNA and plasma leptin caused reduction of food intake, the exogenous source of glucose input. Down-regulation of IRS-1 mRNA expression in skeletal muscle inhibited the stimulatory effect of insulin on glucose utilization, further contributing to hyperglycemia. The nuclear drug-receptor complex served as the driving force for stimulation or inhibition of downstream target gene expression within different tissues. Incorporating information such as receptor dynamics, as well as gene and protein induction, allowed us to describe the receptor-mediated effects of MPL on glucose regulation in each important tissue. This advanced
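As a rough illustration of the receptor-mediated mechanism described above (drug binds receptor, and the drug-receptor complex drives target gene expression above its baseline), a generic indirect-response sketch can be written as follows. This is not the published MPL model: the one-compartment kinetics, all rate constants, and the stimulation function are invented for illustration.

```python
import numpy as np

# Minimal indirect-response sketch: drug concentration C(t) decays
# mono-exponentially; the drug-receptor complex DR stimulates target
# mRNA production above baseline (all parameters hypothetical).
def simulate(dose=50.0, ke=0.3, kon=0.02, koff=0.1, r_tot=1.0,
             k_in=1.0, k_out=1.0, s_max=5.0, sc50=0.2,
             t_end=48.0, dt=0.01):
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n)
    C = dose * np.exp(-ke * t)          # plasma drug concentration
    DR, m = 0.0, k_in / k_out           # complex, mRNA at baseline
    m_traj = np.empty(n)
    for i in range(n):                  # simple Euler integration
        free_R = r_tot - DR
        dDR = kon * C[i] * free_R - koff * DR
        stim = 1.0 + s_max * DR / (sc50 + DR)
        dm = k_in * stim - k_out * m
        DR += dDR * dt
        m += dm * dt
        m_traj[i] = m
    return t, m_traj

t, m = simulate()
# mRNA rises while the complex is abundant, then returns to baseline
# as the drug is cleared and the complex dissociates.
```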

  19. A META-MODELLING SERVICE PARADIGM FOR CLOUD COMPUTING AND ITS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    F. Cheng

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Service integrators seek opportunities to align the way they manage resources in the service supply chain. Many business organisations can operate new, more flexible business processes that harness the value of a service approach from the customer’s perspective. As a relatively new concept, cloud computing and related technologies have rapidly gained momentum in the IT world. This article seeks to shed light on service supply chain issues associated with cloud computing by examining several interrelated questions: service supply chain architecture from a service perspective; the basic clouds of the service supply chain; managerial insights into these clouds; and the commercial value of implementing cloud computing. In particular, to show how these services can be used and how they are involved in utilisation processes, a hypothetical meta-modelling service for cloud computing is proposed. Moreover, the paper defines the managed cloud architecture for a service vendor or service integrator in the cloud computing infrastructure of the service supply chain: IT services, business services, and business processes, which create atomic and composite software services used to perform business processes with business service choreographies.

    AFRIKAANS ABSTRACT (translated): Service integrators are looking for opportunities to align the management of resources in the service supply chain. Many organisations can use new, more flexible business processes that harness the value of a service approach from the customer's perspective. As a relatively new concept, cloud computing and related technologies have rapidly gained momentum in the IT world. The article seeks to shed light on service supply chain issues related to cloud computing by examining several related questions: service supply chain architecture from a service perspective; the basic clouds of the service supply chain; management insights into such clouds; and the commercial value of the implementation of

  20. A Reference Architecture for Provisioning of Tools as a Service: Meta-Model, Ontologies and Design Elements

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Sheng, Quan Z.

    2016-01-01

    Software Architecture (SA) plays a critical role in designing, developing and evolving cloud-based platforms that can be used to provision different types of services to consumers on demand. In this paper, we present a Reference Architecture (RA) for designing a cloud-based Tools as a Service SPACE (TSPACE) for provisioning a bundled suite of tools following the Software as a Service (SaaS) model. The reference architecture has been designed by leveraging information structuring approaches and by using well-known architecture design principles and patterns. The RA has been documented using a view-based approach and has been presented in terms of its context, goals, the RA meta-model, information structuring and relationship models using ontologies, and the components of the RA. We have demonstrated the feasibility and applicability of the RA with the help of a prototype, and have used the prototype

  1. LMS Transitioning to "Moodle": A Surprising Case of Successful, Emergent Change Management

    Science.gov (United States)

    Lawler, Alan

    2011-01-01

    During 2009-10 the University of Ballarat implemented the open-source learning management system (LMS) "Moodle" alongside its existing legacy LMS, "Blackboard". While previous IT implementations have been troublesome at the university, notably the student information and finance management systems in 2008-09, the…

  2. Avoiding surprises when implementing a single quality system.

    Science.gov (United States)

    Donawa, Maria

    2009-01-01

    European medical device manufacturers are sometimes surprised to learn that operating ISO 13485 alone is not sufficient to meet United States (US) quality system requirements. This article discusses important considerations for meeting US and European requirements when operating under a single quality system.

  3. Reconsiderations: Donald Murray and the Pedagogy of Surprise

    Science.gov (United States)

    Ballenger, Bruce

    2008-01-01

    Toward the end of his life, Donald Murray felt that his approach to writing instruction was no longer appreciated by journals in his field. Nevertheless, his emphasis on encouraging students to surprise themselves through informal writing still has considerable value. (Contains 1 note.)

  5. Errors and surprise in patients with focal brain lesions

    NARCIS (Netherlands)

    Ullsperger, M.

    2016-01-01

    Recent theories of performance monitoring suggest that not only errors and negative action outcomes but also valence-free expectancy violations can trigger cognitive and behavioral adaptations. EEG and fMRI evidence suggests that monitoring of both errors and surprising but valence-free action

  6. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    …pre-defined and standardized process assets that can be reused, modified, and extended using a well-defined customization approach. Hence, process engineers can ground context-specific process variants in a standardized or domain-specific reference model that can be adapted to the respective context. We present an approach to construct flexible SPrLs and show its practical application in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], which was published as an original research article in the Journal of Systems and Software.

  7. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes, by combining pre-defined and standardized process assets that can be reused, modified, and extended using a well-defined customization approach. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], which was published as an original research article in the Journal of Systems and Software.

  8. Sleeping beauties in theoretical physics 26 surprising insights

    CERN Document Server

    Padmanabhan, Thanu

    2015-01-01

    This book addresses a fascinating set of questions in theoretical physics which will both entertain and enlighten students, teachers, researchers and other physics aficionados. These range from Newtonian mechanics to quantum field theory and cover several puzzling issues that do not appear in standard textbooks. Some topics cover conceptual conundrums, the solutions to which lead to surprising insights; some correct popular misconceptions in the textbook discussion of certain topics; others illustrate deep connections between apparently unconnected domains of theoretical physics; and a few provide remarkably simple derivations of results which are not often appreciated. The connoisseur of theoretical physics will enjoy a feast of pleasant surprises skilfully prepared by an internationally acclaimed theoretical physicist. Each topic is introduced with proper background discussion and special effort is taken to make the discussion self-contained, clear and comprehensible to anyone with an undergraduate e...

  9. The June surprises: balls, strikes, and the fog of war.

    Science.gov (United States)

    Fried, Charles

    2013-04-01

    At first, few constitutional experts took seriously the argument that the Patient Protection and Affordable Care Act exceeded Congress's power under the commerce clause. The highly political opinions of two federal district judges - carefully chosen by challenging plaintiffs - of no particular distinction did not shake that confidence that the act was constitutional. This disdain for the challengers' arguments was only confirmed when the act was upheld by two highly respected conservative court of appeals judges in two separate circuits. But after the hostile, even mocking questioning of the government's advocate in the Supreme Court by the five Republican-appointed justices, the expectation was that the act would indeed be struck down on that ground. So it came as no surprise when the five opined the act did indeed exceed Congress's commerce clause power. But it came as a great surprise when Chief Justice John Roberts, joined by the four Democrat-appointed justices, ruled that the act could be sustained as an exercise of Congress's taxing power - a ground urged by the government almost as an afterthought. It was further surprising, even shocking, that Justices Antonin Scalia, Anthony Kennedy, Clarence Thomas, and Samuel Alito not only wrote a joint opinion on the commerce clause virtually identical to that of their chief, but that in writing it they did not refer to or even acknowledge his opinion. Finally surprising was the fact that Justices Ruth Bader Ginsburg and Stephen Breyer joined the chief in holding that aspects of the act's Medicaid expansion were unconstitutional. This essay ponders and tries to unravel some of these puzzles.

  10. Surprise and Opportunity for Learning in Grand Canyon: the Glen Canyon Dam Adaptive Management Program

    Directory of Open Access Journals (Sweden)

    Theodore S. Melis

    2015-09-01

    Full Text Available With a focus on resources of the Colorado River ecosystem below Glen Canyon Dam, the Glen Canyon Dam Adaptive Management Program has included a variety of experimental policy tests, ranging from manipulation of water releases from the dam to removal of non-native fish within Grand Canyon National Park. None of these field-scale experiments has yet produced unambiguous results in terms of management prescriptions. But there has been adaptive learning, mostly from unanticipated or surprising resource responses relative to predictions from ecosystem modeling. Surprise learning opportunities may often be viewed with dismay by some stakeholders who might not be clear about the purpose of science and modeling in adaptive management. However, the experimental results from the Glen Canyon Dam program actually represent scientific successes in terms of revealing new opportunities for developing better river management policies. A new long-term experimental management planning process for Glen Canyon Dam operations, started in 2011 by the U.S. Department of the Interior, provides an opportunity to refocus management objectives, identify and evaluate key uncertainties about the influence of dam releases, and refine monitoring for learning over the next several decades. Adaptive learning since 1995 is critical input to this long-term planning effort. Embracing uncertainty and surprise outcomes revealed by monitoring and ecosystem modeling will likely continue the advancement of resource objectives below the dam, and may also promote efficient learning in other complex programs.

  11. Upscaling NZ-DNDC using a regression based meta-model to estimate direct N2O emissions from New Zealand grazed pastures.

    Science.gov (United States)

    Giltrap, Donna L; Ausseil, Anne-Gaëlle E

    2016-01-01

    The availability of detailed input data frequently limits the application of process-based models at large scale. In this study, we produced simplified meta-models of the simulated nitrous oxide (N2O) emission factors (EF) using NZ-DNDC. Monte Carlo simulations were performed and the results investigated using multiple regression analysis to produce simplified meta-models of the EF. These meta-models were then used to estimate direct N2O emissions from grazed pastures in New Zealand. New Zealand EF maps were generated using the meta-models with data from national-scale soil maps. Direct emissions of N2O from grazed pasture were calculated by multiplying the EF map with a nitrogen (N) input map. Three meta-models were considered: Model 1 included only the soil organic carbon in the top 30 cm (SOC30), Model 2 also included a clay content factor, and Model 3 added the interaction between SOC30 and clay. The median annual national direct N2O emissions from grazed pastures estimated using each model (assuming model errors were purely random) were 9.6 Gg N (Model 1), 13.6 Gg N (Model 2), and 11.9 Gg N (Model 3). These values correspond to average EFs of 0.53%, 0.75% and 0.63%, respectively, while the corresponding average EF using New Zealand national inventory values is 0.67%. If the model error can be assumed to be independent for each pixel, then the 95% confidence interval for the N2O emissions is of the order of ±0.4-0.7%, which is much lower than for existing methods. However, spatial correlations in the model errors could invalidate this assumption. Under the extreme assumption that the model error for each pixel is identical, the 95% confidence interval is approximately ±100-200%. Therefore further work is needed to assess the degree of spatial correlation in the model errors.
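A Model-3-style meta-model (intercept, SOC30, clay, and their interaction) applied pixel-wise and then multiplied by an N-input map can be sketched as below. The coefficients and the example pixel values are hypothetical, not the published regression.

```python
import numpy as np

# Hypothetical coefficients for a Model-3-style meta-model:
#   EF(%) = b0 + b1*SOC30 + b2*clay + b3*SOC30*clay
# (the published regression coefficients are not reproduced here).
B = np.array([0.30, 0.004, 0.002, 0.00005])

def emission_factor(soc30, clay):
    """EF (%) per pixel from SOC30 (e.g. t C/ha) and clay content (%)."""
    soc30, clay = np.asarray(soc30, float), np.asarray(clay, float)
    X = np.stack([np.ones_like(soc30), soc30, clay, soc30 * clay])
    return np.tensordot(B, X, axes=1)

# National total: sum over pixels of EF * N input.
soc = np.array([60.0, 90.0, 120.0])          # example pixel SOC30 values
clay = np.array([15.0, 25.0, 35.0])          # example pixel clay (%)
n_input = np.array([200.0, 150.0, 100.0])    # N input per pixel (kg N)
direct_n2o = np.sum(emission_factor(soc, clay) / 100.0 * n_input)
```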

  12. Metamodeling as a tool to size vegetative filter strips for surface runoff pollution control in European watersheds.

    Science.gov (United States)

    Lauvernet, Claire; Muñoz-Carpena, Rafael; Carluer, Nadia

    2015-04-01

    … influence and interactions, and set priorities for data collection and management. Based on the GSA results, we compared several mathematical methods to compute the metamodel, and then validated it on an agricultural watershed with real data in north-west France. The analysis procedure yields a robust and validated metamodel, before extending it to other climatic conditions in order to make application to a large range of European watersheds possible. The tool will allow comparison of field scenarios, and will help validate and improve existing VFS placements and sizing.

  13. Estimations of expectedness and potential surprise in possibility theory

    Science.gov (United States)

    Prade, Henri; Yager, Ronald R.

    1992-01-01

    This note investigates how various ideas of 'expectedness' can be captured in the framework of possibility theory. Particularly, we are interested in trying to introduce estimates of the kind of lack of surprise expressed by people when saying 'I would not be surprised that...' before an event takes place, or by saying 'I knew it' after its realization. In possibility theory, a possibility distribution is supposed to model the relative levels of mutually exclusive alternatives in a set, or equivalently, the alternatives are assumed to be rank-ordered according to their level of possibility to take place. Four basic set-functions associated with a possibility distribution, including standard possibility and necessity measures, are discussed from the point of view of what they estimate when applied to potential events. Extensions of these estimates based on the notions of Q-projection or OWA operators are proposed when only significant parts of the possibility distribution are retained in the evaluation. The case of partially-known possibility distributions is also considered. Some potential applications are outlined.
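The two basic set-functions discussed — the possibility measure Π(A) = max of π over A, and the necessity measure N(A) = 1 − Π(not A) — are easy to state in code, and "potential surprise" of an event can then be read as 1 − Π(A). A small sketch (the distribution and the event are invented for illustration, not taken from the paper):

```python
# Possibility and necessity of an event A, given a possibility
# distribution pi over mutually exclusive alternatives.
def possibility(pi, A):
    """Pi(A) = max of pi over the alternatives in A."""
    return max(pi[x] for x in A) if A else 0.0

def necessity(pi, A):
    """N(A) = 1 - Pi(not A): the certainty that A occurs."""
    complement = set(pi) - set(A)
    return 1.0 - possibility(pi, complement) if complement else 1.0

# "I would not be surprised that x" ~ pi(x) is high;
# the potential surprise of A can be read as 1 - Pi(A).
pi = {"sunny": 1.0, "cloudy": 0.7, "rain": 0.3, "snow": 0.0}
A = {"rain", "snow"}
print(possibility(pi, A))       # 0.3
print(necessity(pi, A))         # 0.0  (the complement is fully possible)
print(1 - possibility(pi, A))   # potential surprise of A: 0.7
```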

  14. 10 years of surprises at Saturn: CAPS and INMS highlights

    Science.gov (United States)

    Coates, A. J.; Waite, J. H.

    2014-04-01

    The Cassini mission at Saturn has provided many surprises on Saturn's rapidly rotating magnetosphere and its interaction with the diverse moons, as well as its interaction with the solar wind. One of the early discoveries was the water-rich composition of the magnetosphere. Its structure and dynamics indicate remarkable injections, periodicities and interchange events. Enceladus, orbiting at 4 RS, was found to have plumes of water vapour and ice which are the dominant source for the inner magnetosphere. Charged water clusters, charged dust and photoelectrons provide key populations in the 'dusty plasma' seen here, as well as chemical complexity in the plume material. Direct pickup is seen near Enceladus and field aligned currents create a spot in Saturn's aurora. At Titan, orbiting at 20 RS, heavy negative and positive ions are seen in the ionosphere, as well as neutrals, all of which have surprising chemical complexity. These provide the source for Titan's haze. Ionospheric plasma is seen in Titan's tail, enabling ion escape to be estimated at 7 tonnes per day. Saturn's ring ionosphere was seen early in the mission, which was oxygen rich and produced photoelectrons; a return will be made in 2017. At Rhea, pickup positive and negative ions indicated weak atmospheres sustained by energetic particle impact, seen in the neutrals also. A weak atmosphere was also seen at Dione. The exosphere production process operates at Jupiter's moons also. Here we review some of the key new results, and discuss the implications for other solar system contexts.

  15. A meta-model analysis of a finite element simulation for defining poroelastic properties of intervertebral discs.

    Science.gov (United States)

    Nikkhoo, Mohammad; Hsu, Yu-Chun; Haghpanahi, Mohammad; Parnianpour, Mohamad; Wang, Jaw-Lin

    2013-06-01

    Finite element analysis is an effective tool to evaluate the material properties of living tissue. For an interactive optimization procedure, the finite element analysis usually needs many simulations to reach a reasonable solution. The meta-model analysis of finite element simulation can be used to reduce the computation for a structure with complex geometry or a material with composite constitutive equations. The intervertebral disc is a complex, heterogeneous, and hydrated porous structure. A poroelastic finite element model can be used to observe the fluid transfer, pressure deviation, and other properties within the disc. Defining reasonable poroelastic material properties of the anulus fibrosus and nucleus pulposus is critical for the quality of the simulation. We developed a material property updating protocol, which is basically a fitting algorithm consisting of finite element simulations and a quadratic response surface regression. This protocol was used to find the material properties, such as the hydraulic permeability, elastic modulus, and Poisson's ratio, of intact and degenerated porcine discs. The results showed that the in vitro disc experimental deformations were well fitted with a limited number of finite element simulations and a quadratic response surface regression. The comparison of material properties of intact and degenerated discs showed that the hydraulic permeability significantly decreased but Poisson's ratio significantly increased for the degenerated discs. This study shows that the developed protocol is efficient and effective in defining material properties of a complex structure such as the intervertebral disc.
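The updating protocol — sample parameter sets, run the finite element model at each, regress a quadratic response surface on the results, and invert it against the measured deformation — can be sketched with a cheap analytic stand-in for the FE simulation. Everything here (the stand-in response, parameter ranges, sample size) is an assumption for illustration, not the authors' implementation.

```python
import numpy as np

def fe_simulation(params):
    """Stand-in for the FE model: deformation from (permeability, modulus)."""
    k, E = params
    return 2.0 / E + 0.5 * np.log10(k)       # arbitrary smooth response

def quad_features(P):
    """Quadratic response surface design matrix in two parameters."""
    k, E = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(k), k, E, k * k, E * E, k * E])

# "Experiment": deformation measured for an unknown parameter set.
measured = fe_simulation(np.array([3.0, 1.5]))

rng = np.random.default_rng(1)
P = rng.uniform([1.0, 1.0], [5.0, 3.0], size=(30, 2))   # sampled parameters
d = np.array([fe_simulation(p) for p in P])             # "FE runs"

# Fit the quadratic response surface by least squares.
beta, *_ = np.linalg.lstsq(quad_features(P), d, rcond=None)

# Invert: search the cheap surface for parameters matching the measurement.
grid = np.stack(np.meshgrid(np.linspace(1, 5, 200),
                            np.linspace(1, 3, 200)), -1).reshape(-1, 2)
pred = quad_features(grid) @ beta
best = grid[np.argmin(np.abs(pred - measured))]          # fitted parameters
```

The point of the meta-model is that the dense grid search runs on the regression surface, not on the expensive simulator, so only the 30 sampled runs are costly.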

  16. Measured Zero Net Energy Performance: Results, Lessons, and Surprises

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Carrie; LaRue, Anna; Pigman, Margaret; Roberts, Jon; Kaneda, David; Connelly, Dylan; Elliott, John; Pless, Shanti; Pande, Abhijeet; Dean, Edward; Anbarlilar, Can

    2016-08-26

    As more and more zero net energy (ZNE) buildings are built and monitored, we can learn from both careful case studies of individual projects and a broader perspective of trends over time. In a forum sponsored by Pacific Gas and Electric Company (PG&E), eight expert speakers discussed: results and lessons from monitoring occupied ZNE buildings; best practices for setting performance targets and getting actionable performance information; and things that have surprised them about monitored ZNE buildings. This paper distills the content of the forum by laying out the most common hurdles that are encountered in setting up monitoring projects, frequent performance issues that the monitoring uncovers, and lessons learned that can be applied to future projects.

  17. Surprising hair analysis results following acute carbofuran intoxication.

    Science.gov (United States)

    Dulaurent, S; Gaulier, J M; Zouaoui, K; Moesch, C; François, B; Lachâtre, G

    2011-10-10

    We present two non-fatal cases of intoxication with carbofuran (CBF) documented by hair analysis. Carbofuran and 3-hydroxycarbofuran (OHCBF, its main metabolite) hair concentrations were determined using a liquid chromatography-tandem mass spectrometry method. The results were surprising in light of several previously published hair analyses based on the theory that a xenobiotic is present only in the segment corresponding to the time of its intake. In the two intoxication cases, we noticed the presence of CBF and OHCBF in hair segments corresponding to 45 days before, and more than 100 days after, the day of intoxication. Additionally, repeated hair sampling and subsequent analysis revealed a decrease in the carbofuran concentration over the life of the hair.

  18. Physics Nobel prize 2004: Surprising theory wins physics Nobel

    CERN Multimedia

    2004-01-01

    From left to right: David Politzer, David Gross and Frank Wilczek. For their understanding of counter-intuitive aspects of the strong force, which governs quarks inside protons and neutrons, three American physicists were awarded the 2004 Nobel Prize in Physics on 5 October. David J. Gross (Kavli Institute of Theoretical Physics, University of California, Santa Barbara), H. David Politzer (California Institute of Technology), and Frank Wilczek (Massachusetts Institute of Technology) made a key theoretical discovery with a surprising result: the closer quarks are together, the weaker the force - the opposite of what is seen with electromagnetism and gravity. Instead, the strong force is analogous to a stretching rubber band, where the force increases as the quarks get farther apart. These physicists discovered this property of quarks, known as asymptotic freedom, in 1973. It later became a key part of the theory of quantum chromodynamics (QCD) and the Standard Model, the current best theory to describe the interac...

  19. Probability and Surprisal in Auditory Comprehension of Morphologically Complex Words

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Baayen, R. Harald

    2012-01-01

    Two auditory lexical decision experiments document, for morphologically complex words, two points at which the probability of a target word given the evidence shifts dramatically. The first point is reached when morphologically unrelated competitors are no longer compatible with the evidence. … Adapting terminology from Marslen-Wilson (1984), we refer to this as the word’s initial uniqueness point (UP1). The second point is the complex uniqueness point (CUP) introduced by Balling and Baayen (2008), at which morphologically related competitors become incompatible with the input. Later initial … in the course of the word co-determines response latencies. The presence of effects of surprisal, both at the initial uniqueness point of complex words and cumulatively throughout the word, challenges the Shortlist B model of Norris and McQueen (2008), and suggests that a Bayesian approach to auditory …

  20. 2014 Presidential elections in Romania – surprising result or strategy

    Directory of Open Access Journals (Sweden)

    Dan Mihalache

    2015-03-01

    The presidential elections in Romania, which took place in November 2014, were won by Klaus Iohannis, who clearly defeated the incumbent prime minister Victor Ponta by 10%. The result was considered by many a surprise, as none of the opinion polls had been able to predict it. This article reveals part of the strategy of Klaus Iohannis’s campaign and offers a few clues about how this result was possible, without aiming to explain it fully. As the authors were responsible for strategy and political message in the electoral campaign for Klaus Iohannis, the scientific approach is combined with an inside view, to give the reader a better understanding of the events of November 2014.

  1. Exploring the concept of climate surprises. A review of the literature on the concept of surprise and how it is related to climate change

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, M.H.; Moore, C.M. [National Center for Atmospheric Research, Boulder, CO (United States); Streets, D.G.; Bhatti, N.; Rosa, C.H. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.; Stewart, T.R. [State Univ. of New York, Albany, NY (United States)

    1998-01-01

    This report examines the concept of climate surprise and its implications for environmental policymaking. Although most integrated assessment models of climate change deal with average values of change, it is usually the extreme events or surprises that cause the most damage to human health and property. Current models do not help the policymaker decide how to deal with climate surprises. This report examines the literature of surprise in many aspects of human society: psychology, military, health care, humor, agriculture, etc. It draws together various ways to consider the concept of surprise and examines different taxonomies of surprise that have been proposed. In many ways, surprise is revealed to be a subjective concept, triggered by such factors as prior experience, belief system, and level of education. How policymakers have reacted to specific instances of climate change or climate surprise in the past is considered, particularly with regard to the choices they made between proactive and reactive measures. Finally, the report discusses techniques used in the current generation of assessment models and makes suggestions as to how climate surprises might be included in future models. The report concludes that some kinds of surprises are simply unpredictable, but there are several types that could in some way be anticipated and assessed, and their negative effects forestalled.

  2. A Quantitative Review and Meta-models of the Variability and Factors Affecting Oral Drug Absorption-Part II: Gastrointestinal Transit Time.

    Science.gov (United States)

    Abuhelwa, Ahmad Y; Foster, David J R; Upton, Richard N

    2016-09-01

    This study aimed to conduct a quantitative meta-analysis for the values of, and variability in, gastrointestinal (GI) transit times of non-disintegrating single-unit ("tablet") and multiple-unit ("pellets/multi-unit tablet") solid dosage forms, characterize the effect of food on the values and variability in these parameters and present quantitative meta-models of the distributions of GI transit times in the respective GI regions to help inform models of oral drug absorption. The literature was systematically reviewed for the values of, and the variability in, gastric, small intestinal and colonic transit times under fed and fasted conditions. Meta-analysis used the "metafor" package of the R language. Meta-models of GI transit were assumed to be log-normally distributed between the studied populations. Twenty-nine studies including 125 reported means and standard deviations were used in the meta-analysis. Caloric content of administered food increased variability and delayed the gastric transit of both pellets and tablets. Conversely, food caloric content reduced the variability but had no significant influence on the mean small intestinal transit time (SITT). Food had no significant effect on the transit time through the colon. The transit of pellets through the colon was significantly slower than that of single-unit tablets, which is most likely related to their smaller size. GI transit times may influence the dissolution and absorption of oral drugs. The meta-models of GI transit times may be used as part of semi-physiological absorption models to characterize the influence of transit time on the dissolution, absorption and in vivo pharmacokinetic profiles of oral drugs.
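The pooling step behind such a meta-analysis can be sketched with the standard DerSimonian-Laird random-effects estimator. This is a hedged illustration: the study itself used R's metafor package, and the transit-time numbers below are purely hypothetical.

```python
import numpy as np

def dersimonian_laird(means, sds, ns):
    """Random-effects pooled estimate of a mean across studies
    (DerSimonian-Laird), analogous to what metafor's rma() computes."""
    means, sds, ns = map(np.asarray, (means, sds, ns))
    v = sds**2 / ns                      # within-study variance of each mean
    w = 1.0 / v                          # fixed-effect (inverse-variance) weights
    mu_fe = np.sum(w * means) / np.sum(w)
    q = np.sum(w * (means - mu_fe)**2)   # Cochran's Q heterogeneity statistic
    df = len(means) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)        # between-study variance estimate
    w_re = 1.0 / (v + tau2)              # random-effects weights
    mu_re = np.sum(w_re * means) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se_re, tau2

# Hypothetical small-intestinal transit times (hours) from three studies:
# per-study means, standard deviations, and sample sizes.
mu, se, tau2 = dersimonian_laird([3.2, 3.8, 3.5], [0.9, 1.1, 0.7], [12, 20, 15])
```

The pooled mean necessarily lies between the smallest and largest study means, and `tau2` quantifies how much the studies disagree beyond sampling error.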

  3. Mono and multi-objective optimization techniques applied to a large range of industrial test cases using Metamodel assisted Evolutionary Algorithms

    Science.gov (United States)

    Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre

    2010-06-01

    The use of material processing numerical simulation allows a strategy of trial and error to improve virtual processes without incurring material costs or interrupting production, and therefore saves a great deal of money; but it requires user time to analyze the results, adjust the operating conditions and restart the simulation. Automatic optimization is the perfect complement to simulation. An Evolutionary Algorithm coupled with metamodelling makes it possible to obtain industrially relevant results on a very large range of applications within a few tens of simulations, and without any specific knowledge of automatic optimization techniques. Ten industrial partners were selected to cover the different areas of the mechanical forging industry and provide different examples for the forming simulation tools. The large computational cost is handled by a metamodel approach, which interpolates the objective function over the entire parameter space while only knowing the exact function values at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel, and a genetic algorithm combined with a Meshless Finite Difference Method. The latter approach is extended to multi-objective optimization, where the set of solutions corresponding to the best possible compromises between the different objectives is computed in the same way. The population-based approach exploits the parallel capabilities of the computer with high efficiency. An optimization module, fully embedded within the Forge2009 GUI, makes it possible to cover all the defined examples, and the use of new multi-core hardware to run several simulations at the same time reduces the required time dramatically. The presented examples
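The "master points" idea can be illustrated with a minimal surrogate-prescreening loop: a cheap interpolant is fit to the points already evaluated exactly, a large candidate population is screened on the surrogate, and only the most promising candidate is sent to the expensive simulator. This is a sketch under loose assumptions, not the paper's algorithms (an evolution strategy with Kriging, and a GA with a Meshless Finite Difference Method): it substitutes a generic RBF interpolant and a toy objective for the forging simulation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

def expensive_objective(x):
    """Stand-in for a costly forging simulation (hypothetical)."""
    return np.sum((x - 0.3)**2, axis=-1)

# Initial "master points", evaluated with the exact (expensive) function.
dim, n_init = 2, 8
X = rng.uniform(-1, 1, (n_init, dim))
y = expensive_objective(X)

for generation in range(10):
    surrogate = RBFInterpolator(X, y)            # metamodel over known points
    candidates = rng.uniform(-1, 1, (200, dim))  # cheap offspring population
    # Pre-screen with the surrogate; only the most promising candidate
    # is sent to the expensive simulator, then added to the master points.
    best = candidates[np.argmin(surrogate(candidates))]
    X = np.vstack([X, best])
    y = np.append(y, expensive_objective(best[None, :]))

best_design = X[np.argmin(y)]  # best design found with only 18 exact evaluations
```

The surrogate absorbs almost all of the candidate evaluations, which is what makes "a few tens of simulations" sufficient in practice.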

  4. Pharmacokinetic models of morphine and its metabolites in neonates: Systematic comparisons of models from the literature, and development of a new meta-model.

    Science.gov (United States)

    Knøsgaard, Katrine Rørbæk; Foster, David John Richard; Kreilgaard, Mads; Sverrisdóttir, Eva; Upton, Richard Neil; van den Anker, Johannes N

    2016-09-20

    Morphine is commonly used for pain management in preterm neonates. The aims of this study were to compare published models of the neonatal pharmacokinetics of morphine and its metabolites with a new dataset, and to combine the characteristics of the best predictive models to design a meta-model for morphine and its metabolites in preterm neonates. The concentration-analgesia relationship for morphine in this clinical setting was also investigated. A population of 30 preterm neonates (gestational age: 23-32 weeks) received a loading dose of morphine (50-100 μg/kg), followed by a continuous infusion (5-10 μg/kg/h) until analgesia was no longer required. Pain was assessed using the Premature Infant Pain Profile. Five published population models were compared using numerical and graphical tests of goodness-of-fit and predictive performance. Population modelling was conducted using NONMEM® and the $PRIOR subroutine to describe the time-course of plasma concentrations of morphine, morphine-3-glucuronide, and morphine-6-glucuronide, and the concentration-analgesia relationship for morphine. No published model adequately described morphine concentrations in this new dataset. Previously published population pharmacokinetic models of morphine, morphine-3-glucuronide, and morphine-6-glucuronide were therefore combined into a meta-model. The meta-model provided an adequate description of the time-course of morphine and the concentrations of its metabolites in preterm neonates. Allometric weight scaling was applied to all clearance and volume terms. Maturation of morphine clearance was described as a function of postmenstrual age, while maturation of metabolite elimination was described as a function of postnatal age. A clear relationship between morphine concentrations and pain score was not established.

  5. Hierarchical Cluster-based Partial Least Squares Regression (HC-PLSR) is an efficient tool for metamodelling of nonlinear dynamic models

    Directory of Open Access Journals (Sweden)

    Omholt Stig W

    2011-06-01

    Background: Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of the mouse ventricular myocyte function. Results: Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback.
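The cluster-then-regress-locally idea can be sketched in a few lines of numpy. This is a simplified stand-in for HC-PLSR: crisp k-means replaces fuzzy C-means, and ordinary least squares replaces PLS, but the core mechanism (separate local linear models per region of the response surface) is the same.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=50):
    """Plain k-means (a crisp stand-in for the fuzzy C-means step)."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers)**2).sum(-1), axis=1)
        # Keep the old center if a cluster happens to be empty.
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    labels = np.argmin(((X[:, None] - centers)**2).sum(-1), axis=1)
    return labels, centers

# A non-monotone response that defeats a single global linear model.
X = rng.uniform(-2, 2, (400, 1))
y = np.sin(2 * X[:, 0]) + 0.05 * rng.standard_normal(400)

labels, centers = kmeans(X, k=4)
local_fits = {}
for j in range(4):
    A = np.c_[X[labels == j], np.ones((labels == j).sum())]  # local design matrix
    local_fits[j], *_ = np.linalg.lstsq(A, y[labels == j], rcond=None)

def predict(x):
    j = np.argmin(((x - centers)**2).sum(-1))  # route input to nearest cluster
    return local_fits[j] @ np.r_[x, 1.0]
```

Because each local model only has to fit a roughly monotone piece of the response surface, the combined predictor tracks the sine wave that a single global linear fit cannot.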

  6. Sensitivity analysis and metamodeling of a toolchain of models to help sizing vegetative filter strips in a watershed.

    Science.gov (United States)

    Lauvernet, Claire; Noll, Dorothea; Muñoz-Carpena, Rafael; Carluer, Nadia

    2014-05-01

    agricultural field and the VFS characteristics. These scenarios are based on: 2 types of climates (North and South-west of France), different rainfall intensities and durations, different lengths and slopes of hillslope, different humidity conditions, 4 soil types (silt loam, sandy loam, clay loam, sandy clay loam), 2 crops (wheat and corn) for the contributive area, 2 water table depths (1 m and 2.5 m) and 4 soil types for the VFS. The sizing method was applied to all these scenarios, and a sensitivity analysis of the optimal VFS length was performed for all the input parameters in order to understand their influence and to identify those requiring special care. Based on that sensitivity analysis, a metamodel was developed. The idea is to simplify the whole toolchain and make it possible to perform the buffer sizing with a single tool and a smaller set of parameters, given the information available to end users. We first compared several mathematical methods for computing the metamodel, and then validated them on an agricultural watershed with real data in the North-West of France.

  7. Statistical Metamodeling and Sequential Design of Computer Experiments to Model Glyco-Altered Gating of Sodium Channels in Cardiac Myocytes.

    Science.gov (United States)

    Du, Dongping; Yang, Hui; Ednie, Andrew R; Bennett, Eric S

    2016-09-01

    Glycan structures account for up to 35% of the mass of cardiac sodium (Nav) channels. To ask whether and how reduced sialylation affects Nav activity and cardiac electrical signaling, we conducted a series of in vitro experiments on ventricular apex myocytes under two different glycosylation conditions: reduced protein sialylation (ST3Gal4(-/-)) and full glycosylation (control). Although aberrant electrical signaling is observed under reduced sialylation, a good understanding of the mechanistic details of pathological variations in INa and AP is difficult to achieve without in silico studies. However, computer models of Nav channels and cardiac myocytes involve great complexity, e.g., a high-dimensional parameter space and nonlinear, nonconvex equations. Traditional linear and nonlinear optimization methods have encountered many difficulties in model calibration. This paper presents a new statistical metamodeling approach for efficient computer experiments and optimization of Nav models. First, we utilize a fractional factorial design to identify control variables from the large set of model parameters, thereby reducing the dimensionality of the parametric space. Further, we develop a Gaussian process model as a surrogate of the expensive and time-consuming computer model and then identify the next best design point as the one that yields the maximal probability of improvement. This process iterates until convergence, and the performance is evaluated and validated with real-world experimental data. Experimental results show the proposed algorithm achieves superior performance in modeling the kinetics of Nav channels under a variety of glycosylation conditions. As a result, in silico models provide a better understanding of glyco-altered mechanistic details in state transitions and distributions of Nav channels. Notably, ST3Gal4(-/-) myocytes are shown to have higher probabilities accumulated in intermediate inactivation during the repolarization and yield a
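The surrogate-plus-probability-of-improvement loop can be sketched with a small zero-mean Gaussian process. Everything here is a hedged toy: the objective is a stand-in for the expensive Nav-channel model, and the RBF kernel with an arbitrary length-scale replaces whatever covariance the authors actually fitted.

```python
import numpy as np
from math import erf, sqrt

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell)**2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-6):
    """Posterior mean/std of a zero-mean GP with an RBF kernel."""
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xtr, Xte)
    Kinv = np.linalg.inv(K)
    mean = Ks.T @ Kinv @ ytr
    var = 1.0 - np.einsum('ij,ji->i', Ks.T, Kinv @ Ks)
    return mean, np.sqrt(np.maximum(var, 1e-12))

def prob_improvement(mean, std, best):
    """P(f(x) < current best) for a minimisation problem."""
    z = (best - mean) / std
    return 0.5 * (1.0 + np.array([erf(v / sqrt(2.0)) for v in z]))

# Toy calibration target (hypothetical stand-in for the Nav-channel model).
f = lambda x: np.sin(3 * x) + x**2
Xtr = np.array([-1.0, -0.2, 0.4, 1.0])   # design points evaluated so far
ytr = f(Xtr)
Xte = np.linspace(-1, 1, 101)            # candidate next design points
mean, std = gp_posterior(Xtr, ytr, Xte)
x_next = Xte[np.argmax(prob_improvement(mean, std, ytr.min()))]
```

In the sequential-design loop described in the abstract, `x_next` would be evaluated with the expensive simulator, appended to the training set, and the process repeated until convergence.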

  8. The surprising diversity of clostridial hydrogenases: a comparative genomic perspective.

    Science.gov (United States)

    Calusinska, Magdalena; Happe, Thomas; Joris, Bernard; Wilmotte, Annick

    2010-06-01

    Among the large variety of micro-organisms capable of fermentative hydrogen production, strict anaerobes such as members of the genus Clostridium are the most widely studied. They can produce hydrogen by a reversible reduction of protons accumulated during fermentation to dihydrogen, a reaction which is catalysed by hydrogenases. Sequenced genomes provide completely new insights into the diversity of clostridial hydrogenases. Building on previous reports, we found that [FeFe] hydrogenases are not a homogeneous group of enzymes, but exist in multiple forms with different modular structures and are especially abundant in members of the genus Clostridium. This unusual diversity seems to support the central role of hydrogenases in cell metabolism. In particular, the presence of multiple putative operons encoding multisubunit [FeFe] hydrogenases highlights the fact that hydrogen metabolism is very complex in this genus. In contrast with [FeFe] hydrogenases, their [NiFe] hydrogenase counterparts, widely represented in other bacteria and archaea, are found in only a few clostridial species. Surprisingly, a heteromultimeric Ech hydrogenase, known to be an energy-converting [NiFe] hydrogenase and previously described only in methanogenic archaea and some sulfur-reducing bacteria, was found to be encoded by the genomes of four cellulolytic strains: Clostridium cellulolyticum, Clostridium papyrosolvens, Clostridium thermocellum and Clostridium phytofermentans.

  9. Atom Surprise: Using Theatre in Primary Science Education

    Science.gov (United States)

    Peleg, Ran; Baram-Tsabari, Ayelet

    2011-10-01

    Early exposure to science may have a lifelong effect on children's attitudes towards science and their motivation to learn science in later life. Out-of-class environments can play a significant role in creating favourable attitudes, while contributing to conceptual learning. Educational science theatre is one form of an out-of-class environment, which has received little research attention. This study aims to describe affective and cognitive learning outcomes of watching such a play and to point to connections between theatrical elements and specific outcomes. "Atom Surprise" is a play portraying several concepts on the topic of matter. A mixed methods approach was adopted to investigate the knowledge and attitudes of children (grades 1-6) from two different school settings who watched the play. Data were gathered using questionnaires and in-depth interviews. Analysis suggested that in both schools children's knowledge on the topic of matter increased after the play with younger children gaining more conceptual knowledge than their older peers. In the public school girls showed greater gains in conceptual knowledge than boys. No significant changes in students' general attitudes towards science were found, however, students demonstrated positive changes towards science learning. Theatrical elements that seemed to be important in children's recollection of the play were the narrative, props and stage effects, and characters. In the children's memory, science was intertwined with the theatrical elements. Nonetheless, children could distinguish well between scientific facts and the fictive narrative.

  10. Novelty biases attention and gaze in a surprise trial.

    Science.gov (United States)

    Horstmann, Gernot; Herwig, Arvid

    2016-01-01

    While the classical distinction between task-driven and stimulus-driven biasing of attention appears to be a dichotomy at first sight, there seems to be a third category that depends on the contrast or discrepancy between active representations and the upcoming stimulus, and may be termed novelty, surprise, or prediction failure. For previous demonstrations of the discrepancy-attention link, stimulus-driven components (saliency) may have played a decisive role. The present study was conducted to evaluate the discrepancy-attention link in a display where novel and familiar stimuli are equated for saliency. Eye tracking was used to determine fixations on novel and familiar stimuli as a proxy for attention. Results show a prioritization of attention by the novel color, and a de-prioritization of the familiar color, which is clearly present at the second fixation, and spans over the next couple of fixations. Saliency, on the other hand, did not prioritize items in the display. The results thus reinforce the notion that novelty captures and binds attention.

  11. A Well-Known But Still Surprising Generator

    Science.gov (United States)

    Haugland, Ole Anton

    2014-12-01

    The bicycle generator is often mentioned as an example of a method to produce electric energy. It is cheap and easily accessible, so it is a natural example to use in teaching. There are different types, but I prefer the old side-wall dynamo. The most common explanation of its working principle seems to be something like the illustration in Fig. 1. The illustration is taken from a popular textbook in the Norwegian junior high school.1 Typically it is explained as a system of a moving magnet or coils that directly results in a varying magnetic field through the coils. According to Faraday's law a voltage is induced in the coils. Simple and easy! A few times I have had a chance to glimpse into a bicycle generator, and I was somewhat surprised to see that the magnet rotated parallel to the turns of the coil. How could the flux through the coil change and induce a voltage when the magnet rotated parallel to the turns of the coil? When teaching electromagnetic induction I have shown students a dismantled generator and asked them how it could work. They naturally found this more difficult to understand than the principle illustrated in Fig. 1. Other authors in this journal have discussed even more challenging questions concerning electric generators.2,3

  12. Surprise disrupts cognition via a fronto-basal ganglia suppressive mechanism.

    Science.gov (United States)

    Wessel, Jan R; Jenkinson, Ned; Brittain, John-Stuart; Voets, Sarah H E M; Aziz, Tipu Z; Aron, Adam R

    2016-04-18

    Surprising events markedly affect behaviour and cognition, yet the underlying mechanism is unclear. Surprise recruits a brain mechanism that globally suppresses motor activity, ostensibly via the subthalamic nucleus (STN) of the basal ganglia. Here, we tested whether this suppressive mechanism extends beyond skeletomotor suppression and also affects cognition (here, verbal working memory, WM). We recorded scalp-EEG (electrophysiology) in healthy participants and STN local field potentials in Parkinson's patients during a task in which surprise disrupted WM. In scalp-EEG, surprising events engaged the same independent neural signal component that indexes action-stopping in a stop-signal task. Importantly, the degree of this recruitment mediated surprise-related WM decrements. Intracranially, STN activity was also increased after surprise, especially when WM was interrupted. These results suggest that surprise interrupts cognition via the same fronto-basal ganglia mechanism that interrupts action. This motivates a new neural theory of how cognition is interrupted and how distraction arises after surprising events.

  13. Properties and Surprises of Solar Activity XXIII Cycle

    Science.gov (United States)

    Ishkov, V. N.

    2010-12-01

    The main properties of the 23rd cycle match almost completely those of average-magnitude solar cycles, and some features of the cycle may indicate a change in the generation mode of magnetic fields in the solar convection zone. If this is the case, the Sun is entering a period of intermediate and weak cycles of solar activity (SA) in terms of the Wolf number, which may last for 3 to 6 solar cycles. The main development stages of solar cycle 23 are the following: minimum of solar cycle 22: April 1996 (W* = 8.0); maximum of the smoothed relative sunspot number: April 2000; global polarity reversal of the general solar magnetic field: July to December 2000; secondary maximum of the relative sunspot number: November 2001; maximum of the 10.7-cm radio flux: February 2002; phase of the cycle maximum: October 1999 to June 2002; beginning of the decrease phase: July 2002; point of minimum of the current SA cycle: December 2008. Solar cycle 23 produced two powerful flare-active sunspot groups, in September 2005 and December 2006 (+5.5 and +6.6 years from the maximum), which by flare potential occupy 4th and 20th place among the most flare-active regions of the last four solar cycles. The unprecedented duration of the decline in relative sunspot numbers, which made the last solar cycle the longest among reliably recorded cycles (since 1849), was another surprise in the development of solar activity. The phase of the minimum began in May 2005 and lasted for 4.5 years. Thus, the new solar cycle 24 began in January 2009.

  14. Dracunculiasis eradication - Finishing the job before surprises arise

    Institute of Scientific and Technical Information of China (English)

    Benjamin Jelle Visser

    2012-01-01

    Dracunculiasis (Guinea worm disease) is a preventable waterborne parasitic disease that affects the poorest people living in remote rural areas of sub-Saharan African countries who do not have access to safe drinking water. The Guinea Worm Eradication Program, a 25-year-old campaign to rid the world of Guinea worm disease, has now reached its final stage, accelerating to zero cases in all endemic countries. During the 19th and 20th centuries, dracunculiasis was common in much of Southern Asia and the African continent. The overall number of cases has been reduced tremendously, by ≥99%: from the 3.32 million cases estimated to have occurred in 1986 in Africa to only 1797 cases reported in 2010, in only five countries (Sudan, Mali, Ethiopia, Chad and Ghana), with Asia free of the disease. This achievement is unique in its kind - the only previously eradicated disease is smallpox, a viral infection for which vaccination was possible - and it has been achieved through primary community-based prevention and health education programs. Most efforts need to be made in two countries, South Sudan (comprising 94%, or 1698 of the 1797 cases reported worldwide in 2010) and Mali, because of frequent movements of nomads in a vast area inside and outside Mali's borders. All factors favourable to dracunculiasis eradication are in place, including adequate financial resources, community and political support, and high levels of advocacy. Thus there is no reason this disabling parasitic disease cannot be eradicated soon, before surprises arise such as new civil conflicts in currently endemic countries.

  15. Metamodel-based design optimization of injection molding process variables and gates of an automotive glove box for enhancing its quality

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Gyung Ju [Pusan National University, Busan (Korea, Republic of); Park, Chang Hyun; Choi, Dong Hoon [Hanyang University, Seoul (Korea, Republic of)

    2016-04-15

    Injection molding process variables and gates of an automotive glove box were optimally determined to enhance its injection molding quality. We minimized warpage with satisfying constraints on clamp force, weldline, and profiles of filling and packing. Design variables concerning the injection molding process are temperatures of the mold and the resin, ram speeds, and packing pressures and durations; design variables concerning the gates are the shape of the center gate and locations of two side gates. To optimally determine the design variables in an efficient way, we adopted metamodel-based design optimization, sequentially using an optimal Latin hypercube design as a design of experiment, Kriging models as metamodels that replace time-consuming injection molding simulations, and a micro genetic algorithm as an optimization algorithm. In the optimization process, a commercial injection molding analysis software, MoldflowTM, was employed to evaluate the injection molding quality at design points specified. Using the proposed design approach, the warpage was found reduced by 20.5% compared to the initial warpage, while all the design constraints were satisfied, which clearly shows the validity of the proposed design approach.
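The design-of-experiment step described above can be illustrated with a plain (non-optimized) Latin hypercube sampler, which guarantees one sample per stratum along every axis. This is a sketch only: the paper used an optimal Latin hypercube design and the commercial Moldflow simulator, and the process-variable bounds below are hypothetical.

```python
import numpy as np

def latin_hypercube(n, dim, rng=np.random.default_rng(2)):
    """n samples in [0,1]^dim with exactly one sample per stratum per axis."""
    strata = np.tile(np.arange(n), (dim, 1))          # stratum indices per axis
    u = (rng.permuted(strata, axis=1).T               # shuffle strata per axis
         + rng.uniform(size=(n, dim))) / n            # jitter within each stratum
    return u

# Map unit-cube samples to hypothetical process-variable bounds.
bounds = np.array([[40.0, 80.0],     # mold temperature [deg C]
                   [200.0, 280.0],   # resin temperature [deg C]
                   [10.0, 100.0]])   # packing pressure [MPa]
u = latin_hypercube(20, 3)
design = bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])
```

Each of the 20 rows of `design` would then be handed to the injection molding simulator, and the resulting warpage values used to train the Kriging metamodels.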

  16. Emotional Intelligence and Successful Leadership.

    Science.gov (United States)

    Maulding, Wanda S.

    Cognitive intelligence is often equated with eventual success in many areas. However, there are many instances where people of high IQ flounder whereas those of modest IQ do surprisingly well. Author and renowned psychologist Daniel Goleman believes that the explanation for this fact lies in abilities called "emotional intelligence,"…

  17. Emotion regulation and successful aging.

    Science.gov (United States)

    Suri, Gaurav; Gross, James J

    2012-08-01

    Despite normative declines in old age, healthy elderly typically report surprisingly high levels of well-being. It is not clear why this is so. A study by Brassen and colleagues suggests that one factor may be reduced responsiveness to regret. These findings highlight the role of emotion regulation in successful aging.

  18. Reflection, A Meta-Model for Learning, and a Proposal To Improve the Quality of University Teaching = Reflexion, el meta-modelo del aprendizaje, y la propuesta del mejoramiento de la calidad de la docencia.

    Science.gov (United States)

    Montgomery, Joel R.

    This paper, in both English and Spanish, offers a meta-model of the learning process which focuses on the importance of the reflective learning process in enhancing the quality of learning in higher education. This form of innovative learning is offered as a means of helping learners to realize the relevance of what they are learning to their life…


  20. Surprising Sensitivities in Simulations of Radiative Convective Equilibrium

    Science.gov (United States)

    Drotos, Gabor; Becker, Tobias; Mauritsen, Thorsten; Stevens, Bjorn

    2017-04-01

    The climate and climate sensitivity of a global model run in radiative-convective equilibrium are explored. Results from simulations with ECHAM6.3 coupled to a slab ocean and run in a wide range of configurations are presented. Simulations both with and without a parameterised representation of deep convection are conducted for CO2 concentrations ranging from one eighth of present-day values to thirty-two times the present day, and for variations in the solar constant of more than a factor of two. Very long simulations, in some cases more than a thousand years, are performed to adequately sample the attractor of the different climate states of the model and provide robust estimates of the system's climate sensitivity parameter. For the standard configuration of the model, the climate sensitivity progressively decreases from very large values (6-7 K) for the coldest climates to well below 1 K for the warmest climates. For very high CO2 levels (16 and 32 times the present value), fluctuations of globally averaged temperature as large as 10 K arise on decadal time-scales. These fluctuations manifest as quasi-periodic coolings, driven by large and persistent global-scale decks of stratiform low clouds, so that for a period of several years global temperatures drop to levels below the lowest temperatures of the climate with present-day values of CO2. The same configuration of the model has more modest sensitivities when the insolation is reduced, but runaway warming results for small (10%) increases. Simulations without parameterised convection have colder (by roughly 10 K) climates and smaller (1 K) sensitivities, allowing a stable climate with earth-like temperatures even for insolation much (50%) larger than the present day. Such values of insolation are possible because over a large range of the insolation the climate sensitivity parameter is very near zero. The surprising sensitivities of the system, and the limit-cycle-like behaviour of the very CO2-rich climates, can be traced to

  1. Stars Form Surprisingly Close to Milky Way's Black Hole

    Science.gov (United States)

    2005-10-01

    The supermassive black hole at the center of the Milky Way has surprisingly helped spawn a new generation of stars, according to observations from NASA's Chandra X-ray Observatory. This novel mode of star formation may solve several mysteries about the supermassive black holes that reside at the centers of nearly all galaxies. "Massive black holes are usually known for violence and destruction," said Sergei Nayakshin of the University of Leicester, United Kingdom, and coauthor of a paper on this research in an upcoming issue of the Monthly Notices of the Royal Astronomical Society. "So it's remarkable that this black hole helped create new stars, not just destroy them." Black holes have earned their fearsome reputation because any material -- including stars -- that falls within the so-called event horizon is never seen again. However, these new results indicate that the immense disks of gas known to orbit many black holes at a "safe" distance from the event horizon can help nurture the formation of new stars. This conclusion came from new clues that could only be revealed in X-rays. Until the latest Chandra results, astronomers have disagreed about the origin of a mysterious group of massive stars discovered by infrared astronomers to be orbiting less than a light year from the Milky Way's central black hole, a.k.a. Sagittarius A*, or Sgr A*. At such close distances to Sgr A*, the standard model for star formation predicts that gas clouds from which stars form should have been ripped apart by tidal forces from the black hole. Two models to explain this puzzle have been proposed. In the disk model, the gravity of a dense disk of gas around Sgr A* offsets the tidal forces and allows stars to form; in the migration model, the stars formed in a star cluster far away from the black hole and migrated in to form the ring of massive stars. The migration scenario predicts about a

  2. Carbon Dioxide: Surprising Effects on Decision Making and Neurocognitive Performance

    Science.gov (United States)

    James, John T.

    2013-01-01

    The occupants of modern submarines and the International Space Station (ISS) have much in common as far as their air quality is concerned. Air is polluted by materials offgassing, use of utility compounds, leaks of systems chemicals, and anthropogenic sources. The primary anthropogenic compound of concern to submariners and astronauts has been carbon dioxide (CO2). NASA and the US Navy rely on the National Research Council Committee on Toxicology (NRC-COT) to help formulate exposure levels to CO2 that are thought to be safe for exposures of 3-6 months. NASA calls its limits Spacecraft Maximum Allowable Concentrations (SMACs). Years of experience aboard the ISS and a recent publication on deficits in decision making in ground-based subjects exposed briefly to 0.25% CO2 suggest that exposure levels that have been presumed acceptable to preserve health and performance need to be reevaluated. The current CO2 exposure limits for 3-6 months set by NASA and the UK Navy are 0.7%, and the limit for US submariners is 0.5%, although the NRC-COT recommended a 90-day level of 0.8% as safe a few years ago. NASA has set a 1000-day SMAC at 0.5% for exploration-class missions. Anecdotal experience with ISS operations approaching the current 180-day SMAC of 0.7% suggests that this limit is too high. Temporarily, NASA has limited exposures to 0.5% until further peer-reviewed data become available. In the meantime, a study published last year in the journal Environmental Health Perspectives (Satish U, et al. 2012) demonstrated that complex decision-making performance is somewhat affected at 0.1% CO2 and becomes "dysfunctional" for at least half of the 9 indices of performance at concentrations approaching 0.25% CO2. The investigators used the Strategic Management Simulation (SMS) method of testing for decision-making ability, and the results were so surprising to the investigators that they declared that their findings need to be independently confirmed. NASA has responded to the

  3. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable Seismic History is yet a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date clearly demonstrates that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can LEAD TO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which are unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION. The self-evident shortcomings and failures of GSHAP call on all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a

  4. Surprisingly high substrate specificities observed in complex biofilms

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Kindaichi, Tomonori; Kragelund, Caroline;

    The behavior of microorganisms in natural ecosystems (e.g. biofilms) differs significantly from laboratory studies. In nature, microorganisms experience alternating periods of surplus nutrients, nutrient limitation, and starvation. Literature data suggest that to survive and compete successfully…, microorganisms can regulate their metabolism, expressing a wide range of uptake and catabolic systems. However, ecophysiological studies of natural biofilms indicate that bacteria are very specialized in their choice of substrate, so even minor changes in substrate composition can affect the community composition… by selection for different specialized species. We hypothesized that bacteria growing in a natural environment express strongly conserved substrate specificity which is independent of short-term (few hours) variations in growth conditions. In this study, biofilm from Aalborg wastewater treatment plant was used…

  5. Selective nitration and bromination of surprisingly ruffled phosphorus corroles.

    Science.gov (United States)

    Pomarico, Giuseppe; Tortora, Luca; Fronczek, Frank R; Smith, Kevin M; Paolesse, Roberto

    2016-05-01

    Phosphorus complexes of corrole have recently attracted increasing interest since these compounds can be easily prepared in good yields, are stable, and show unusual optical properties. For these reasons, phosphorus corroles represent an interesting class of compounds to be exploited in the field of materials science or for biomedical investigations, and the definition of synthetic pathways for their functionalization is an important step in optimizing their properties for various applications. We report here the reactivity of the phosphorus complex of 5,10,15-tritolylcorrole in nitration and bromination reactions. Both attempts were successful, allowing the preparation of substituted phosphorus corroles, which can be used as intermediates for more complex architectures endowed with useful properties. Furthermore, the crystallographic characterization of both complexes shows that they have an unusual ruffled geometry of the corrole core, a conformation that had not been considered possible for such a macrocycle.

  6. Data for developing metamodels to assess the fate, transport, and bioaccumulation of organic chemicals in rivers. Chemicals have log Kow ranging from 3 to 14, and rivers have mean annual discharges ranging from 1.09 to 3240 m3/s.

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset was developed to demonstrate how metamodels of high resolution, process-based models that simulate the fate, transport, and bioaccumulation of organic...

  7. Supermagnetic Neutron Star Surprises Scientists, Forces Revision of Theories

    Science.gov (United States)

    2006-08-01

    magnetars because their magnetic fields are 100-1,000 times stronger than those of typical pulsars. It is the decay of those incredibly strong fields that powers their strange X-ray emission. "The magnetic field from a magnetar would make an aircraft carrier spin around and point north quicker than a compass needle moves on Earth," said David Helfand, of Columbia University. A magnetar's field is 1,000 trillion times stronger than Earth's, Helfand pointed out. The new object -- named XTE J1810-197 -- was first discovered by NASA's Rossi X-ray Timing Explorer when it emitted a strong burst of X-rays in 2003. While the X-rays were fading in 2004, Jules Halpern of Columbia University and collaborators identified the magnetar as a radio-wave emitter using the National Science Foundation's (NSF) Very Large Array (VLA) radio telescope in New Mexico. Any radio emission is highly unusual for a magnetar. Because magnetars had not been seen to regularly emit radio waves, the scientists presumed that the radio emission was caused by a cloud of particles thrown off the neutron star at the time of its X-ray outburst, an idea they soon would realize was wrong. With knowledge that the magnetar emitted some form of radio waves, Camilo and his colleagues observed it with the Parkes radio telescope in Australia in March and immediately detected astonishingly strong radio pulsations every 5.5 seconds, corresponding to the previously-determined rotation rate of the neutron star. As they continued to observe XTE J1810-197, the scientists got more surprises. Whereas most pulsars become weaker at higher radio frequencies, XTE J1810-197 does not, remaining a strong emitter at frequencies up to 140 GHz, the highest frequency ever detected from a radio pulsar. In addition, unlike normal pulsars, the object's radio emission fluctuates in strength from day to day, and the shape of the pulsations changes as well. 
These variations likely indicate that the magnetic fields around the pulsar are changing

  8. Chandra Finds Surprising Black Hole Activity In Galaxy Cluster

    Science.gov (United States)

    2002-09-01

    Scientists at the Carnegie Observatories in Pasadena, California, have uncovered six times the expected number of active, supermassive black holes in a single viewing of a cluster of galaxies, a finding that has profound implications for theories as to how old galaxies fuel the growth of their central black holes. The finding suggests that voracious, central black holes might be as common in old, red galaxies as they are in younger, blue galaxies, a surprise to many astronomers. The team made this discovery with NASA's Chandra X-ray Observatory. They also used Carnegie's 6.5-meter Walter Baade Telescope at the Las Campanas Observatory in Chile for follow-up optical observations. "This changes our view of galaxy clusters as the retirement homes for old and quiet black holes," said Dr. Paul Martini, lead author on a paper describing the results that appears in the September 10 issue of The Astrophysical Journal Letters. "The question now is, how do these black holes produce bright X-ray sources, similar to what we see from much younger galaxies?" Typical of the black hole phenomenon, the cores of these active galaxies are luminous in X-ray radiation. Yet, they are obscured, and thus essentially undetectable in the radio, infrared and optical wavebands. "X rays can penetrate obscuring gas and dust as easily as they penetrate the soft tissue of the human body to look for broken bones," said co-author Dr. Dan Kelson. "So, with Chandra, we can peer through the dust and we have found that even ancient galaxies with 10-billion-year-old stars can have central black holes still actively pulling in copious amounts of interstellar gas. This activity has simply been hidden from us all this time. This means these galaxies aren't over the hill after all and our theories need to be revised." Scientists say that supermassive black holes -- having the mass of millions to billions of suns squeezed into a region about the size of our Solar System -- are the engines in the cores of

  9. Trait Anxiety Is Associated with Negative Interpretations When Resolving Valence Ambiguity of Surprised Faces.

    Science.gov (United States)

    Park, Gewnhi; Vasey, Michael W; Kim, Grace; Hu, Dixie D; Thayer, Julian F

    2016-01-01

    The current research examines whether trait anxiety is associated with negative interpretation bias when resolving valence ambiguity of surprised faces. To further isolate the neuro-cognitive mechanism, we presented angry, happy, and surprised faces at broad spatial frequency (BSF), high spatial frequency (HSF), and low spatial frequency (LSF) and asked participants to determine the valence of each face. High trait anxiety was associated with more negative interpretations of BSF (i.e., intact) surprised faces. However, the modulation of trait anxiety on the negative interpretation of surprised faces disappeared at HSF and LSF. The current study provides evidence that trait anxiety modulates negative interpretations of BSF surprised faces. However, the negative interpretation of LSF surprised faces appears to be a robust default response that occurs regardless of individual differences in trait anxiety.

  10. Trait anxiety is associated with negative interpretations when resolving valence ambiguity of surprised faces

    Directory of Open Access Journals (Sweden)

    Gewnhi Park

    2016-08-01

    Full Text Available The current research examines whether trait anxiety is associated with negative interpretation bias when resolving valence ambiguity of surprised faces. To further isolate the neuro-cognitive mechanism, we presented angry, happy, and surprised faces at broad, high, and low spatial frequency and asked participants to determine the valence of each face. High trait anxiety was associated with more negative interpretations of broad spatial frequency (i.e., intact) surprised faces. However, the modulation of trait anxiety on the negative interpretation of surprised faces disappeared at high and low spatial frequencies. The current study provides evidence that trait anxiety modulates negative interpretations of broad spatial frequency surprised faces. However, the negative interpretation of low spatial frequency surprised faces appears to be a robust default response that occurs regardless of individual differences in trait anxiety.

  11. Effects of Surprisal and Locality on Danish Sentence Processing: An Eye-Tracking Investigation.

    Science.gov (United States)

    Balling, Laura Winther; Kizach, Johannes

    2017-03-22

    An eye-tracking experiment in Danish investigates two dominant accounts of sentence processing: locality-based theories that predict a processing advantage for sentences where the distance between the major syntactic heads is minimized, and the surprisal theory which predicts that processing time increases with big changes in the relative entropy of possible parses, sometimes leading to anti-locality effects. We consider both lexicalised surprisal, expressed in conditional trigram probabilities, and syntactic surprisal expressed in the manipulation of the expectedness of the second NP in Danish constructions with two postverbal NP-objects. An eye-tracking experiment showed a clear advantage for local syntactic relations, with only a marginal effect of lexicalised surprisal and no effect of syntactic surprisal. We conclude that surprisal has a relatively marginal effect, which may be clearest for verbs in verb-final languages, while locality is a robust predictor of sentence processing.

  12. Trait Anxiety Is Associated with Negative Interpretations When Resolving Valence Ambiguity of Surprised Faces

    OpenAIRE

    Gewnhi Park; Vasey, Michael W.; Grace Kim; Dixie D Hu; Thayer, Julian F

    2016-01-01

    The current research examines whether trait anxiety is associated with negative interpretation bias when resolving valence ambiguity of surprised faces. To further isolate the neuro-cognitive mechanism, we presented angry, happy, and surprised faces at broad, high, and low spatial frequency and asked participants to determine the valence of each face. High trait anxiety was associated with more negative interpretations of broad spatial frequency (i.e., intact) surprised faces. However, the mo...

  13. SeReM2--a meta-model for the structured definition of quality requirements for electronic health record services.

    Science.gov (United States)

    Hoerbst, Alexander; Hackl, Werner; Ammenwerth, Elske

    2010-01-01

    Quality assurance is a major task with regard to Electronic Health Records (EHR). Currently there are only a few approaches explicitly dealing with the quality of EHR services as a whole. The objective of this paper is to introduce a new Meta-Model to structure and describe quality requirements of EHRs. This approach should support the transnational quality certification of EHR services. The Model was developed based on interviews with 24 experts and a systematic literature search and comprises a service and requirements model. The service model represents the structure of a service whereas the requirements model can be used to assign specific predefined aims and requirements to a service. The new model differs from existing approaches as it accounts for modern software architectures and the special attributes of EHRs.
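
    The two-part structure the abstract describes (a service model representing how a service is composed, plus a requirements model that assigns predefined aims and requirements to services) can be sketched as a minimal data structure. The class and field names below are hypothetical illustrations, not the paper's actual SeReM2 schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requirement:
    """A predefined quality aim/requirement that can be assigned to a service."""
    aim: str
    description: str

@dataclass
class Service:
    """A node in the service model; services may be composed of sub-services."""
    name: str
    sub_services: List["Service"] = field(default_factory=list)
    requirements: List[Requirement] = field(default_factory=list)

    def assign(self, req: Requirement) -> None:
        """Attach a requirement from the requirements model to this service."""
        self.requirements.append(req)

# Illustrative use: an EHR archiving service with one assigned requirement.
ehr = Service("EHR archiving")
ehr.assign(Requirement(aim="auditability", description="all accesses are logged"))
```

    Separating the service hierarchy from the catalogue of requirements is what lets one set of predefined aims be reused across different EHR services during certification.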

  14. A Neural Mechanism for Surprise-related Interruptions of Visuospatial Working Memory.

    Science.gov (United States)

    Wessel, Jan R

    2016-11-30

    Surprising perceptual events recruit a fronto-basal ganglia mechanism for inhibition, which suppresses motor activity following surprise. A recent study found that this inhibitory mechanism also disrupts the maintenance of verbal working memory (WM) after surprising tones. However, it is unclear whether this same mechanism also relates to surprise-related interruptions of non-verbal WM. We tested this hypothesis using a change-detection task, in which surprising tones impaired visuospatial WM. Participants also performed a stop-signal task (SST). We used independent component analysis and single-trial scalp-electroencephalogram to test whether the same inhibitory mechanism that reflects motor inhibition in the SST relates to surprise-related visuospatial WM decrements, as was the case for verbal WM. As expected, surprising tones elicited activity of the inhibitory mechanism, and this activity correlated strongly with the trial-by-trial level of surprise. However, unlike for verbal WM, the activity of this mechanism was unrelated to visuospatial WM accuracy. Instead, inhibition-independent activity that immediately succeeded the inhibitory mechanism was increased when visuospatial WM was disrupted. This shows that surprise-related interruptions of visuospatial WM are not effected by the same inhibitory mechanism that interrupts verbal WM, and instead provides evidence for a 2-stage model of distraction.

  15. A Quantitative Review and Meta-Models of the Variability and Factors Affecting Oral Drug Absorption-Part I: Gastrointestinal pH.

    Science.gov (United States)

    Abuhelwa, Ahmad Y; Foster, David J R; Upton, Richard N

    2016-09-01

    This study aimed to conduct a quantitative meta-analysis for the values of, and variability in, gastrointestinal (GI) pH in the different GI segments; characterize the effect of food on the values and variability in these parameters; and present quantitative meta-models of distributions of GI pH to help inform models of oral drug absorption. The literature was systematically reviewed for the values of, and the variability in, GI pH under fed and fasted conditions. The GI tract was categorized into the following 10 distinct regions: stomach (proximal, mid-distal), duodenum (proximal, mid-distal), jejunum and ileum (proximal, mid, and distal small intestine), and colon (ascending, transverse, and descending colon). Meta-analysis used the "metafor" package of the R language. The time course of postprandial stomach pH was modeled using NONMEM. Food significantly influenced the estimated meta-mean stomach and duodenal pH but had no significant influence on small intestinal and colonic pH. The time course of postprandial pH was described using an exponential model. Increased meal caloric content increased the extent and duration of postprandial gastric pH buffering. The different parts of the small intestine had significantly different pH. Colonic pH was significantly different for descending but not for ascending and transverse colon. Knowledge of GI pH is important for the formulation design of the pH-dependent dosage forms and in understanding the dissolution and absorption of orally administered drugs. The meta-models of GI pH may also be used as part of semi-physiological pharmacokinetic models to characterize the effect of GI pH on the in vivo drug release and pharmacokinetics.
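
    The exponential model of the postprandial stomach-pH time course mentioned in the abstract can be illustrated with a small sketch: pH is buffered upward by the meal and decays exponentially back toward the fasted baseline. The functional form and all parameter values below are assumptions for illustration only, not the paper's fitted NONMEM estimates:

```python
import math

def postprandial_ph(t_min, ph_fed=5.0, ph_fasted=2.0, k=0.025):
    """Hypothetical postprandial stomach pH at t_min minutes after a meal:
    pH(t) = pH_fasted + (pH_fed - pH_fasted) * exp(-k * t).
    Larger meals would map to a higher ph_fed and/or smaller decay rate k,
    mirroring the reported effect of caloric content on buffering."""
    return ph_fasted + (ph_fed - ph_fasted) * math.exp(-k * t_min)
```

    Immediately after the meal the function returns the buffered (fed) pH, and at long times it approaches the fasted baseline, which is the qualitative behavior the meta-analysis describes.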

  16. Kriging metamodeling for simulation

    NARCIS (Netherlands)

    van Beers, W.C.M.

    2005-01-01

    Many scientific disciplines use mathematical models to describe complicated real systems. Often, analytical methods are inadequate, so simulation is applied. This thesis focuses on computer intensive simulation experiments in Operations Research/Management Science. For such experiments it is necessary to apply interpolation.

  17. Kriging Metamodeling for Simulation.

    OpenAIRE

    van Beers, W.C.M.

    2005-01-01

    Many scientific disciplines use mathematical models to describe complicated real systems. Often, analytical methods are inadequate, so simulation is applied. This thesis focuses on computer intensive simulation experiments in Operations Research/Management Science. For such experiments it is necessary to apply interpolation. In this thesis, Kriging interpolation for random simulation is proposed and a novel type of Kriging - called Detrended Kriging - is developed. Kriging turns out to give b...

  18. You'll Never Guess Who Wrote That: 78 Surprising Authors of Psychological Publications.

    Science.gov (United States)

    Lilienfeld, Scott O; Lynn, Steven Jay

    2016-07-01

    One can find psychological authors in the most unexpected places. We present a capsule summary of scholarly publications of psychological interest authored or coauthored by 78 surprising individuals, most of whom are celebrities or relatives of celebrities, historical figures, or people who have otherwise achieved visibility in academic circles, politics, religion, art, and diverse realms of popular culture. Still other publications are authored by individuals who are far better known for their contributions to popular than to academic psychology. The publications, stretching across more than two centuries, encompass a wide swath of domains of psychological inquiry and highlight the intersection of psychology with fields that fall outside its traditional borders, including public health, economics, law, neurosurgery, and even magic. Many of these scholarly contributions have enriched psychology and its allied disciplines, such as psychiatry, in largely unappreciated ways, and they illustrate the penetration of psychological knowledge into multiple scientific disciplines and everyday life. At the same time, our author list demonstrates that remarkable intellectual accomplishments in one scientific domain, such as physics, do not necessarily translate into success in psychology and underscores the distinction between intelligence, on the one hand, and critical thinking and wisdom, on the other.

  19. Polar F-layer model-observation comparisons: a neutral wind surprise

    Directory of Open Access Journals (Sweden)

    J. J. Sojka

    2005-01-01

    Full Text Available The existence of a month-long continuous database of incoherent scatter radar observations of the ionosphere from the EISCAT Svalbard Radar (ESR) at Longyearbyen, Norway, provides an unprecedented opportunity for model/data comparisons. Physics-based ionospheric models, such as the Utah State University Time Dependent Ionospheric Model (TDIM, are usually only compared with observations over restricted one or two day events or against climatological averages. In this study, using the ESR observations, the daily weather, day-to-day variability, and month-long climatology can be simultaneously addressed to identify modeling shortcomings and successes. Since for this study the TDIM is driven by climatological representations of the magnetospheric convection, auroral oval, neutral atmosphere, and neutral winds, whose inputs are solar and geomagnetic indices, it is not surprising that the daily weather cannot be reproduced. What is unexpected is that the horizontal neutral wind has come to the forefront as a decisive model input parameter in matching the diurnal morphology of density structuring seen in the observations.

  20. Surprisal-based comparison between a symbolic and a connectionist model of sentence processing

    NARCIS (Netherlands)

    Frank, S.L.; Taatgen, N.; van Rijn, H.

    2009-01-01

    The 'unlexicalized surprisal' of a word in sentence context is defined as the negative logarithm of the probability of the word's part-of-speech given the sequence of previous parts-of-speech of the sentence. Unlexicalized surprisal is known to correlate with word reading time. Here, it is shown
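
    The definition above, surprisal as the negative logarithm of the probability of a word's part-of-speech given the preceding tags, can be sketched with maximum-likelihood trigram estimates. Conditioning on only the two previous tags is a simplifying assumption (the paper conditions on the full preceding sequence), and the tag names below are illustrative:

```python
import math
from collections import defaultdict

def train_trigram(pos_corpus):
    """Count POS trigrams and their bigram contexts over tag sequences."""
    tri, bi = defaultdict(int), defaultdict(int)
    for seq in pos_corpus:
        padded = ["<s>", "<s>"] + seq
        for i in range(2, len(padded)):
            bi[(padded[i - 2], padded[i - 1])] += 1
            tri[(padded[i - 2], padded[i - 1], padded[i])] += 1
    return tri, bi

def surprisal(tri, bi, seq):
    """Unlexicalized surprisal per tag: -log2 P(tag | two previous tags)."""
    padded = ["<s>", "<s>"] + seq
    out = []
    for i in range(2, len(padded)):
        p = tri[(padded[i - 2], padded[i - 1], padded[i])] / bi[(padded[i - 2], padded[i - 1])]
        out.append(-math.log2(p))
    return out
```

    A tag that is the only continuation of its context gets 0 bits of surprisal; a tag seen in half of the matching contexts gets 1 bit, and rarer continuations score higher, which is the quantity reported to correlate with reading time.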

  1. The role of surprising events in a math game on proportional reasoning

    NARCIS (Netherlands)

    Wouters, P.; Oostendorp, van H.; Vrugte, ter J.; Jong, de T.; Vandercruysse, S.; Elen, J.

    2015-01-01

    This study examines whether surprising events can be used to stimulate students’ playful learning in a GBL environment in the domain of proportional reasoning. The assumed effect of surprise is that unexpected events interrupt an expectation and therefore trigger the player to evaluate the new situation…

  2. Distinct medial temporal networks encode surprise during motivation by reward versus punishment.

    Science.gov (United States)

    Murty, Vishnu P; LaBar, Kevin S; Adcock, R Alison

    2016-10-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also impacts neural and mnemonic encoding of surprising events. During functional magnetic resonance imaging (fMRI), participants encountered perceptually unexpected events either during the pursuit of rewards or avoidance of punishments. Despite similar levels of motivation across groups, reward and punishment facilitated the processing of surprising events in different medial temporal lobe regions. Whereas during reward motivation, perceptual surprises enhanced activation in the hippocampus, during punishment motivation surprises instead enhanced activation in parahippocampal cortex. Further, we found that reward motivation facilitated hippocampal coupling with ventromedial PFC, whereas punishment motivation facilitated parahippocampal cortical coupling with orbitofrontal cortex. Behaviorally, post-scan testing revealed that reward, but not punishment, motivation resulted in greater memory selectivity for surprising events encountered during goal pursuit. Together these findings demonstrate that neuromodulatory systems engaged by anticipation of reward and punishment target separate components of the medial temporal lobe, modulating medial temporal lobe sensitivity and connectivity. Thus, reward and punishment motivation yield distinct neural contexts for learning, with distinct consequences for how surprises are incorporated into predictive mnemonic models of the environment.

  3. Time estimation method for product design based on Gaussian process meta-model

    Institute of Scientific and Technical Information of China (English)

    张昆仑; 刘新亮; 郭波

    2011-01-01

    To estimate product design time more precisely, the Gaussian process meta-model methodology was applied to product design time estimation. The modeling principles of Gaussian process meta-models are first introduced. Because some of the factors affecting product design time are linguistic variables, the Hausdorff distance is used to help construct the correlation matrix in Gaussian process modeling. An example analysis shows that the Gaussian process meta-model is superior to two existing fuzzy neural network models.
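
    The abstract's idea of substituting the Hausdorff distance into a Gaussian process correlation matrix, so that linguistic variables can be compared, might be sketched as follows. Representing linguistic terms as closed intervals and using the Gaussian (squared-exponential) correlation form are assumptions for illustration; the paper's exact construction is not reproduced here:

```python
import numpy as np

def hausdorff_interval(x, y):
    """Hausdorff distance between closed intervals [x0, x1] and [y0, y1]."""
    return max(abs(x[0] - y[0]), abs(x[1] - y[1]))

def correlation_matrix(intervals, theta=1.0):
    """Gaussian correlation matrix over interval-valued (linguistic) inputs,
    with the usual Euclidean distance replaced by the Hausdorff distance."""
    n = len(intervals)
    R = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            d = hausdorff_interval(intervals[i], intervals[j])
            R[i, j] = np.exp(-theta * d ** 2)
    return R
```

    Two design-time factors encoded as the same interval (e.g. two tasks both rated "moderately complex") get correlation 1, and correlation decays smoothly as the intervals move apart, which is exactly what a GP correlation function needs.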

  4. Previously seen and expected stimuli elicit surprise in the context of visual search.

    Science.gov (United States)

    Retell, James D; Becker, Stefanie I; Remington, Roger W

    2016-04-01

    In the context of visual search, surprise is the phenomenon by which a previously unseen and unexpected stimulus exogenously attracts spatial attention. Capture by such a stimulus occurs, by definition, independent of task goals and is thought to depend on the extent to which the stimulus deviates from expectations. However, the relative contributions of prior exposure and explicit knowledge of an unexpected event to the surprise response have not yet been systematically investigated. Here, observers searched for a specific color while ignoring irrelevant cues of different colors presented prior to the target display. After a brief familiarization period, we presented an irrelevant motion cue to elicit surprise. Across conditions we varied prior exposure to the motion stimulus (seen versus unseen) and top-down expectations of its occurrence (expected versus unexpected) to assess the extent to which each of these factors contributes to surprise. We found no attenuation of the surprise response when observers were pre-exposed to the motion cue and/or had explicit knowledge of its occurrence. Our results show that it is neither sufficient nor necessary that a stimulus be new and unannounced to elicit surprise, and they suggest that the expectations that determine the surprise response are highly context specific.

  5. A Statistical Analysis of the Relationship between Harmonic Surprise and Preference in Popular Music.

    Science.gov (United States)

    Miles, Scott A; Rosen, David S; Grzywacz, Norberto M

    2017-01-01

    Studies have shown that some musical pieces may preferentially activate reward centers in the brain. Less is known, however, about the structural aspects of music that are associated with this activation. Based on the music cognition literature, we propose two hypotheses for why some musical pieces are preferred over others. The first, the Absolute-Surprise Hypothesis, states that unexpected events in music directly lead to pleasure. The second, the Contrastive-Surprise Hypothesis, proposes that the juxtaposition of unexpected events and subsequent expected events leads to an overall rewarding response. We tested these hypotheses within the framework of information theory, using the measure of "surprise." This information-theoretic variable mathematically describes how improbable an event is given a known distribution. We performed a statistical investigation of surprise in the harmonic structure of songs within a representative corpus of Western popular music, namely, the McGill Billboard Project corpus. We found that chords of songs in the top quartile of the Billboard chart showed greater average surprise than those in the bottom quartile. We also found that the different sections within top-quartile songs varied more in their average surprise than the sections within bottom-quartile songs. The results of this study are consistent with both the Absolute- and Contrastive-Surprise Hypotheses. Although these hypotheses seem contradictory to one another, we cannot yet discard the possibility that both absolute and contrastive types of surprise play roles in the enjoyment of popular music. We call this possibility the Hybrid-Surprise Hypothesis. The results of this statistical investigation have implications for both music cognition and the human neural mechanisms of esthetic judgments.
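The "surprise" measure used in this record is standard information-theoretic surprisal: the negative log-probability of an event under a known distribution. A small sketch of that computation; the chord counts are invented for illustration, not taken from the McGill Billboard corpus:

```python
import math
from collections import Counter

def surprisal(event, distribution):
    """Information-theoretic surprise of an event: -log2 P(event), in bits."""
    return -math.log2(distribution[event])

# Toy chord counts for a corpus section (illustrative numbers only)
counts = Counter({"I": 40, "IV": 25, "V": 25, "bVI": 10})
total = sum(counts.values())
dist = {chord: n / total for chord, n in counts.items()}

for chord in ("I", "bVI"):
    print(f"{chord}: {surprisal(chord, dist):.2f} bits")  # I: 1.32 bits, bVI: 3.32 bits
```

Rarer chords carry more surprisal, which is the quantity the study averages over songs and song sections.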

  6. October Surprises.

    Science.gov (United States)

    2016-10-01

    Ushered in with the rampage of Hurricane Matthew, later days brightened in this month that has often been a harbinger of both good and bad news for Cuba and the world. Hurricane Matthew ripped through eastern Cuba, devastating the historic town of Baracoa (Cuba's first capital, founded in 1511) and the village of Maisí, where the morning sun first rises over Cuban territory. Wind and flood leveled hundreds of homes, brought down the power grid and destroyed crops. Yet there was no loss of human life, unlike in neighboring Haiti and other countries in Matthew's path, and unlike in Cuba in 1963, when Hurricane Flora caused more than 1200 deaths. In Haiti, the efforts of health workers, including hundreds of Haitian graduates of Cuba's Latin American Medical School and 600 Cuban health professionals already there, were bolstered by dozens of specially trained Cuban disaster medical personnel in the wake of the storm.

  7. Surprising Resists

    Science.gov (United States)

    Morton, Stephie

    2007-01-01

    In this article, the author discusses an art adventure taking her third-, fourth- and fifth-grade enrichment students to the Fort Collins Museum of Contemporary Art in Colorado. The author demonstrates and teaches her students how to use art tissue paper and oil pastel, complementing the creative spirit of the Jaune Quick-to-See Smith work presented…

  8. Joint demonstration meta-model based on IDEAS

    Institute of Scientific and Technical Information of China (English)

    谭贤四; 朱刚; 王红; 毕红葵; 高婷

    2015-01-01

    To unify architecture data modeling in the joint demonstration pattern, a joint demonstration meta-model (JDM2) is proposed based on the International Defense Enterprise Architecture Specification (IDEAS). The Business Object Reference Ontology (BORO), IDEAS and the Department of Defense meta-model (DM2) are introduced, and the reasons why DM2 is not suited to describing joint demonstration content, as well as the merits of DM2's three-layer structure, consisting of the conceptual data model (CDM), logical data model (LDM) and physical exchange schema (PES), are analyzed from the perspective of design mechanism. Finally, the conceptual data model and the logical data model in JDM2 are presented based on the joint demonstration space, ontology theory, the design ideas of DM2 and IDEAS. The feasibility of JDM2 is demonstrated with an example.

  9. Role of meta-model in engineering data warehouse

    Institute of Scientific and Technical Information of China (English)

    沈国华; 黄志球; 王传栋

    2004-01-01

    Engineering data are separately organized, and their schemas are increasingly complex and variable. Engineering data management systems therefore need to manage data in a unified way while remaining both customizable and extensible. The design of such systems depends heavily on the flexibility and self-description of the data model. The characteristics of engineering data and the current state of their management are analyzed. An engineering data warehouse (EDW) architecture and multi-layer metamodels are then presented, and an approach to managing and using engineering data through meta-objects is proposed. Finally, a flight test EDW system (FTEDWS) application is described, in which meta-objects are used to manage engineering data in the data warehouse. The results show that adopting a meta-modeling approach supports interchangeability and provides a sufficiently flexible environment in which system evolution and reusability can be handled.

  10. The role of loudness in detection of surprising events in music recordings

    OpenAIRE

    Holonowicz, Piotr; Herrera, Perfecto; Purwins, Hendrik

    2009-01-01

    The abrupt change of loudness is a salient event that is not always expected by a music listener. Loudness is therefore an important cue when seeking events in a music stream that could violate human expectations. The concepts of expectation and surprise in music have recently become the subject of extensive research, although mostly using symbolic data. The aim of this work is to investigate the circumstances under which a change of sound intensity could be surprising for a listener. Then, using t...

  11. Computational surprisal analysis speeds-up genomic characterization of cancer processes.

    Science.gov (United States)

    Kravchenko-Balasha, Nataly; Simon, Simcha; Levine, R D; Remacle, F; Exman, Iaakov

    2014-01-01

    Surprisal analysis is increasingly being applied to the examination of transcription levels in cellular processes, towards revealing inner network structures and predicting responses. But to achieve its full potential, surprisal analysis should be integrated into a wider-ranging computational tool. The purposes of this paper are to combine surprisal analysis with other important computational procedures, such as easy manipulation of the analysis results (e.g. choosing desirable result sub-sets for further inspection), retrieval and comparison with relevant datasets from public databases, and flexible graphical displays for heuristic thinking. The whole set of computational procedures integrated into a single practical tool is what we call Computational Surprisal Analysis. This combined kind of analysis should significantly facilitate quantitative understanding of different cellular processes for researchers, including applications in proteomics and metabolomics. Beyond that, our vision is that Computational Surprisal Analysis has the potential to reach the status of a routine method of analysis for practitioners. The resolving power of Computational Surprisal Analysis is demonstrated here by its application to a variety of cellular cancer process transcription datasets, ours and from the literature. The results provide a compact biological picture of the thermodynamic significance of the leading gene expression phenotypes at every stage of the disease. For each transcript we characterize its inherent steady-state weight, its correlation with the other transcripts and its variation due to the disease. We present a dedicated website to facilitate the analysis for researchers and practitioners.

  13. NR sulphur vulcanization: Interaction study between TBBS and DPG by means of a combined experimental rheometer and meta-model best fitting strategy

    Science.gov (United States)

    Milani, G.; Hanel, T.; Donetti, R.; Milani, F.

    2016-06-01

    The paper is aimed at studying the possible interaction between two different accelerators (DPG and TBBS) in the chemical kinetics of Natural Rubber (NR) vulcanized with sulphur. The same blend with several DPG and TBBS concentrations is analyzed in depth from an experimental point of view, varying the curing temperature in the range 150-180°C and obtaining rheometer curves in steps of 10°C. In order to study any possible interaction between the two accelerators, and eventually to evaluate its engineering relevance, rheometer data are normalized by means of the well-known Sun and Isayev normalization approach, and two output parameters are assumed as meaningful for gaining insight into the possible interaction, namely the time at maximum torque and the reversion percentage. Two different numerical meta-models, belonging to the family of so-called response surfaces (RS), are compared. The first is linear in TBBS and DPG and therefore reproduces no interaction between the accelerators, whereas the second is a non-linear RS with a bilinear term. Both RS are deduced from standard best fitting of the available experimental data. It is found that, generally, there is a sort of interaction between TBBS and DPG, but that the error introduced by using a linear model (no interaction) is generally lower than 10%, i.e. fully acceptable from an engineering standpoint.
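The comparison described above, a linear response surface against one with an added bilinear (interaction) term, both fitted by least squares, can be sketched as follows. The grid of concentrations and the "rheometer output" are synthetic stand-ins, not the paper's data:

```python
import numpy as np

# Synthetic "rheometer output" on a grid of accelerator concentrations
rng = np.random.default_rng(1)
tbbs, dpg = np.meshgrid(np.linspace(0.5, 2.0, 4), np.linspace(0.5, 2.0, 4))
x, y = tbbs.ravel(), dpg.ravel()
z = 10 - 2.0 * x - 1.5 * y + 0.4 * x * y + rng.normal(0, 0.05, x.size)

def fit_rs(x, y, z, bilinear):
    """Least-squares response surface: z ≈ a + b·x + c·y (+ d·x·y)."""
    cols = [np.ones_like(x), x, y] + ([x * y] if bilinear else [])
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    rmse = np.sqrt(np.mean((A @ coef - z) ** 2))
    return coef, rmse

_, rmse_lin = fit_rs(x, y, z, bilinear=False)
_, rmse_bil = fit_rs(x, y, z, bilinear=True)
print(rmse_bil < rmse_lin)  # → True: the bilinear term captures the interaction
```

Comparing the two fit errors is the same logic the paper uses to judge whether neglecting the TBBS-DPG interaction is engineering-acceptable.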

  14. Surprise and Opportunity for Learning in Grand Canyon: the Glen Canyon Dam Adaptive Management Program

    Science.gov (United States)

    Melis, T. S.; Walters, C. J.; Korman, J.

    2013-12-01

    The repeated surprises were initially viewed with dismay by some managers and stakeholders who had unrealistic expectations about science and modeling to start with, yet they actually represent scientific successes in terms of revealing new opportunities for developing better flow and non-flow policies. A new Long Term Experiment and Management Plan EIS (see URL), started in 2011 and co-led by the U.S. Department of the Interior's Bureau of Reclamation and the National Park Service, is underway and provides Colorado River managers, other stakeholders and the public a unique opportunity to refocus and weight resource objectives, conduct trade-off evaluations within the context of structured decision analyses, and identify key uncertainties, with the goal of improving past experimental designs and monitoring strategies so as to take advantage of future learning opportunities over the next two decades. Perhaps the single greatest uncertainty now facing river managers is anticipating how climate change and global warming will affect the supply of water from the Upper Colorado River Basin; Lake Powell storage, which is known to control the river's thermal regime and native and nonnative fish interactions in GCNP; and the already highly limited tributary sand supply below the dam from the Paria and Little Colorado Rivers that is required to manage sandbars along river shorelines.

  15. Efficient reduction of complex noise in passive millimeter-wavelength video utilizing Bayesian surprise

    Science.gov (United States)

    Mundhenk, T. Nathan; Baron, Josh; Matic, Roy M.

    2011-06-01

    Passive millimeter wavelength (PMMW) video holds great promise given its ability to see targets and obstacles through fog, smoke and rain. However, current imagers produce undesirable complex noise. This can come as a mixture of fast shot (snow-like) noise and a slower-forming circular fixed pattern. Shot noise can be removed by a simple gain-style filter. However, this can blur objects in the scene. To alleviate this, we measure the amount of Bayesian surprise in videos. Bayesian surprise is feature change in time which is abrupt but cannot be accounted for as shot noise. Surprise is used to attenuate the shot noise filter in locations of high surprise. Since high Bayesian surprise in videos is very salient to observers, this reduces blurring particularly in places where people visually attend. Fixed pattern noise is removed after the shot noise using a combination of non-uniformity correction (NUC) and Eigen-image wavelet transformation. The combination allows for online removal of time-varying fixed pattern noise even when background motion may be absent. It also allows for online adaptation to differing intensities of fixed pattern noise. The fixed pattern and shot noise filters are all efficient, allowing for real-time processing of PMMW video. We show several examples of PMMW video with complex noise that is much cleaner as a result of the noise removal. Processed video clearly shows cars, houses, trees and utility poles at 20 frames per second.
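Bayesian surprise, as commonly formalized (e.g. by Itti and Baldi), is the KL divergence between posterior and prior beliefs about a feature: an abrupt but believable change moves the posterior far from the prior. A minimal one-dimensional Gaussian sketch of that quantity, not the paper's actual filter:

```python
import math

def kl_gaussian(mu_p, var_p, mu_q, var_q):
    """KL(P || Q) for 1-D Gaussians P = N(mu_p, var_p), Q = N(mu_q, var_q)."""
    return 0.5 * (math.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def surprise_after_observation(mu, var, obs, obs_var):
    """Conjugate Gaussian update, then KL(posterior || prior) in nats."""
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
    post_mu = post_var * (mu / var + obs / obs_var)
    return kl_gaussian(post_mu, post_var, mu, var)

# An observation far from the prior mean is far more surprising than one near it
print(surprise_after_observation(0.0, 1.0, 5.0, 1.0) >
      surprise_after_observation(0.0, 1.0, 0.1, 1.0))  # → True
```

In the video setting, each pixel (or feature) carries such a belief; high-surprise locations are where the shot-noise filter is attenuated.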

  16. Conference of “Uncertainty and Surprise: Questions on Working with the Unexpected and Unknowable”

    CERN Document Server

    McDaniel, Reuben R; Uncertainty and Surprise in Complex Systems : Questions on Working with the Unexpected

    2005-01-01

    Complexity science has been a source of new insight in physical and social systems and has demonstrated that unpredictability and surprise are fundamental aspects of the world around us. This book is the outcome of a discussion meeting of leading scholars and critical thinkers with expertise in complex systems sciences, together with leaders from a variety of organizations, sponsored by the Prigogine Center at The University of Texas at Austin and the Plexus Institute, to explore strategies for understanding uncertainty and surprise. Besides contributions to the conference, it includes a key digest by the editors as well as a commentary by the late Nobel laureate Ilya Prigogine, "Surprises in half of a century". The book is intended for researchers and scientists in complexity science, as well as for a broad interdisciplinary audience of both practitioners and scholars. It will serve well those interested in the research issues and in the application of complexity science to physical and social systems.

  17. What is a surprise earthquake? The example of the 2002 San Giuliano (Italy) event

    Directory of Open Access Journals (Sweden)

    M. Mucciarelli

    2005-06-01

    Full Text Available Both in scientific literature and in the mass media, some earthquakes are defined as «surprise earthquakes». Based on his own judgment, probably any geologist, seismologist or engineer may have his own list of past «surprise earthquakes». This paper tries to quantify the underlying individual perception that may lead a scientist to apply such a definition to a seismic event. The meaning is different, depending on the disciplinary approach. For geologists, the Italian database of seismogenic sources is still too incomplete to allow for a quantitative estimate of the subjective degree of belief. For seismologists, quantification is possible defining the distance between an earthquake and its closest previous neighbor. Finally, for engineers, the San Giuliano quake could not be considered a surprise, since probabilistic site hazard estimates reveal that the change before and after the earthquake is just 4%.
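The seismological quantification mentioned above, the distance between an earthquake and its closest previous neighbour, can be sketched as follows. The catalogue coordinates are invented for illustration only:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two epicentres, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def surprise_distance(catalog, event):
    """Distance from a new event to its closest previously recorded neighbour."""
    return min(haversine_km(lat, lon, event[0], event[1]) for lat, lon in catalog)

# Toy catalogue of prior epicentres (lat, lon); illustrative coordinates only
catalog = [(42.3, 13.4), (40.8, 14.3), (38.1, 15.6)]
print(round(surprise_distance(catalog, (41.7, 14.8)), 1))
```

A large nearest-previous-neighbour distance is one way to flag an event as a candidate "surprise" relative to the known seismicity.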

  18. Successful ageing

    DEFF Research Database (Denmark)

    Bülow, Morten Hillgaard; Söderqvist, Thomas

    2014-01-01

    Since the late 1980s, the concept of ‘successful ageing’ has set the frame for discourse about contemporary ageing research. Through an analysis of the reception to John W. Rowe and Robert L. Kahn's launch of the concept of ‘successful ageing’ in 1987, this article maps out the important themes and discussions that have emerged from the interdisciplinary field of ageing research. These include an emphasis on interdisciplinarity; the interaction between biology, psycho-social contexts and lifestyle choices; the experiences of elderly people; life-course perspectives; optimisation and prevention strategies; and the importance of individual, societal and scientific conceptualisations and understandings of ageing. By presenting an account of the recent historical uses, interpretations and critiques of the concept, the article unfolds the practical and normative complexities of ‘successful ageing’.

  20. Citation Success

    DEFF Research Database (Denmark)

    Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis

    2012-01-01

    This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history. Consistent with our expectations, we find that full professors, authors appointed at economics and history departments, and authors working in Anglo-Saxon and German countries are more likely to receive citations than other scholars. Long and co-authored articles are also a factor for citation success. We find similar patterns when assessing the same authors' citation success in economics journals. As a novel feature, we demonstrate that the diffusion of research — publication of working papers, as well as conference and workshop presentations — has a first-order positive impact on the citation rate.

  1. Salience and Attention in Surprisal-Based Accounts of Language Processing

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  3. Citation Success

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis

    This study analyses determinants of citation success among authors publishing in economic history journals. Bibliometric features, like article length and number of authors, are positively correlated with the citation rate up to a certain point. Remarkably, publishing in top-ranked journals hardly affects citations. In regard to author-specific characteristics, male authors, full professors, authors working in economics or history departments, and authors employed in Anglo-Saxon countries are more likely to get cited than others. As a ‘shortcut' to citation success, we find that research diffusion...

  4. One In Five Inpatient Emergency Department Cases May Lead To Surprise Bills.

    Science.gov (United States)

    Garmon, Christopher; Chartock, Benjamin

    2017-01-01

    A surprise medical bill is a bill from an out-of-network provider that was not expected by the patient or that came from an out-of-network provider not chosen by the patient. In 2014, 20 percent of hospital inpatient admissions that originated in the emergency department (ED), 14 percent of outpatient visits to the ED, and 9 percent of elective inpatient admissions likely led to a surprise medical bill. Project HOPE—The People-to-People Health Foundation, Inc.

  5. Risk, surprises and black swans fundamental ideas and concepts in risk assessment and risk management

    CERN Document Server

    Aven, Terje

    2014-01-01

    Risk, Surprises and Black Swans provides an in-depth analysis of the risk concept, with a focus on the critical link to knowledge, and the lack of knowledge, that risk and probability judgements are based on. Based on technical scientific research, this book presents a new perspective to help you understand how to assess and manage surprising, extreme events, known as 'Black Swans'. This approach looks beyond the traditional probability-based principles to offer a broader insight into the important aspects of uncertain events, and in doing so explores ways to manage them.

  6. Citation Success

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis

    This study analyses determinants of citation success among authors publishing in economic history journals. Bibliometric features, like article length and number of authors, are positively correlated with the citation rate up to a certain point. Remarkably, publishing in top-ranked journals hardly...

  7. Citation Success

    DEFF Research Database (Denmark)

    Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis

    2012-01-01

    This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history. Co...

  8. Successful ageing

    DEFF Research Database (Denmark)

    Kusumastuti, Sasmita; Derks, Marloes G. M.; Tellier, Siri;

    2016-01-01

    METHODS: We performed a novel, hypothesis-free and quantitative analysis of citation networks exploring the literature on successful ageing that exists in the Web of Science Core Collection Database using the CitNetExplorer software. Outcomes were visualized using timeline-based citation patterns...

  9. A Least Squares Support Vector Machine Metamodel for Ship Performance in Uncertainty Quantification Study

    Institute of Scientific and Technical Information of China (English)

    贺伟; 邹早建

    2013-01-01

      Serving in robust design optimization (RDO) and reliability-based design optimization (RBDO), uncertainty quantification (UQ) requires a large number of samples, which is very expensive when using high-fidelity simulation tools. As an approximation of expensive high-fidelity simulation codes, the metamodel offers high efficiency. This study proposes a metamodel for ship performance based on the least squares support vector machine (LS-SVM), validates the accuracy of the metamodel in 1D and 2D UQ studies, and demonstrates the convergence of its accuracy and UQ performance. The results of this study offer researchers and engineers an option in UQ studies, RDO and RBDO.
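LS-SVM regression, the metamodel family named above, replaces the SVM quadratic program with a single linear system in the dual variables. A minimal sketch on toy one-dimensional data (not the ship-performance model itself; kernel width and regularization are arbitrary choices):

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=100.0, gamma=1.0):
    """LS-SVM regression: solve [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]."""
    n = len(X)
    K = rbf(X, X, gamma) + np.eye(n) / C
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K]])
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, gamma=1.0):
    return rbf(X_new, X_train, gamma) @ alpha + b

X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
y = np.sin(X).ravel()               # cheap stand-in for expensive simulations
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, b, alpha, np.array([[1.0]]))
print(float(pred))                  # close to sin(1.0) ≈ 0.8415
```

Once fitted, the metamodel is evaluated thousands of times for Monte Carlo-style UQ at negligible cost compared with the original simulation code.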

  10. Did the FED Surprise the Markets in 2001? A Case Study for Vars with Sign Restrictions

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    2001-01-01

    In 2001, the Fed lowered interest rates in a series of cuts, starting from 6.5% at the end of 2000 down to 2.0% by early November. This paper asks whether the Federal Reserve Bank has been surprising the markets, taking as given the conventional view about the effect of monetary policy shocks. New

  11. “Surprise Gift” Purchases of Small Electric Appliances: A Pilot Study

    NARCIS (Netherlands)

    J. Vanhamme (Joëlle); C.J.P.M. de Bont (Cees)

    2005-01-01

    textabstractUnderstanding decision-making processes for gifts is of strategic importance for companies selling small electrical appliances as gifts account for a large part of their sales. Among all gifts, the ones that are surprising are the most valued by recipients. However, research about

  12. Surprising convergence of the Monte Carlo renormalization group for the three-dimensional Ising model.

    Science.gov (United States)

    Ron, Dorit; Brandt, Achi; Swendsen, Robert H

    2017-05-01

    We present a surprisingly simple approach to high-accuracy calculations of the critical properties of the three-dimensional Ising model. The method uses a modified block-spin transformation with a tunable parameter to improve convergence in the Monte Carlo renormalization group. The block-spin parameter must be tuned differently for different exponents to produce optimal convergence.
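The block-spin step at the heart of the Monte Carlo renormalization group can be sketched as a coarse-graining of a spin configuration. The majority rule and 3x3 blocks below are a common textbook choice shown in 2D for brevity, not the paper's tunable modified transformation for the 3D model:

```python
import numpy as np

def block_spin(config, b=3):
    """Majority-rule block-spin transformation: coarse-grain an Ising
    configuration of +/-1 spins by replacing each b-by-b block with the
    sign of its summed magnetization (b odd, so no ties occur)."""
    L = config.shape[0] // b
    blocks = config[: L * b, : L * b].reshape(L, b, L, b).sum(axis=(1, 3))
    return np.where(blocks >= 0, 1, -1)

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(27, 27))   # one 2D Ising configuration
coarse = block_spin(spins)                   # 9x9 renormalized configuration
```

Iterating this map and tracking how coupling estimates flow toward the fixed point is what the renormalization-group calculation of critical exponents builds on.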

  13. Bagpipes and Artichokes: Surprise as a Stimulus to Learning in the Elementary Music Classroom

    Science.gov (United States)

    Jacobi, Bonnie Schaffhauser

    2016-01-01

    Incorporating surprise into music instruction can stimulate student attention, curiosity, and interest. Novelty focuses attention in the reticular activating system, increasing the potential for brain memory storage. Elementary ages are ideal for introducing novel instruments, pieces, composers, or styles of music. Young children have fewer…

  14. The Educational Philosophies of Mordecai Kaplan and Michael Rosenak: Surprising Similarities and Illuminating Differences

    Science.gov (United States)

    Schein, Jeffrey; Caplan, Eric

    2014-01-01

    The thoughts of Mordecai Kaplan and Michael Rosenak present surprising commonalities as well as illuminating differences. Similarities include the perception that Judaism and Jewish education are in crisis, the belief that Jewish peoplehood must include commitment to meaningful content, the need for teachers to teach from a position of…

  15. Surprise, Memory, and Retrospective Judgment Making: Testing Cognitive Reconstruction Theories of the Hindsight Bias Effect

    Science.gov (United States)

    Ash, Ivan K.

    2009-01-01

    Hindsight bias has been shown to be a pervasive and potentially harmful decision-making bias. A review of 4 competing cognitive reconstruction theories of hindsight bias revealed conflicting predictions about the role and effect of expectation or surprise in retrospective judgment formation. Two experiments tested these predictions examining the…

  17. A Metamodel for the Notation of Graphical Modeling Languages

    Institute of Scientific and Technical Information of China (English)

    何啸; 麻志毅; 邵维忠

    2008-01-01

    For a graphical modeling language, defining its notation generally requires solving three problems: how to define the graphical symbol of each modeling element, how to define the positional relationships between graphical symbols, and how to map the notation to the abstract syntax. To facilitate model transformation and code generation, the notation of a modeling language also needs to be described in a model-based way. By summarizing, analyzing, and generalizing the notations of UML and its language family, a notation definition metamodel (NDM) is proposed. Addressing the three problems of defining a notation, NDM is divided into three parts: basic graphical elements and their layout, basic positional relationships, and the abstract-syntax bridge. Notation models defined with NDM can also be turned into usable source code through code-generation techniques. NDM is compared with several other approaches to defining notations, and the results show that NDM has advantages over them. NDM has been implemented in the metamodeling tool PKU MetaModeler, and several cases of NDM in practical applications are presented.

  18. "Success"ful Reading Instruction.

    Science.gov (United States)

    George, Carol J.

    1986-01-01

    The Success in Reading and Writing Program at a K-2 school in Fort Jackson, South Carolina, teaches children of varied races and abilities to read and write using newspapers, dictionaries, library books, magazines, and telephone directories. These materials help students develop language skills in a failure-free atmosphere. Includes two…

  19. Investigating Whether Contacting Absent Students Increases Course Success

    Science.gov (United States)

    Stucky, Thomas D.

    2008-01-01

    Studies suggest that student attendance in college classes increases course success. Yet, surprisingly few studies have examined strategies to increase student attendance. The goal of the current study is to consider whether contacting consistently absent students increases success in an undergraduate research methods course. Results of this…

  20. Surprising electronic structure of the BeH- dimer: a full-configuration-interaction study.

    Science.gov (United States)

    Verdicchio, Marco; Bendazzoli, Gian Luigi; Evangelisti, Stefano; Leininger, Thierry

    2013-01-10

    The electronic structure of the beryllium hydride anion, BeH(-), was investigated at valence full-configuration-interaction (FCI) level, using large cc-pV6Z basis sets. It appears that there is a deep change of the wave function nature as a function of the internuclear distance: the ion structure goes from a weakly bonded Be···H(-) complex, at long distance, to a rather strongly bonded system (more than 2 eV) at short distance, having a (:Be-H)(-) Lewis structure. In this case, it is the beryllium atom that formally bears the negative charge, a surprising result in view of the fact that it is the hydrogen atom that has a larger electronegativity. Even more surprisingly, at very short distances the average position of the total electronic charge is close to the beryllium atom but on the opposite side with respect to the hydrogen position.

  1. Investigating locality effects and surprisal in written English syntactic choice phenomena.

    Science.gov (United States)

    Rajkumar, Rajakrishnan; van Schijndel, Marten; White, Michael; Schuler, William

    2016-10-01

    We investigate the extent to which syntactic choice in written English is influenced by processing considerations as predicted by Gibson's (2000) Dependency Locality Theory (DLT) and Surprisal Theory (Hale, 2001; Levy, 2008). A long line of previous work attests that languages display a tendency for shorter dependencies, and in a previous corpus study, Temperley (2007) provided evidence that this tendency exerts a strong influence on constituent ordering choices. However, Temperley's study included no frequency-based controls, and subsequent work on sentence comprehension with broad-coverage eye-tracking corpora found weak or negative effects of DLT-based measures when frequency effects were statistically controlled for (Demberg & Keller, 2008; van Schijndel, Nguyen, & Schuler 2013; van Schijndel & Schuler, 2013), calling into question the actual impact of dependency locality on syntactic choice phenomena. Going beyond Temperley's work, we show that DLT integration costs are indeed a significant predictor of syntactic choice in written English even in the presence of competing frequency-based and cognitively motivated control factors, including n-gram probability and PCFG surprisal as well as embedding depth (Wu, Bachrach, Cardenas, & Schuler, 2010; Yngve, 1960). Our study also shows that the predictions of dependency length and surprisal are only moderately correlated, a finding which mirrors Demberg & Keller's (2008) results for sentence comprehension. Further, we demonstrate that the efficacy of dependency length in predicting the corpus choice increases with increasing head-dependent distances. At the same time, we find that the tendency towards dependency locality is not always observed, and with pre-verbal adjuncts in particular, non-locality cases are found more often than not. In contrast, surprisal is effective in these cases, and the embedding depth measures further increase prediction accuracy.
We discuss the implications of our findings for theories of
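Surprisal, as used in this record, is just the negative log-probability of a word given its context. A minimal bigram version can be sketched as follows; the toy corpus and maximum-likelihood estimate are illustrative assumptions, not the study's PCFG or n-gram models:

```python
from collections import Counter
import math

corpus = "the dog chased the cat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))   # counts of adjacent word pairs
unigrams = Counter(corpus[:-1])              # counts of words as contexts

def surprisal(prev, word):
    """-log2 of the MLE bigram probability P(word | prev), in bits."""
    return -math.log2(bigrams[(prev, word)] / unigrams[prev])
```

In this toy corpus "cat" follows "the" more often than "dog" does, so `surprisal("the", "cat")` is lower than `surprisal("the", "dog")`: more predictable continuations carry less surprisal.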

  2. Surprise and sense making: what newcomers experience in entering unfamiliar organizational settings.

    Science.gov (United States)

    Louis, M R

    1980-06-01

    Growing disillusionment among new members of organizations has been traced to inadequacies in approaches to organizational entry. Current directions of research on organizational entry and their limitations are described, and a new perspective is proposed. The new perspective identifies key features of newcomers' entry experiences, including surprise, contrast, and change, and describes the sense-making processes by which individuals cope with their entry experiences. Implications for research and practice on organizational entry are drawn.

  3. Each individual is a surprise: a conversation with Marianne Horney Eckardt.

    Science.gov (United States)

    Rubin, Jeffrey B

    2014-06-01

    "Each Individual is a Surprise" is a brief account of a dialogue between Marianne Horney Eckardt and myself about the state of psychoanalysis and the psychoanalytic process, the danger of idolatry, the damaging impact of psychoanalytic schools when they create a standardized and pathologizing approach to people, the value of curiosity and humility and retaining one's clinical creativity. The role of Rank, Horney, Sullivan, and Fromm in Dr. Eckardt's long life and rich work is touched upon.

  4. Surprise and Uncertainty—Framing Regional Geohazards in the Theory of Complexity

    Directory of Open Access Journals (Sweden)

    Beate M. W. Ratter

    2013-01-01

    The paper analyzes the concepts of uncertainty and surprise as key variables of a socio-ecological system's behavior in the context of the theory of complexity. Experience has shown that living with uncertainty is part of our daily life, and surprises are only surprising because our perspective on system trajectories is basically linear and non-dynamic. The future of humanity depends on understanding the system's behavior, and needs a shift in perspective from linearity to non-linearity, and from the planning imperative to management that hedges uncertainty and surprise. In this context, the theory of complexity offers a new perspective on system trajectories and on understanding surprises and uncertainty. There is a need for a Gestaltwechsel, a change in perception, which helps to see things differently and fosters the search for new answers to emerging questions at the human-nature interface. Drawing on the case study of hazard management, the paper explains the necessity of analyzing a system's behavior and of taking into account multi-agent behavior on the micro level, which leads to emergent behavior on the macro level of the system. Regional geohazards are explained as the regional impact of an uncontrolled risk based on the state of a natural feature that has a direct impact on a regional population affected by the appearance of a hazard and its development into damage. By acting in space, time and connectivity, people construct hazardscapes and change risk into regional geohazards. This concept is relevant for future mitigation and adaptation measures. The theory of complexity can help in engendering the necessary shift in perspective. What is non-linear dynamic thinking as suggested by the theory of complexity? Why is the consideration of the system's behavior crucial, and not just the number of the system's elements? What is the role of agents in these systems? In

  5. Successful Aging

    Directory of Open Access Journals (Sweden)

    Taufiqurrahman Nasihun

    2015-06-01

    The emerging concept of successful aging is based on evidence that in healthy individuals, as they age, there are considerable variations in the alteration of physiological functions. Some people exhibit great age-related alteration, others very little or none. The first is called poor aging, and the latter the successful pattern of aging (Lambert SW, 2008). Thus, in simple words, successful aging is defined as the opportunity for old people to stay in an active and productive condition despite aging chronologically. Aging itself may be defined as the progressive accumulation of changes with time, associated with or responsible for the ever-increasing susceptibility to disease and death which accompanies advancing age (Harman D, 1981). The time needed to accumulate changes is attributable to the aging process. The marked emerging questions are: how does aging happen, and where does aging start? To answer these questions, and because of the complexity of the aging process, more than 300 theories of aging have been proposed to explain how and where aging occurs and starts, respectively. The theories and classifications of the aging process are too many to enumerate. In summary, all of these aging theories can be grouped into three clusters: 1. Genetic program theories, which suggest that aging results from a program directed by the genes; 2. Epigenetic theories, in which aging results from random environmental events not determined by the genes; 3. Evolutionary theories, which propose that aging is a medium for disposal of the mortal soma in order to avoid competition between organisms and their progeny for food and space, and which do not try to explain how aging occurs, but possibly answer why it occurs (De la Fuente, 2009). Among the three groups of aging theories, the epigenetic theories are useful to explain, and try to solve, the enigma of aging, which is prominently caused by internal and external environmental influences

  6. A strange and surprising debate: mountains, original sin and 'science' in seventeenth-century England.

    Science.gov (United States)

    Wragge-Morley, Alexander

    2009-06-01

    It could come as a shock to learn that some seventeenth-century men of science and learning thought that mountains were bad. Even more alarmingly, some thought that God had imposed them on the earth to punish man for his sins. By the end of the seventeenth century, surprisingly many English natural philosophers and theologians were engaged in a debate about whether mountains were 'good' or 'bad', useful or useless. At stake in this debate were not just the careers of its participants, but arguments about the best ways of looking at and reckoning with 'nature' itself.

  7. The 'Secret' of success part 1.

    Science.gov (United States)

    Busby, Mike

    2011-03-01

    Practice success is defined across the four 'dimensions' of oral health, patient satisfaction, job satisfaction and financial profit. It is suggested that the 'secret' of success in dental practice is to make patient (customer) satisfaction the primary focus. Not a very earth-shattering or surprising 'secret', perhaps! This is hardly a new idea, and not a concept restricted to dental practice: this principle applies to all businesses. This series of articles reviews evidence from across a broad spectrum of publications, from populist business publications through to refereed scientific papers, and this 'secret' seems to be confirmed. The evidence for which aspects of our service are most important in achieving patient satisfaction (and therefore success) is explored. Good oral health outcomes for patients are defined as the primary purpose of dental practice and, therefore, an essential dimension of success. The link between practice success and patients' positive perceptions of their general care and their own oral health is explored.

  8. Expectation and surprise determine neural population responses in the ventral visual stream.

    Science.gov (United States)

    Egner, Tobias; Monti, Jim M; Summerfield, Christopher

    2010-12-08

    Visual cortex is traditionally viewed as a hierarchy of neural feature detectors, with neural population responses being driven by bottom-up stimulus features. Conversely, "predictive coding" models propose that each stage of the visual hierarchy harbors two computationally distinct classes of processing unit: representational units that encode the conditional probability of a stimulus and provide predictions to the next lower level; and error units that encode the mismatch between predictions and bottom-up evidence, and forward prediction error to the next higher level. Predictive coding therefore suggests that neural population responses in category-selective visual regions, like the fusiform face area (FFA), reflect a summation of activity related to prediction ("face expectation") and prediction error ("face surprise"), rather than a homogenous feature detection response. We tested the rival hypotheses of the feature detection and predictive coding models by collecting functional magnetic resonance imaging data from the FFA while independently varying both stimulus features (faces vs houses) and subjects' perceptual expectations regarding those features (low vs medium vs high face expectation). The effects of stimulus and expectation factors interacted, whereby FFA activity elicited by face and house stimuli was indistinguishable under high face expectation and maximally differentiated under low face expectation. Using computational modeling, we show that these data can be explained by predictive coding but not by feature detection models, even when the latter are augmented with attentional mechanisms. Thus, population responses in the ventral visual stream appear to be determined by feature expectation and surprise rather than by stimulus features per se.
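The summation of expectation and prediction-error activity that predictive coding posits can be illustrated with a deliberately simplified toy model. The unit weights and probabilities below are illustrative assumptions, not fitted values from the study; the point is only the qualitative interaction the abstract describes:

```python
def ffa_response(face_present, p_face, w_pred=1.0, w_err=1.0):
    """Toy predictive-coding readout: representational units scale with the
    face expectation, error units with the mismatch between the stimulus and
    that expectation; the population response sums the two."""
    prediction = w_pred * p_face                       # "face expectation" units
    error = w_err * abs(float(face_present) - p_face)  # "face surprise" units
    return prediction + error

# Face-vs-house response difference shrinks as face expectation rises,
# qualitatively matching the reported interaction.
diffs = [ffa_response(True, p) - ffa_response(False, p) for p in (0.1, 0.3, 0.5)]
```

A pure feature detector would predict a constant face-vs-house difference regardless of expectation; in this sketch the difference is maximal under low face expectation and vanishes as expectation rises, which is the signature the fMRI data favored.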

  9. Infants’ Looking to Surprising Events: When Eye-Tracking Reveals More than Looking Time

    Science.gov (United States)

    Yeung, H. Henny; Denison, Stephanie; Johnson, Scott P.

    2016-01-01

    Research on infants’ reasoning abilities often relies on looking times, which are longer to surprising and unexpected visual scenes compared to unsurprising and expected ones. Few researchers have examined more precise visual scanning patterns in these scenes, and so, here, we recorded 8- to 11-month-olds’ gaze with an eye tracker as we presented a sampling event whose outcome was either surprising, neutral, or unsurprising: a red (or yellow) ball was drawn from one of three visible containers populated 0%, 50%, or 100% with identically colored balls. When measuring looking time to the whole scene, infants were insensitive to the likelihood of the sampling event, replicating failures in similar paradigms. Nevertheless, a new analysis of visual scanning showed that infants did spend more time fixating specific areas of interest as a function of the event likelihood. The drawn ball and its associated container attracted more looking than the other containers in the 0% condition, but this pattern was weaker in the 50% condition, and even less strong in the 100% condition. Results suggest that measuring where infants look may be more sensitive than simply how much looking there is to the whole scene. The advantages of eye-tracking measures over traditional looking measures are discussed. PMID:27926920

  10. Prediction of accumulated outflow from a fractured hydrocarbon reservoir using metamodeling

    Institute of Scientific and Technical Information of China (English)

    Seifi A; Kazemzadeh M B; Mohammadi H

    2013-01-01

    Three metamodels were established for predicting accumulated outflow from a fractured hydrocarbon reservoir over a planning horizon of 18 years, and the models were validated and compared using accumulated outflow predicted by numerical simulation. The reservoir was simulated and its basic parameters (porosity, permeability and water saturation) were estimated. The accumulated outflow over 18 years of a well in the reservoir was expressed as a function of the reservoir parameters. Low-potential points were excluded using the HIP (Hydrocarbon in Place) equation and 25 high-potential points were chosen as design points using maximum entropy design. Three kinds of metamodels (a quadratic model, a multiplicative model and a radial basis function model) were built, and the accumulated outflows of the 25 design points and 7 test points were predicted based on the models. The prediction results show that all three models can accurately predict the accumulated outflow of the reservoir studied in this paper, and that the radial basis function model outperforms the other two metamodels. Besides, the computing time of the metamodeling method is much less than that of the numerical simulation.
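Of the three metamodel types in this record, the quadratic one is the simplest to sketch: a full second-order response surface fitted to the design points by least squares, then evaluated at held-out test points. The two-parameter toy "reservoir" response below is an illustrative stand-in for the simulator, not data from the study:

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x_i, and all products x_i*x_j (i <= j)."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i, k)]
    return np.column_stack(cols)

def response(X):
    # Hypothetical outflow as a function of, say, porosity and permeability.
    return 3.0 + 2.0 * X[:, 0] + X[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(25, 2))      # 25 design points, as in the study
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), response(X), rcond=None)

Xtest = rng.uniform(0.0, 1.0, size=(7, 2))   # 7 test points, as in the study
ypred = quadratic_design_matrix(Xtest) @ beta
```

Because the toy response is itself quadratic, the fit recovers it essentially exactly; with a real simulator the residuals at the test points are what the study's model comparison measures.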

  11. VIRTUAL PROTOTYPE BASED ON META-MODELS AND ACTIVE INFORMATION SERVICE

    Institute of Scientific and Technical Information of China (English)

    曹岩; 赵汝嘉

    2001-01-01

    Based on an analysis of the functional requirements and characteristics of virtual prototypes, an information organization of the virtual prototype based on meta-models is put forward. An active information service based on its four mechanisms, and product-family-oriented design, are also discussed.

  12. Surprises from the resummation of ladders in the ABJ(M) cusp anomalous dimension

    CERN Document Server

    Bonini, Marisa; Preti, Michelangelo; Seminara, Domenico

    2016-01-01

    We study the cusp anomalous dimension in N=6 ABJ(M) theory, identifying a scaling limit in which the ladder diagrams dominate. The resummation is encoded into a Bethe-Salpeter equation that is mapped to a Schroedinger problem, exactly solvable due to the surprising supersymmetry of the effective Hamiltonian. In the ABJ case the solution implies the diagonalization of the U(N) and U(M) building blocks, suggesting the existence of two independent cusp anomalous dimensions and an unexpected exponentiation structure for the related Wilson loops. While consistent with previous perturbative analysis, the strong coupling limit of our result does not agree with the string theory computation, emphasizing a difference with the analogous resummation in the N=4 case.

  13. Probing Critical Point Energies of Transition Metal Dichalcogenides: Surprising Indirect Gap of Single Layer WSe 2

    KAUST Repository

    Zhang, Chendong

    2015-09-21

    By using a comprehensive form of scanning tunneling spectroscopy, we have revealed detailed quasi-particle electronic structures in transition metal dichalcogenides, including the quasi-particle gaps, critical point energy locations, and their origins in the Brillouin zones. We show that single layer WSe2 surprisingly has an indirect quasi-particle gap with the conduction band minimum located at the Q-point (instead of K), albeit the two states are nearly degenerate. We have further observed rich quasi-particle electronic structures of transition metal dichalcogenides as a function of atomic structures and spin-orbit couplings. Such a local probe for detailed electronic structures in conduction and valence bands will be ideal to investigate how electronic structures of transition metal dichalcogenides are influenced by variations of local environment.

  14. OCEAN CIRCULATION. Observing the Atlantic Meridional Overturning Circulation yields a decade of inevitable surprises.

    Science.gov (United States)

    Srokosz, M A; Bryden, H L

    2015-06-19

    The importance of the Atlantic Meridional Overturning Circulation (AMOC) heat transport for climate is well acknowledged. Climate models predict that the AMOC will slow down under global warming, with substantial impacts, but measurements of ocean circulation have been inadequate to evaluate these predictions. Observations over the past decade have changed that situation, providing a detailed picture of variations in the AMOC. These observations reveal a surprising degree of AMOC variability in terms of the intraannual range, the amplitude and phase of the seasonal cycle, the interannual changes in strength affecting the ocean heat content, and the decline of the AMOC over the decade, both of the latter two exceeding the variations seen in climate models. Copyright © 2015, American Association for the Advancement of Science.

  15. Paroxysmal atrial fibrillation occurs often in cryptogenic ischaemic stroke. Final results from the SURPRISE study

    DEFF Research Database (Denmark)

    Christensen, Louisa; Krieger, D W; Højberg, S;

    2014-01-01

    BACKGROUND AND PURPOSE: Atrial fibrillation (AF) increases the risk of stroke fourfold and is associated with a poor clinical outcome. Despite work-up in compliance with guidelines, up to one-third of patients have cryptogenic stroke (CS). The prevalence of asymptomatic paroxysmal atrial fibrillation (PAF) in CS remains unknown. The SURPRISE project aimed at determining this rate using long-term cardiac monitoring. METHODS: Patients with CS after protocolled work-up including electrocardiography (ECG) and telemetry were included after informed consent. An implantable loop recorder (ILR... patients (16.1%). In three patients PAF was detected by other methods before or after monitoring, and was undiscovered due to device sensitivity in one case. The first event of PAF was documented at a mean of 109 days (SD ±48) after stroke onset. PAF was asymptomatic in all cases and occurred in episodes...

  16. Probing Critical Point Energies of Transition Metal Dichalcogenides: Surprising Indirect Gap of Single Layer WSe2.

    Science.gov (United States)

    Zhang, Chendong; Chen, Yuxuan; Johnson, Amber; Li, Ming-Yang; Li, Lain-Jong; Mende, Patrick C; Feenstra, Randall M; Shih, Chih-Kang

    2015-10-14

    By using a comprehensive form of scanning tunneling spectroscopy, we have revealed detailed quasi-particle electronic structures in transition metal dichalcogenides, including the quasi-particle gaps, critical point energy locations, and their origins in the Brillouin zones. We show that single layer WSe2 surprisingly has an indirect quasi-particle gap with the conduction band minimum located at the Q-point (instead of K), albeit the two states are nearly degenerate. We have further observed rich quasi-particle electronic structures of transition metal dichalcogenides as a function of atomic structures and spin-orbit couplings. Such a local probe for detailed electronic structures in conduction and valence bands will be ideal to investigate how electronic structures of transition metal dichalcogenides are influenced by variations of local environment.

  17. Beyond surprise: the puzzle of infants' expressive reactions to expectancy violation.

    Science.gov (United States)

    Scherer, Klaus R; Zentner, Marcel R; Stern, Daniel

    2004-12-01

    The reactions of 58 infants to expectancy violation by digitally filtering the experimenter's voice were studied in a cross-sectional design for ages 5, 7, 9, 11-12, and 14 months. The results show that behavioral freezing and changes in gaze direction, but not facial or vocal expression, are reliable responses to expectancy violation. The pattern suggests that a transition in the infant's capacity for cognitive evaluation of novel and discrepant events may occur around age 9 months. These findings confirm the consistent failure to find prototypical facial surprise reactions in research on novel or impossible situations. Componential theories of emotion, which predict adaptive behavior patterns from appraisal processes, may provide clues for underlying mechanisms and generate hypotheses on age-related changes in emotional expression. copyright (c) 2004 APA, all rights reserved.

  18. Surprising judgments about robot drivers: Experiments on rising expectations and blaming humans

    Directory of Open Access Journals (Sweden)

    Peter Danielson

    2015-05-01

    N-Reasons is an experimental Internet survey platform designed to enhance public participation in applied ethics and policy. N-Reasons encourages individuals to generate reasons to support their judgments, and groups to converge on a common set of reasons pro and con various issues. In the Robot Ethics Survey, some of the reasons contributed surprising judgments about autonomous machines. Presented with a version of the trolley problem with an autonomous train as the agent, participants gave unexpected answers, revealing high expectations for the autonomous machine and shifting blame from the automated device to the humans in the scenario. Further experiments with a standard pair of human-only trolley problems refine these results. While showing the same high expectations even when no autonomous machine is involved, human bystanders are only blamed in the machine case. A third experiment explicitly aimed at responsibility for driverless cars confirms our findings about shifting blame in the case of autonomous machine agents. We conclude methodologically that both results point to the power of an experimental survey-based approach to public participation for exploring surprising assumptions and judgments in applied ethics. However, both results also support using caution when interpreting survey results in ethics, demonstrating the importance of qualitative data to provide further context for evaluating judgments revealed by surveys. On the ethics side, the result about shifting blame to humans interacting with autonomous machines suggests caution about the unintended consequences of intuitive principles requiring human responsibility. http://dx.doi.org/10.5324/eip.v9i1.1727

  19. Ontological Metamodeling with Explicit Instantiation

    NARCIS (Netherlands)

    Laarman, Alfons; Kurtev, Ivan; van den Brand, M.; Gašević, D.; Gray, J.

    2010-01-01

    Model Driven Engineering (MDE) is a promising paradigm for software development. It raises the level of abstraction in software development by treating models as primary artifacts. The practical application of this paradigm is seriously endangered by the current weak modeling foundation of the appro

  20. Successful and unsuccessful psychopaths: a neurobiological model.

    Science.gov (United States)

    Gao, Yu; Raine, Adrian

    2010-01-01

    Despite increasing interest in psychopathy research, surprisingly little is known about the etiology of non-incarcerated, successful psychopaths. This review provides an analysis of current knowledge on the similarities and differences between successful and unsuccessful psychopaths derived from five population sources: community samples, individuals from employment agencies, college students, industrial psychopaths, and serial killers. An initial neurobiological model of successful and unsuccessful psychopathy is outlined. It is hypothesized that successful psychopaths have intact or enhanced neurobiological functioning that underlies their normal or even superior cognitive functioning, which in turn helps them to achieve their goals using more covert and nonviolent methods. In contrast, in unsuccessful, caught psychopaths, brain structural and functional impairments together with autonomic nervous system dysfunction are hypothesized to underlie cognitive and emotional deficits and more overt violent offending.

  1. Ensuring a successful family business management succession

    OpenAIRE

    Desbois, Joris

    2016-01-01

    Succession is the biggest long-term challenge that most family businesses face. Indeed, leaders’ disposition to plan for their succession is frequently the key factor determining whether their family business survives or stops. The research seeks to find out how to manage the business management succession successfully according to its main principles. This work project aims at researching the key points relevant to almost all family firms, to have a viable succession transition and positioni...

  2. The Nucleus of Comet 67P/Churyumov-Gerasimenko: Lots of Surprises

    Science.gov (United States)

    Weissman, Paul R.; Rosetta Science Working Team

    2016-10-01

ESA's Rosetta mission has made many new and unexpected discoveries since its arrival at comet 67P/Churyumov-Gerasimenko in August 2014. The first of these was the unusual shape of the cometary nucleus. Although bilobate nuclei had been seen before, the extreme concavities on 67P were unexpected. Evidence gathered during the mission suggests that two independent bodies came together to form 67P, rather than the nucleus being a single body sculpted by sublimation and/or other processes. Although not a surprise, early observations showed that the nucleus rotation period had decreased by ~22 minutes since the previous aphelion passage. A similar rotation period decrease was seen post-perihelion during the encounter. These changes likely arise from asymmetric jetting forces from the irregular nucleus. Initially, Rosetta's instruments found little evidence for water ice on the surface; the presence of surface water ice increased substantially as the nucleus approached perihelion. The nucleus bulk density, 533 ± 6 kg/m3, was measured with Radio Science and OSIRIS imaging of the nucleus volume. This confirmed previous estimates that the bulk density of cometary nuclei is on the order of 500-600 kg/m3, based on indirect methods and on Deep Impact's measurement of the density of 9P/Tempel 1's nucleus. Nucleus topography proved to be highly varied, from smooth dust-covered plains to shallow circular basins, to the very rough terrain where the Philae lander came to rest. Evidence of thermal cracking is everywhere. The discovery of cylindrical pits on the surface, typically 100-200 m in diameter with similar depths, was a major surprise and has been interpreted as sinkholes. "Goose-bump" terrain consisting of apparently random piles of boulders 2-3 m in diameter was another unexpected discovery. Apparent layering on scales of meters to many tens of meters was seen, but there was little or no evidence for impact features. Radar tomography of the interior of the "head

  3. Investigation of the heat source(s) of the Surprise Valley Geothermal System, Northern California

    Science.gov (United States)

    Tanner, N.; Holt, C. D.; Hawkes, S.; McClain, J. S.; Safford, L.; Mink, L. L.; Rose, C.; Zierenberg, R. A.

    2016-12-01

    Concerns about environmental impacts and energy security have led to an increased interest in sustainable and renewable energy resources, including geothermal systems. It is essential to know the permeability structure and possible heat source(s) of a geothermal area in order to assess the capacity and extent of the potential resource. We have undertaken geophysical surveys at the Surprise Valley Hot Springs in Cedarville, California to characterize essential parameters related to a fault-controlled geothermal system. At present, the heat source(s) for the system are unknown. Igneous bodies in the area are likely too old to have retained enough heat to supply the system, so it is probable that fracture networks provide heat from some deeper or more distributed heat sources. However, the fracture system and permeability structure remain enigmatic. The goal of our research is to identify the pathways for fluid transport within the Surprise Valley geothermal system using a combination of geophysical methods including active seismic surveys and short- and long-period magnetotelluric (MT) surveys. We have collected 14 spreads, consisting of 24 geophones each, of active-source seismic data. We used a "Betsy Gun" source at 8 to 12 locations along each spread and have collected and analyzed about 2800 shot-receiver pairs. Seismic velocities reveal shallow lake sediments, as well as velocities consistent with porous basalts. The latter, with velocities of greater than 3.0 km/s, lie along strike with known hot springs and faulted and tilted basalt outcrops outside our field area. This suggests that basalts may provide a permeable pathway through impermeable lake deposits. We conducted short-period (10Hz-60kHz) MT measurements at 33 stations. Our short-period MT models indicate shallow resistive blocks (>100Ωm) with a thin cover of more conductive sediments ( 10Ωm) at the surface. Hot springs are located in gaps between resistive blocks and are connected to deeper low

  4. Surprising results on phylogenetic tree building methods based on molecular sequences

    Directory of Open Access Journals (Sweden)

    Gonnet Gaston H

    2012-06-01

Full Text Available Abstract Background We analyze phylogenetic tree building methods from molecular sequences (PTMS). These are methods which base their construction solely on sequences, coding DNA or amino acids. Results Our first result is a statistically significant evaluation of 176 PTMSs done by comparing trees derived from 193138 orthologous groups of proteins using a new measure of quality between trees. This new measure, called the Intra measure, is very consistent between different groups of species and strong in the sense that it separates the methods with high confidence. The second result is the comparison of the trees against trees derived from accepted taxonomies, the Taxon measure. We consider the NCBI taxonomic classification and their derived topologies as the most accepted biological consensus on phylogenies, which are also available in electronic form. The correlation between the two measures is remarkably high, which supports both measures simultaneously. Conclusions The big surprise of the evaluation is that the maximum likelihood methods do not score well; minimal evolution distance methods over MSA-induced alignments score consistently better. This comparison also allows us to rank different components of the tree building methods, like MSAs, substitution matrices, ML tree builders, distance methods, etc. It is also clear that there is a difference between Metazoa and the rest, which points to evolution leaving different molecular traces. We also think that these measures of quality of trees will motivate the design of new PTMSs as it is now easier to evaluate them with certainty.

  5. Pooling designs with surprisingly high degree of error correction in a finite vector space

    CERN Document Server

    Guo, Jun

    2011-01-01

Pooling designs are standard experimental tools in many biotechnical applications. It is well known that all famous pooling designs are constructed from mathematical structures by the "containment matrix" method. In particular, Macula's designs (resp. Ngo and Du's designs) are constructed by the containment relation of subsets (resp. subspaces) in a finite set (resp. vector space). Recently, we generalized Macula's designs and obtained a family of pooling designs with a higher degree of error correction by subsets in a finite set. In this paper, as a generalization of Ngo and Du's designs, we study the corresponding problems in a finite vector space and obtain a family of pooling designs with a surprisingly high degree of error correction. Our designs and Ngo and Du's designs have the same number of items and pools, respectively, but the error-tolerant property of ours is much better than that of Ngo and Du's designs, as given by D'yachkov et al., when the dimension of the space is large enough.

  6. IP Eri: A surprising long-period binary system hosting a He white dwarf

    CERN Document Server

    Merle, T; Masseron, T; Van Eck, S; Siess, L; Van Winckel, H

    2014-01-01

    We determine the orbital elements for the K0 IV + white dwarf (WD) system IP Eri, which appears to have a surprisingly long period of 1071 d and a significant eccentricity of 0.25. Previous spectroscopic analyses of the WD, based on a distance of 101 pc inferred from its Hipparcos parallax, yielded a mass of only 0.43 M$_\\odot$, implying it to be a helium-core WD. The orbital properties of IP Eri are similar to those of the newly discovered long-period subdwarf B star (sdB) binaries, which involve stars with He-burning cores surrounded by extremely thin H envelopes, and are therefore close relatives to He WDs. We performed a spectroscopic analysis of high-resolution spectra from the HERMES/Mercator spectrograph and concluded that the atmospheric parameters of the K0 component are $T_{\\rm eff} = 4960$ K, $\\log{g} = 3.3$, [Fe/H] = 0.09 and $\\xi = 1.5$ km/s. The detailed abundance analysis focuses on C, N, O abundances, carbon isotopic ratio, light (Na, Mg, Al, Si, Ca, Ti) and s-process (Sr, Y, Zr, Ba, La, Ce, N...

  7. Surprising dissimilarities in a newly formed pair of 'identical twin' stars.

    Science.gov (United States)

    Stassun, Keivan G; Mathieu, Robert D; Cargile, Phillip A; Aarnio, Alicia N; Stempels, Eric; Geller, Aaron

    2008-06-19

The mass and chemical composition of a star are the primary determinants of its basic physical properties (radius, temperature and luminosity) and how those properties evolve with time. Accordingly, two stars born at the same time, from the same natal material and with the same mass, are 'identical twins,' and as such might be expected to possess identical physical attributes. We have discovered in the Orion nebula a pair of stellar twins in a newborn binary star system. Each star in the binary has a mass of 0.41 +/- 0.01 solar masses, identical to within 2 per cent. Here we report that these twin stars have surface temperatures differing by approximately 300 K (approximately 10 per cent) and luminosities differing by approximately 50 per cent, both at high confidence level. Preliminary results indicate that the stars' radii also differ, by 5-10 per cent. These surprising dissimilarities suggest that one of the twins may have been delayed by several hundred thousand years in its formation relative to its sibling. Such a delay could only have been detected in a very young, definitively equal-mass binary system. Our findings reveal cosmic limits on the age synchronization of young binary stars, often used as tests for the age calibrations of star-formation models.

  8. The Surprising Composition of the Salivary Proteome of Preterm Human Newborn

    Science.gov (United States)

    Castagnola, Massimo; Inzitari, Rosanna; Fanali, Chiara; Iavarone, Federica; Vitali, Alberto; Desiderio, Claudia; Vento, Giovanni; Tirone, Chiara; Romagnoli, Costantino; Cabras, Tiziana; Manconi, Barbara; Teresa Sanna, Maria; Boi, Roberto; Pisano, Elisabetta; Olianas, Alessandra; Pellegrini, Mariagiuseppina; Nemolato, Sonia; Wilhelm Heizmann, Claus; Faa, Gavino; Messana, Irene

    2011-01-01

    Saliva is a body fluid of a unique composition devoted to protect the mouth cavity and the digestive tract. Our high performance liquid chromatography (HPLC)-electrospray ionization-MS analysis of the acidic soluble fraction of saliva from preterm human newborn surprisingly revealed more than 40 protein masses often undetected in adult saliva. We were able to identify the following proteins: stefin A and stefin B, S100A7 (two isoforms), S100A8, S100A9 (four isoforms), S100A11, S100A12, small proline-rich protein 3 (two isoforms), lysozyme C, thymosins β4 and β10, antileukoproteinase, histone H1c, and α and γ globins. The average mass value reported in international data banks was often incongruent with our experimental results mostly because of post-translational modifications of the proteins, e.g. acetylation of the N-terminal residue. A quantitative label-free MS analysis showed protein levels altered in relation to the postconceptional age and suggested coordinate and hierarchical functions for these proteins during development. In summary, this study shows for the first time that analysis of these proteins in saliva of preterm newborns might represent a noninvasive way to obtain precious information of the molecular mechanisms of development of human fetal oral structures. PMID:20943598

  9. Marine Protected Areas, Multiple-Agency Management, and Monumental Surprise in the Northwestern Hawaiian Islands

    Directory of Open Access Journals (Sweden)

    John N. Kittinger

    2011-01-01

Full Text Available Large, regional-scale marine protected areas (MPAs) and MPA networks face different challenges in governance systems than locally managed or community-based MPAs. An emerging theme in large-scale MPA management is the prevalence of governance structures that rely on institutional collaboration, presenting new challenges as agencies with differing mandates and cultures work together to implement ecosystem-based management. We analyzed qualitative interview data to investigate multi-level social interactions and institutional responses to the surprise establishment of the Papahānaumokuākea Marine National Monument (monument) in the northwestern Hawaiian Islands (NWHI). The governance arrangement for the monument represents a new model in US MPA management, requiring two federal agencies and the State of Hawai‘i to collaboratively manage the NWHI. We elucidate the principal barriers to institutional cotrusteeship, characterize institutional transformations that have occurred among the partner agencies in the transition to collaborative management, and evaluate the governance arrangement for the monument as a model for MPAs. The lessons learned from the NWHI governance arrangement are critical as large-scale MPAs requiring multiple-agency management become a prevalent feature on the global seascape.

  10. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.

  11. A surprisingly simple correlation between the classical and quantum structural networks in liquid water

    Energy Technology Data Exchange (ETDEWEB)

    Hamm, Peter; Fanourgakis, George S.; Xantheas, Sotiris S.

    2017-08-14

Nuclear quantum effects in liquid water have profound implications for several of its macroscopic properties related to structure, dynamics, spectroscopy and transport. Although several of water's macroscopic properties can be reproduced by classical descriptions of the nuclei using potentials effectively parameterized for a narrow range of its phase diagram, a proper account of the nuclear quantum effects is required in order to ensure that the underlying molecular interactions are transferable across a wide temperature range covering different regions of that diagram. When performing an analysis of the hydrogen-bonded structural networks in liquid water resulting from the classical (class.) and quantum (q.m.) descriptions of the nuclei with the transferable, flexible, polarizable TTM3-F interaction potential, we found that the two results can be superimposed over the temperature range T = 270-350 K using a surprisingly simple, linear scaling of the two temperatures according to T(q.m.) = a*T(class.) - dT, where a = 1.2 and dT = 51 K. The linear scaling and constant shift of the temperature scale can be considered as a generalization of the previously reported temperature shifts (corresponding to structural changes and the melting T) induced by quantum effects in liquid water.
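The reported mapping is simple enough to state as a one-line function. A minimal sketch (the function name and the loop are illustrative, not from the paper; the constants a = 1.2 and dT = 51 K are taken from the abstract):

```python
# Sketch of the linear temperature mapping reported for liquid water with the
# TTM3-F potential: T(q.m.) = a * T(class.) - dT, with a = 1.2 and dT = 51 K.
# The function name and range check are illustrative, not from the paper.

def quantum_equivalent_temperature(t_class_k: float, a: float = 1.2,
                                   dt: float = 51.0) -> float:
    """Map a classical simulation temperature (K) to the quantum-description
    temperature (K) whose structural network superimposes on it."""
    return a * t_class_k - dt

# The superposition was reported over roughly 270-350 K (classical scale):
for t_class in (270.0, 300.0, 350.0):
    print(t_class, "K ->", quantum_equivalent_temperature(t_class), "K")
```

For example, a classical simulation at 300 K corresponds to a quantum description at 1.2 * 300 - 51 = 309 K.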

  12. Planning farm succession: how to be successful

    OpenAIRE

    Stephens, Mike

    2011-01-01

Planning farm succession is really good farm planning in its broadest aspect. Unfortunately, very few farmers and their families have devoted sufficient time to working out how the farm business will be transferred. After demonstrating the importance of the farm succession issue, this article goes on to explain a method for successfully tackling the process.

  13. Factors favorable to public participation success

    Energy Technology Data Exchange (ETDEWEB)

    Peelle, E.; Schweitzer, M.; Munro, J.; Carnes, S.; Wolfe, A.

    1996-05-01

Categories of factors linked to successful public participation (PP) program outcomes include PP process, organizational context, sociopolitical context, strategic considerations, and unique (special circumstances) factors. We re-order the long list of factors according to how essential, important, and unique they are and discuss their significance and interrelationships. It is argued that bureaucratic structure and operational modes are fundamentally in conflict with features of successful PP programs (openness, two-way education, communication with nonexpert outsiders). If this is so, then it is not surprising that the factors essential for PP success in bureaucracies involve extraordinary management efforts by agencies to bypass, compensate for, or overcome structural constraints. We conclude by speculating about the long-term viability of PP practices in the agency setting, as well as the consequences for agencies that attempt the problematic task of introducing PP into their complex, mission-oriented organizations.

  14. Farmers Insures Success

    Science.gov (United States)

    Freifeld, Lorri

    2012-01-01

    Farmers Insurance claims the No. 2 spot on the Training Top 125 with a forward-thinking training strategy linked to its primary mission: FarmersFuture 2020. It's not surprising an insurance company would have an insurance policy for the future. But Farmers takes that strategy one step further, setting its sights on 2020 with a far-reaching plan to…

  15. Geophysical Investigation of the Lake City Fault Zone, Surprise Valley, California, and Implications for Geothermal Circulation

    Science.gov (United States)

    McPhee, D. K.; Glen, J. M.; Egger, A. E.; Chuchel, B. A.

    2009-12-01

New audiomagnetotelluric (AMT), gravity, and magnetic data were collected in Surprise Valley, northwestern Basin and Range, in order to investigate the role that the Lake City Fault Zone (LCFZ) may play in controlling geothermal circulation in the area. Surprise Valley hosts an extensional geothermal system currently undergoing exploration for development on several scales. The focus of much of that exploration has been the LCFZ, a set of NW-SE-trending structures whose existence has been suggested on the basis of (1) low-relief scarps in the NW portion of the zone, (2) dissolved mineral-rich groundwater chemistry along its length, and (3) parallelism with a strong regional fabric that includes the Brothers Fault Zone. The LCFZ extends across the valley at a topographic high, intersecting the N-S-trending basin-bounding faults where major hot springs occur. This relationship suggests that the LCFZ may be a zone of permeability for flow of hydrothermal fluids. Previous potential-field data indicate that there is no vertical offset along this fault zone, and little signature at all in either the gravity or magnetic data; along with the lack of surface expression along most of its length, the subsurface geometry of the LCFZ and its influence on geothermal fluid circulation remain enigmatic. The LCFZ therefore provides an ideal opportunity to utilize AMT data, which measure subsurface resistivity and therefore, unlike potential-field data, are highly sensitive to the presence of saline fluids. AMT data and additional gravity and magnetic data were collected in 2009 along 3 profiles perpendicular to the LCFZ in order to define the subsurface geometry and conductivity of the fault zone down to depths of ~500 m. AMT soundings were collected using the Geometrics Stratagem EH4 system, a four-channel, natural- and controlled-source tensor system recording in the range of 10 to 92,000 Hz. To augment the low natural-field signal, a transmitter of two horizontal-magnetic dipoles

  16. Lymphocytic alveolitis: A surprising index of poor prognosis in patients with primary Sjogren's syndrome.

    Science.gov (United States)

    Dalavanga, Y A; Voulgari, P V; Georgiadis, A N; Leontaridi, C; Katsenos, S; Vassiliou, M; Drosos, A A; Constantopoulos, S H

    2006-07-01

Twelve years ago we reported that lymphocytic alveolitis [or bronchoalveolar lavage (BAL) lymphocytosis] correlates with clinical pulmonary involvement in primary Sjogren's syndrome (pSS). Our thesis was based on subtle clinical and functional evidence of interstitial lung disease (ILD) in pSS patients with "high lymphocytic alveolitis" (>15% lymphocytes in BAL). This report is a follow-up study of these patients. Basic clinical and functional re-evaluation of the 22 patients with pSS studied in 1991 highlighted the differences between those with and without alveolitis. There was no significant functional decline. There were, however, two statistically significant differences between the two groups: (1) only patients with BAL lymphocytosis had to be treated with steroids (5/12 vs. 0/10, P < 0.05), and (2) only patients with BAL lymphocytosis had died in the interim (6/12 vs. 0/10, P < 0.01). The causes of death were various: on only two occasions were they related to respiratory infections, and there were no deaths from respiratory failure secondary to ILD. BAL lymphocytosis appears to be a surprisingly serious index of dismal prognosis in patients with pSS. We offer no unifying pathophysiologic mechanism for it; therefore, all we propose is that BAL be performed early, in as many patients with pSS as possible. These patients should then be followed up systematically, in order to evaluate whether BAL lymphocytosis has any pathophysiologic importance in the development of clinically serious pSS, serious enough to lead to death.

  17. A post-genomic surprise. The molecular reinscription of race in science, law and medicine.

    Science.gov (United States)

    Duster, Troy

    2015-03-01

The completion of the first draft of the Human Genome Map in 2000 was widely heralded as the promise and future of genetics-based medicines and therapies, so much so that pundits began referring to the new century as 'The Century of Genetics'. Moreover, definitive assertions about the overwhelming similarities of all humans' DNA (99.9 per cent) by the leaders of the Human Genome Project were trumpeted as the end of racial thinking about racial taxonomies of human genetic differences. But the first decade of the new century brought unwelcome surprises. First, gene therapies turned out to be far more complicated than anyone had anticipated, and the pharmaceutical industry instead turned its focus to drugs that might be 'related' to population differences based upon genetic markers. While the language of 'personalized medicine' dominated this frame, research on racially and ethnically designated populations' differential responsiveness to drugs dominated the empirical work in the field. Ancestry testing and 'admixture research' would play an important role in a new kind of molecular reification of racial categories. Moreover, the capacity of the super-computer to map differences reverberated into personal identification that would affect both the criminal justice system and forensic science, and generate new levels of concern about personal privacy. Social scientists in general, and sociologists in particular, have been caught short by these developments, relying mainly on assertions that racial categories are socially constructed, regionally and historically contingent, and politically arbitrary. While these assertions are true, the imprimatur of scientific legitimacy has shifted the burden, since now 'admixture research' can claim that its results get at the 'reality' of human differentiation, not the admittedly flawed social constructions of racial categories. Yet what was missing from this framing of the problem: 'admixture research' is itself based upon socially

  18. Explanatory models of health and disease: surprises from within the former Soviet Union

    Directory of Open Access Journals (Sweden)

    Tatiana I Andreeva

    2013-06-01

Full Text Available Extract The review of anthropological theories as applied to public health by Jennifer J. Carroll (Carroll, 2013), published in this issue of TCPHEE, made me recollect my first and most surprising discoveries of how differently the same things can be understood in different parts of the world. Probably less unexpectedly, these impressions concern substance abuse and addiction behaviors, similar to many examples deployed by Jennifer J. Carroll. The first of these events happened soon after the break-up of the Soviet Union, when some of the most active people from the West rushed to discover what was going on behind the opening iron curtain. A director of an addiction clinic, who had just come into contact with a Dutch counterpart, invited me to join the collaboration and the innovation process he planned to launch. As a participant of the exchange program started within this collaboration, I had an opportunity to discover how addictive behaviors were understood and explained in books (English, 1961; Kooyman, 1992; Viorst, 1986) recommended by the colleagues in the Netherlands and, as I could observe with my own eyes, addressed in everyday practice. This was a jaw-dropping contrast to what I learnt at the Soviet medical university and some post-graduate courses, where all the diseases related to alcohol, tobacco, or drug abuse were considered predominantly a result of the substance intake. In the Soviet discourse, the intake itself was understood as 'willful and deliberate' or immoral behavior which, in some cases, was to be rectified in prison-like treatment facilities. In the West, quite the opposite, substance abuse was seen rather as a consequence of a constellation of life-course adversities thoroughly considered by developmental psychology. This approach was obviously deeply ingrained in how practitioners diagnosed and treated their patients.

  19. The genome of Pelobacter carbinolicus reveals surprising metabolic capabilities and physiological features

    Directory of Open Access Journals (Sweden)

    Aklujkar Muktak

    2012-12-01

Full Text Available Abstract Background The bacterium Pelobacter carbinolicus is able to grow by fermentation, syntrophic hydrogen/formate transfer, or electron transfer to sulfur from short-chain alcohols, hydrogen or formate; it does not oxidize acetate and is not known to ferment any sugars or grow autotrophically. The genome of P. carbinolicus was sequenced in order to understand its metabolic capabilities and physiological features in comparison with its relatives, acetate-oxidizing Geobacter species. Results Pathways were predicted for catabolism of known substrates: 2,3-butanediol, acetoin, glycerol, 1,2-ethanediol, ethanolamine, choline and ethanol. Multiple isozymes of 2,3-butanediol dehydrogenase, ATP synthase and [FeFe]-hydrogenase were differentiated and assigned roles according to their structural properties and genomic contexts. The absence of asparagine synthetase and the presence of a mutant tRNA for asparagine encoded among RNA-active enzymes suggest that P. carbinolicus may make asparaginyl-tRNA in a novel way. Catabolic glutamate dehydrogenases were discovered, implying that the tricarboxylic acid (TCA) cycle can function catabolically. A phosphotransferase system for uptake of sugars was discovered, along with enzymes that function in 2,3-butanediol production. Pyruvate:ferredoxin/flavodoxin oxidoreductase was identified as a potential bottleneck in both the supply of oxaloacetate for oxidation of acetate by the TCA cycle and the connection of glycolysis to production of ethanol. The P. carbinolicus genome was found to encode autotransporters and various appendages, including three proteins with similarity to the geopilin of electroconductive nanowires. Conclusions Several surprising metabolic capabilities and physiological features were predicted from the genome of P. carbinolicus, suggesting that it is more versatile than anticipated.

  20. Virtual Volatility, an Elementary New Concept with Surprising Stock Market Consequences

    Science.gov (United States)

    Prange, Richard; Silva, A. Christian

    2006-03-01

Textbook investors start by predicting the future price distribution (PDF) of a candidate stock (or portfolio) at horizon T, e.g. a year hence. A (log)normal PDF with center (= drift = expected return) μT and width (= volatility) σT is often assumed on Central Limit Theorem grounds, i.e. by a random walk of daily (log)price increments δs. The standard deviation, stdev, of historical (ex post) δs's is usually a fair predictor of the coming year's (ex ante) stdev(δs) = σ_daily, but the historical mean E(δs) at best roughly bounds the true, to-be-predicted, drift by μ_true T ≈ μ_hist T ± σ_hist T. Textbooks take a PDF with σ ≈ σ_daily and μ as somehow known, as if accurate predictions of μ were possible. It is elementary and presumably new to argue that an average of PDFs over a range of μ values should be taken, e.g. an average over forecasts by different analysts. We estimate that this leads to a PDF with a 'virtual' volatility σ ≈ 1.3 σ_daily. It is indeed clear that uncertainty in the value of the expected-gain parameter increases the risk of investment in that security by most measures; e.g., Sharpe's ratio μT/σT will be 30% smaller because of this effect. It is significant and surprising that there are investments which benefit from this 30% virtual increase in the volatility
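The averaging effect described in this abstract is easy to demonstrate numerically. A hedged Monte Carlo sketch, not from the paper: if the drift forecast μ itself has a Gaussian spread s_μ, the mixture of normal PDFs has variance σ² + s_μ², so a drift uncertainty of about 0.8σ inflates the effective width by roughly the quoted 1.3 factor. All parameter values below are illustrative assumptions:

```python
# Illustrative Monte Carlo sketch (parameter values are assumptions, not from
# the paper): averaging normal return PDFs over an uncertain drift mu widens
# the effective "virtual" volatility. For a Gaussian spread s_mu in the drift,
# the mixture variance is sigma**2 + s_mu**2, so s_mu = 0.8*sigma gives
# sqrt(1 + 0.64) ~ 1.28, close to the 1.3 factor quoted in the abstract.
import math
import random
import statistics

random.seed(0)
sigma = 1.0           # nominal volatility at the horizon
s_mu = 0.8 * sigma    # assumed spread of analysts' drift forecasts

samples = []
for _ in range(200_000):
    mu = random.gauss(0.0, s_mu)             # draw one drift forecast
    samples.append(random.gauss(mu, sigma))  # draw a return given that drift

virtual_sigma = statistics.stdev(samples)
print("virtual volatility ~", round(virtual_sigma, 2), "x sigma")
```

The sampled standard deviation lands near sqrt(sigma**2 + s_mu**2), i.e. about 1.28 sigma under these assumed numbers.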

  1. Surprising results from abiotic enzyme digestion of dissolved organic matter at the molecular scale

    Science.gov (United States)

    Hess, N. J.; Tfaily, M. M.; Heredia-Langnar, A.; Rodriguez, L.; Purvine, E.; Todd-Brown, K. E.

    2016-12-01

Sometimes even the simplest of experiments leads to unexpected results and new understanding. We extract dissolved organic matter using water from peat soil obtained from the S1 bog at the Marcell Experimental Forest in northern Minnesota. We characterized the dissolved organic matter in the water extract before and after adding glucosidase, peroxidase and β-N-Acetylglucosaminidase enzymes using electrospray Fourier transform ion cyclotron resonance mass spectrometry in negative ion mode. Based on mass measurement accuracy of less than 1 ppm for singly charged ions, we assigned putative chemical formulas to greater than 80% of the measured mass spectrometry features. For each enzyme tested we are able to easily distinguish between the types and composition of dissolved organic molecules that are susceptible to enzyme degradation - and those that are not - based on the presence of new compounds in reacted extracts and the loss of compounds from the initial water extract. Next, we created a consensus molecular network analysis based on the neutral mass loss between the measured compounds for each enzyme. The connectivity within these networks suggested a unique, distinctive chemistry for each enzyme. Some results were expected, like the nondiscriminatory oxidation of organic molecules by peroxidase and the preferential loss of lignin- and tannin-like molecules by glucosidase. However, surprising results include the apparent reactivity of glucosidase enzymatic products to reassemble, forming larger-mass organic molecules. While these experiments were conducted abiotically, these molecularly resolved results suggest that biotic enzymatic processes may result in product compounds with unexpected chemistry and reactivity, implying that our current conceptual model of microbial enzymatic activity may be overly simplistic.
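The neutral-mass-loss network step can be sketched in code. This is a hypothetical illustration, not the authors' pipeline: assigned compounds become nodes, and an edge connects two compounds whenever their mass difference matches a known neutral loss within a tolerance. The loss table, tolerance, and peak masses below are invented for illustration:

```python
# Hypothetical sketch of a neutral-mass-loss network (the authors' actual
# pipeline is not described in this abstract): compounds with assigned masses
# are nodes, and an edge is drawn between two compounds whenever their mass
# difference matches a known neutral loss within a tolerance.

NEUTRAL_LOSSES = {        # common neutral losses, monoisotopic mass (Da)
    "H2O": 18.0106,
    "CO2": 43.9898,
    "CH2O": 30.0106,
}
TOL = 0.001               # mass tolerance in Da

def mass_loss_edges(masses, losses=NEUTRAL_LOSSES, tol=TOL):
    """Return (lighter_mass, heavier_mass, loss_name) for every pair of
    masses whose difference matches a neutral loss within tolerance."""
    edges = []
    for i, m1 in enumerate(masses):
        for m2 in masses[i + 1:]:
            diff = abs(m1 - m2)
            for name, loss_mass in losses.items():
                if abs(diff - loss_mass) <= tol:
                    edges.append((min(m1, m2), max(m1, m2), name))
    return edges

# A glucose-like mass, its dehydration product, and its decarboxylation product:
peaks = [180.0634, 162.0528, 136.0736]
print(mass_loss_edges(peaks))
```

On these invented peaks the function links 180.0634 to 162.0528 via an H2O loss and 180.0634 to 136.0736 via a CO2 loss; clustering such edges across an entire spectrum is what produces the consensus network the abstract describes.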

  2. The analysis of eight transcriptomes from all poriferan classes reveals surprising genetic complexity in sponges.

    Science.gov (United States)

    Riesgo, Ana; Farrar, Nathan; Windsor, Pamela J; Giribet, Gonzalo; Leys, Sally P

    2014-05-01

    Sponges (Porifera) are among the earliest evolving metazoans. Their filter-feeding body plan based on choanocyte chambers organized into a complex aquiferous system is so unique among metazoans that it either reflects an early divergence from other animals prior to the evolution of features such as muscles and nerves, or that sponges lost these characters. Analyses of the Amphimedon and Oscarella genomes support this view of uniqueness-many key metazoan genes are absent in these sponges-but whether this is generally true of other sponges remains unknown. We studied the transcriptomes of eight sponge species in four classes (Hexactinellida, Demospongiae, Homoscleromorpha, and Calcarea) specifically seeking genes and pathways considered to be involved in animal complexity. For reference, we also sought these genes in transcriptomes and genomes of three unicellular opisthokonts, two sponges (A. queenslandica and O. carmela), and two bilaterian taxa. Our analyses showed that all sponge classes share an unexpectedly large complement of genes with other metazoans. Interestingly, hexactinellid, calcareous, and homoscleromorph sponges share more genes with bilaterians than with nonbilaterian metazoans. We were surprised to find representatives of most molecules involved in cell-cell communication, signaling, complex epithelia, immune recognition, and germ-lineage/sex, with only a few, but potentially key, absences. A noteworthy finding was that some important genes were absent from all demosponges (transcriptomes and the Amphimedon genome), which might reflect divergence from main-stem lineages including hexactinellids, calcareous sponges, and homoscleromorphs. Our results suggest that genetic complexity arose early in evolution as shown by the presence of these genes in most of the animal lineages, which suggests sponges either possess cryptic physiological and morphological complexity and/or have lost ancestral cell types or physiological processes.

  3. Young stars in old galaxies - surprising discovery with the world's leading telescopes

    Science.gov (United States)

    2002-06-01

    similar to the way a palaeontologist uses the skeletons of dinosaurs to deduce information about the era in which they lived. A surprising discovery The team combined images of a number of galaxies from Hubble's Wide Field and Planetary Camera 2 with infrared images obtained from the multi-mode ISAAC instrument on the 8.2m VLT Antu telescope at the ESO Paranal Observatory (Chile). To their great surprise, they discovered that many of the globular clusters in one of these galaxies, NGC 4365, a member of the large Virgo cluster of galaxies, were only a few thousand million years old, much younger than most of the other stars in this galaxy (roughly 12 thousand million years old). The astronomers were able to identify three major groups of stellar clusters. There is an old population of clusters of metal-poor stars, some clusters of old but metal-rich stars and now, seen for the first time, a population of clusters with young and metal-rich stars. These results have been fully confirmed by spectroscopic observations made with another of the world's giant telescopes, the 10-metre Keck on Hawaii. "It is a great pleasure to see two projects wholly or partly funded by Europe - VLT and Hubble - work in concert to produce such an important scientific result", says Piero Benvenuti, ESA Hubble Project Scientist. "The synergy between the most advanced ground and space telescopes continues to prove its effectiveness, paving the way to impressive new discoveries that would not otherwise be possible." The discovery of young globular clusters within old galaxies is surprising since the stars in the giant elliptical galaxies were until now believed to have formed during a single period early in the history of the Universe. It is now clear that some of the galaxies may be hiding their true nature and have indeed experienced much more recent periods of major star formation. Notes for editors This press release is issued in coordination between ESA and ESO. 

  4. A conceptual review of the psychosocial genomics of expectancy and surprise: neuroscience perspectives about the deep psychobiology of therapeutic hypnosis.

    Science.gov (United States)

    Rossi, Ernest L

    2002-10-01

    This conceptual review explores some speculative associations between the neuroscience of expectancy and surprise during stress and therapeutic hypnosis. Current neuroscience is exploring how novel interactions between the organism and the environment initiate cascades of gene expression, protein synthesis, neurogenesis, and healing that operate via Darwinian principles of natural variation and selection on all levels from the molecular-genomic to the subjective states of consciousness. From a neuroscience perspective, the novel and surprising experiences of consciousness appear to have as important a role as expectancy in memory, learning and behavior change in the psychobiology of therapeutic hypnosis. This paper explores how we may integrate the psychosocial genomics of expectancy and surprise in therapeutic hypnosis as a complex system of creative adaptation on all levels of human experience from mind to gene expression.

  5. For Catholic Colleges, an Important Goal: Don't Surprise the Bishop

    Science.gov (United States)

    Supiano, Beckie

    2009-01-01

    Every college president's success depends on building good relationships with outside groups, whether donors, alumni, or legislators. Presidents of Roman Catholic colleges have one more party to please: the local bishop. In recent months, the bishop of Scranton, Pennsylvania, asked colleges in his diocese to assure him that they were not providing…

  6. How to Produce a Surprise Ending for Readers---Writing Strategies in O. Henry’s The Last Leaf

    Institute of Scientific and Technical Information of China (English)

    姚雪莹

    2014-01-01

    The "twist ending" is a device writers use in fiction to leave readers with a strong impression. In this essay, the author focuses on the narrative strategies that produce such a surprise ending in a short story, taking O. Henry's The Last Leaf (1907) as the example. The author gives each of the strategies O. Henry used in the story a name and explains, from the readers' perspective, how it is used to produce a surprise ending.

  7. Italian Succession Procedure

    OpenAIRE

    De Tullio, Giandomenico

    2012-01-01

    Italian inheritance law is based on the Roman law of succession, including both legitimate succession and testamentary succession. The paper discusses the importance of drafting an Italian will and the legal requirements according to Italian law.

  8. Driving Technological Surprise: DARPA’s Mission in a Changing World

    Science.gov (United States)

    2013-04-01

    fundamental ways. Our research, innovation, and entrepreneurial capacity is the envy of the world, but others are building universities, labs, and...through deep engagement with companies, universities, and DoD and other labs. Our success hinges on having a healthy U.S. R&D ecosystem . Within...contest our ability to project military power. Our economy is increasingly interdependent with a China that is redefining its own position in

  9. The Most Distant Mature Galaxy Cluster - Young, but surprisingly grown-up

    Science.gov (United States)

    2011-03-01

    Astronomers have used an armada of telescopes on the ground and in space, including the Very Large Telescope at ESO's Paranal Observatory in Chile to discover and measure the distance to the most remote mature cluster of galaxies yet found. Although this cluster is seen when the Universe was less than one quarter of its current age it looks surprisingly similar to galaxy clusters in the current Universe. "We have measured the distance to the most distant mature cluster of galaxies ever found", says the lead author of the study in which the observations from ESO's VLT have been used, Raphael Gobat (CEA, Paris). "The surprising thing is that when we look closely at this galaxy cluster it doesn't look young - many of the galaxies have settled down and don't resemble the usual star-forming galaxies seen in the early Universe." Clusters of galaxies are the largest structures in the Universe that are held together by gravity. Astronomers expect these clusters to grow through time and hence that massive clusters would be rare in the early Universe. Although even more distant clusters have been seen, they appear to be young clusters in the process of formation and are not settled mature systems. The international team of astronomers used the powerful VIMOS and FORS2 instruments on ESO's Very Large Telescope (VLT) to measure the distances to some of the blobs in a curious patch of very faint red objects first observed with the Spitzer space telescope. This grouping, named CL J1449+0856 [1], had all the hallmarks of being a very remote cluster of galaxies [2]. The results showed that we are indeed seeing a galaxy cluster as it was when the Universe was about three billion years old - less than one quarter of its current age [3]. Once the team knew the distance to this very rare object they looked carefully at the component galaxies using both the NASA/ESA Hubble Space Telescope and ground-based telescopes, including the VLT. They found evidence suggesting that most of the

  10. Surprising synthesis of nanodiamond from single-walled carbon nanotubes by the spark plasma sintering process

    Science.gov (United States)

    Mirzaei, Ali; Ham, Heon; Na, Han Gil; Kwon, Yong Jung; Kang, Sung Yong; Choi, Myung Sik; Bang, Jae Hoon; Park, No-Hyung; Kang, Inpil; Kim, Hyoun Woo

    2016-10-01

    Nanodiamond (ND) was successfully synthesized using single-walled carbon nanotubes (SWCNTs) as a pure solid carbon source by means of a spark plasma sintering (SPS) process. Raman spectra and X-ray diffraction patterns revealed the generation of the cubic diamond phase during the SPS process. Lattice-resolved TEM images confirmed that diamond nanoparticles with diameters of about 10 nm existed in the products. The NDs were generated mainly through the gas-phase nucleation of carbon atoms evaporated from the SWCNTs.

  11. [Less need for insulin, a surprising effect of phototherapy in insulin-dependent diabetes mellitus].

    Science.gov (United States)

    Nieuwenhuis, R F; Spooren, P F M J; Tilanus, J J D

    2009-01-01

    A 40-year-old woman with insulin-dependent diabetes mellitus was treated successfully with phototherapy for a seasonal affective disorder. Following sessions of phototherapy she developed hypoglycaemias and required less insulin. A review of the literature showed that melatonin has an inhibiting effect on insulin sensitivity. The melatonin secretion, which is suppressed by phototherapy, may cause an immediate decrease in the plasma glucose levels. This decrease may well be important for patients with insulin-resistant diabetes mellitus and seasonal affective disorder.

  12. Potential success factors in brand development

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Grunert, Klaus G.; Poulsen, Carsten Stig

    2005-01-01

    Branding is important to both retailers and manufacturers in the fast-moving consumer goods (FMCG) industry, as both parties attempt to develop strong brands in order to improve their position vis-à-vis each other and direct competitors. But what is required to develop a strong brand...... to the marketing of the brand." The branding literature mentions many important aspects, factors, issues, brand requirements, steps, building blocks or guidelines for building strong brands. However, these are all quite general and abstract. Given the substantial body of literature on branding, surprisingly few...... of this paper is to identify potential success factors in developing strong brands and to test whether these factors can be used to discriminate between strong and weak brands. It does so through a review of the literature for potential success factors. Furthermore, to ensure that important factors have...

  13. Sermon and surprise: The meaning of scheduling in broadcast radio history - and - CBC Radio 3: A disquieting revolution

    OpenAIRE

    Sahota, Anu

    2006-01-01

    Essay 1 : 'Sermon & Surprise' explores the importance of scheduling to radio's communicative uses. The essay argues that its capacity for continuous transmission and promotion of shared listening is unique to terrestrial radio. The strengths of traditional radio relative to contemporary on-demand audio media are explored. Early Canadian and British broadcasting policies and scheduling practices demonstrate how radio's programming conceits may innovatively accommodate broadcasting philosop...

  14. Surprisal analysis characterizes the free energy time course of cancer cells undergoing epithelial-to-mesenchymal transition.

    Science.gov (United States)

    Zadran, Sohila; Arumugam, Rameshkumar; Herschman, Harvey; Phelps, Michael E; Levine, R D

    2014-09-09

    The epithelial-to-mesenchymal transition (EMT) initiates the invasive and metastatic behavior of many epithelial cancers. Mechanisms underlying EMT are not fully known. Surprisal analysis of mRNA time course data from lung and pancreatic cancer cells stimulated to undergo TGF-β1-induced EMT identifies two phenotypes. Examination of the time course for these phenotypes reveals that EMT reprogramming is a multistep process characterized by initiation, maturation, and stabilization stages that correlate with changes in cell metabolism. Surprisal analysis characterizes the free energy time course of the expression levels throughout the transition in terms of two state variables. The landscape of the free energy changes during the EMT for the lung cancer cells shows a stable intermediate state. Existing data suggest this is the previously proposed maturation stage. Using a single-cell ATP assay, we demonstrate that the TGF-β1-induced EMT for lung cancer cells, particularly during the maturation stage, coincides with a metabolic shift resulting in increased cytosolic ATP levels. Surprisal analysis also characterizes the absolute expression levels of the mRNAs and thereby examines the homeostasis of the transcription system during EMT.
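
    Surprisal analysis of the kind described above decomposes log expression levels into a baseline term plus time-dependent constraints, ln X_i(t) = ln X_i^0(t) - Σ_α λ_α(t) G_iα, and is commonly implemented via singular value decomposition of the log-expression matrix. A minimal sketch on synthetic data (the matrix size, random values, and variable names are all hypothetical, not the study's data):

```python
import numpy as np

# Toy expression matrix: rows = transcripts, columns = time points.
rng = np.random.default_rng(0)
X = np.exp(rng.normal(5.0, 1.0, size=(50, 6)))  # strictly positive "expression levels"

lnX = np.log(X)

# SVD of the log-expression matrix: lnX = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(lnX, full_matrices=False)

# Time-dependent weights lambda_a(t): component a = 0 is the dominant
# baseline (steady-state) term; a >= 1 are the constraint patterns.
lam = s[:, None] * Vt

# Low-rank reconstruction with the baseline plus the first constraint,
# analogous to describing the transition with two state variables.
lnX_approx = U[:, :2] @ lam[:2, :]
```

Keeping only the first one or two components and inspecting how their λ_α(t) weights evolve over the time course is what reveals distinct stages such as initiation, maturation, and stabilization.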

  15. A marriage full of surprises: forty-five years living with glutamate dehydrogenase.

    Science.gov (United States)

    Engel, Paul C

    2011-09-01

    Detailed kinetic studies of bovine glutamate dehydrogenase [GDH] from the 1960s revealed complexities that remain to be fully explained. In the absence of heterotropic nucleotide regulators the enzyme follows a random pathway of substrate addition but saturation with ADP enforces a compulsory-order mechanism in which glutamate is the leading substrate. The rate dependence on NAD(P)(+) concentration is complex and is probably only partly explained by negative binding cooperativity. Bovine GDH eluded successful analysis by crystallographers for 30 years but the final structural solution presented in this symposium at last provides a comprehensible framework for much of the heterotropic regulation, focussing attention on an antenna region in the C-terminal tail, a structure that is missing in the slightly smaller hexameric GDHs of lower organisms. Nonetheless, our studies with one such smaller (clostridial) GDH reveal that even without the antenna the underlying core structure still mediates homotropic cooperativity, and the ability to generate a variety of mutants has made it possible to start to dissect this machinery. In addition, this short personal review discusses a number of unresolved issues such as the significance of phospholipid inhibition and of specific interaction with mRNA, and above all the question of why it is necessary to regulate an enzyme reputedly maintaining its reactants at equilibrium and whether this might be in some way related to its coexistence with an energy-linked transhydrogenase.

  16. Fertility Clinic Success Rates

    Science.gov (United States)

    2013 Assisted Reproductive Technology (ART) Fertility Clinic Success Rates Report [PDF - 1MB]

  17. Networking for success

    Directory of Open Access Journals (Sweden)

    Paula Gould

    2002-02-01

    Few women working in materials science and engineering need reminding that they have entered a profession dominated by men. Having sat through science classes in school and college where most seats were occupied by male students, they can find it no great surprise that the situation is replicated in many academic and industrial laboratories. Concern that this imbalance is limiting the recruitment and retention of promising female researchers has prompted many organizations to take a long, hard look at gender equity issues. As the value of diversity is increasingly recognized, have women materials scientists and engineers ever had it so good?

  18. Sweet Smell of Success

    Institute of Scientific and Technical Information of China (English)

    Alison Abbott; 秦艳艳

    2004-01-01

    Our ability to smell, known as "olfaction", is a potent yet often neglected player in our sensory world, and a surprising 3% of our genes are dedicated to fine-tuning its subtleties. The Nobel Prize committee has now honored two scientists who have done most to determine just how we recognize and differentiate the scents of roses, wines, or of good or bad meat. Their work also helps explain how an evocative smell can take us back to a poignant time in our lives.

  19. USAR recruiting success factors

    OpenAIRE

    Thomas, George W.; Kocher, Kathryn M.; Gandolfo, Robin Ragsdale

    1987-01-01

    This study attempts to identify attributes associated with successful recruiters, to evaluate existing data on recruiter performance and characteristics, and to develop a model to aid in the selection of personnel who are likely to become successful recruiters. Conventional multivariate statistical techniques have not proved adequate in identifying successful recruiters, largely because of the absence of reliable and valid measures of recruiter success. This study applies a relatively new met...

  20. Success in Science, Success in Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Mariann R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-25

    This is a series of four different scientific problems which were resolved through collaborations. They are: "Better flow cytometry through novel focusing technology", "Take Off®: Helping the Agriculture Industry Improve the Viability of Sustainable, Large-Production Crops", "The National Institutes of Health's Models of Infectious Disease Agent Study (MIDAS)", and "Expanding the capabilities of SOLVE/RESOLVE through the PHENIX Consortium." For each one, the problem is listed, the solution, advantages, bottom line, then information about the collaboration including: developing the technology, initial success, and continued success.

  1. Surprising microscopy subtleties: Measuring picoscale thicknesses, visualizing core orbitals, and detecting charge transfer using the TEM

    Science.gov (United States)

    Odlyzko, Michael Luke

    Fifty years ago, Richard Feynman delivered a now-famous address outlining why there was "plenty of room left at the bottom": there remained much progress to be made in seeing and manipulating matter all the way down to the atomic scale. One of many means to that end, argued Feynman, was to make electron microscopes better. Why could electrons, with wavelengths of a few picometers, not be used to clearly image atoms hundreds of picometers in size? Why could electron beams not be used to pattern minuscule wires a handful of metal atoms across? Over the course of decades, Feynman's vision has been pursued zealously with rich reward, not least in the electron microscopy field. Enabled by the development of bright field-emission electron sources, high-resolution polepieces, and now aberration correctors, transmission electron microscopy (TEM) at atomic resolution has become routine. Seemingly, there is little room left at the bottom; after all, once you can clearly see atoms, what more is there left to do? Thankfully, there is plenty. Much of the hard work has been in the development of equipment that expands TEM to allow unprecedented spatially resolved analysis of elemental composition, inelastic scattering, and temporal processes. But there are also many opportunities to uncover new information using now widely available techniques and equipment. In the studies presented here, there has been some success in following the latter path. In tandem with careful computational analysis, selected-area electron diffraction allows not only determination of crystal symmetry, lattice parameter, and microstructure, but also measurements of material thickness on the scale of atomic layers. Supported by careful data processing and rigorous simulations, spatially resolved X-ray spectroscopy data is converted into real-space measurements of core-level electronic orbitals, in addition to providing routine atomic resolution chemical mapping. 
And aided by the development of novel bonding

  2. A surprisingly poor correlation between in vitro and in vivo testing of biomaterials for bone regeneration: results of a multicentre analysis.

    Science.gov (United States)

    Hulsart-Billström, G; Dawson, J I; Hofmann, S; Müller, R; Stoddart, M J; Alini, M; Redl, H; El Haj, A; Brown, R; Salih, V; Hilborn, J; Larsson, S; Oreffo, R O

    2016-05-24

    New regenerative materials and approaches need to be assessed through reliable and comparable methods for rapid translation to the clinic. There is a considerable need for proven in vitro assays that are able to reduce the burden on animal testing, by allowing assessment of biomaterial utility predictive of the results currently obtained through in vivo studies. The purpose of this multicentre review was to investigate the correlation between existing in vitro results and in vivo outcomes observed for a range of biomaterials. Members of the European consortium BioDesign, comprising 8 universities in a European multicentre study, provided data from 36 in vivo studies and 47 in vitro assays testing 93 different biomaterials. The outcomes of the in vitro and in vivo experiments were scored according to commonly recognised measures of success relevant to each experiment. The correlation of in vitro with in vivo scores was assessed for each assay alone and in combination. A surprisingly poor correlation between in vitro and in vivo assessments of biomaterials was revealed, indicating a clear need for further development of relevant in vitro assays. There was no significant overall correlation between in vitro and in vivo outcome. The mean in vitro scores showed a trend of covariance with the in vivo scores of 58%. The inadequacies of the current in vitro assessments highlighted here further stress the need for the development of novel approaches to in vitro biomaterial testing and validated pre-clinical pipelines.
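
    The correlation of paired assay scores described above is typically quantified with a Pearson correlation coefficient. A minimal sketch with hypothetical scores (the values below are invented for illustration and are not the study's data):

```python
import numpy as np

# Hypothetical paired outcome scores for eight biomaterials,
# each scored 0-1 on an in vitro assay and an in vivo study.
in_vitro = np.array([0.80, 0.60, 0.90, 0.40, 0.70, 0.50, 0.30, 0.85])
in_vivo = np.array([0.50, 0.70, 0.60, 0.45, 0.40, 0.80, 0.35, 0.60])

# Pearson correlation coefficient between the two score sets;
# values near 0 indicate the poor predictivity the review reports.
r = np.corrcoef(in_vitro, in_vivo)[0, 1]
```

With real data one would also report a p-value (e.g. via `scipy.stats.pearsonr`) before concluding that a correlation is or is not significant.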

  3. Collaborative Resilience to Episodic Shocks and Surprises: A Very Long-Term Case Study of Zanjera Irrigation in the Philippines 1979–2010

    Directory of Open Access Journals (Sweden)

    Ruth Yabes

    2015-07-01

    This thirty-year case study uses surveys, semi-structured interviews, and content analysis to examine the adaptive capacity of Zanjera San Marcelino, an indigenous irrigation management system in the northern Philippines. This common pool resource (CPR) system exists within a turbulent social-ecological system (SES) characterized by episodic shocks, such as large typhoons, as well as novel surprises, such as national political regime change and the construction of large dams. The Zanjera nimbly responded to these challenges, although sometimes in ways that left its structure and function substantially altered. While a partial integration with the Philippine National Irrigation Agency was critical to the Zanjera’s success, this relationship required on-going improvisation and renegotiation. Over time, the Zanjera showed an increasing capacity to learn and adapt. A core contribution of this analysis is the integration of a CPR study within an SES framework to examine resilience, made possible by the occurrence of a wide range of challenges to the Zanjera’s function and survival over the long period of study. Long-term analyses like this one, however rare, are particularly useful for understanding the adaptive and transformative dimensions of resilience.

  4. Communication Management and Trust: Their Role in Building Resilience to "Surprises" Such As Natural Disasters, Pandemic Flu, and Terrorism

    Directory of Open Access Journals (Sweden)

    P. H. Longstaff

    2008-06-01

    In times of public danger such as natural disasters and health emergencies, a country's communication systems will be some of its most important assets because access to information will make individuals and groups more resilient. Communication by those charged with dealing with the situation is often critical. We analyzed reports from a wide variety of crisis incidents and found a direct correlation between trust and an organization's preparedness and internal coordination of crisis communication and the effectiveness of its leadership. Thus, trust is one of the most important variables in effective communication management in times of "surprise."

  5. Predicting Classroom Success.

    Science.gov (United States)

    Kessler, Ronald P.

    A study was conducted at Rancho Santiago College (RSC) to identify personal and academic factors that are predictive of students' success in their courses. The study examined the following possible predictors of success: language and math test scores; background characteristics; length of time out of high school; high school background; college…

  6. Ingredients for successful partnerships

    NARCIS (Netherlands)

    S.M. Pfisterer (Stella)

    2011-01-01

    For the development of new cross-sector partnerships, it is necessary to know what the essence of successful partnership projects is. Which factors influence the success or failure of partnerships is highly related to the specific context in which partnerships operate. The literature on critical

  7. Successful grant writing

    NARCIS (Netherlands)

    Koppelman, Gerard H.; Holloway, John W.

    2012-01-01

    Obtaining research funding is central to the research process. However, many (clinician-) scientists receive little, or no, training in the process of writing a successful grant application. In an era of reductions in research budgets and application success rates, the ability to construct a well presented, clear, articulate proposal is becoming more important than ever.

  8. Three Tiers to Success

    Science.gov (United States)

    Barton, Rhonda; Stepanek, Jennifer

    2009-01-01

    This article discusses a three-tiered, differentiated curriculum in a response to intervention (RTI) framework that has successfully raised achievement at all levels, but is particularly successful with ninth-grade students. Walla Walla (Washington) School District implemented the three-tiered intervention program as a series of differentiated…

  9. What is Success

    Institute of Scientific and Technical Information of China (English)

    Zhao; Lifang

    2015-01-01

    Success is actually a feeling. Like confidence and satisfaction, it is a positive feeling which appears after individuals reach their ideal dreams and fulfill their tasks. But everyone has their own opinions, so there are various thoughts on the theme of what success is.

  10. Student Success Center Toolkit

    Science.gov (United States)

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  11. Exploring MBA Career Success

    Science.gov (United States)

    Hay, Amanda; Hodgkinson, Myra

    2006-01-01

    Purpose: The purpose of this paper is to examine the meaning of career success in relation to the attainment of an MBA degree, for a group of experienced managers. In so doing, the paper aims to consider the adequacy of MBA career success, defined solely in terms of external criteria. Design/methodology/approach: A total of 36 in-depth interviews…

  12. Writing successful UX proposals

    CERN Document Server

    Hass, Chris

    2016-01-01

    Bringing new project funding and business opportunities to your organization is a vital part of UX professionals' growth and success. Writing Successful UX Proposals teaches the proven techniques for assessing proposal requests, writing successful commercial and government funding proposals, and enhancing your business development skills. This book will teach UX practitioners how to succeed in UX business development by mastering the following goals: * Understand how to assess a request for proposals * Understand the "anatomy" of a proposal response * Speak the business language of those who will be evaluating the proposed approach * Recognize the successes of others and build upon their advice. Complete with case studies, tricks and tips, and real-world examples throughout, this is a must-have resource for UX professionals interested in honing their proposal writing skills and setting themselves up for success. * Provides unique sales and proposal writing insights tailored to the UX arena (including both resear...

  13. Successful systems sustaining change.

    Science.gov (United States)

    Bullas, Sheila; Bryant, John

    2007-01-01

    Much has been published on the success and particularly the failure of IT projects; still failures are commonplace. This prospective study focused from the outset on assessing risk of failure and addressing critical success factors. The aim was to apply existing methods in a challenging acute care hospital where success demanded rapid achievement of sustainable improvements in clinical and administrative processes. The implementations were part of the English National Programme for IT. The desired outcomes required the integration of accepted tools and techniques to provide a pragmatic approach to systems implementation: Lean, Six Sigma, PRINCE2 and Benefits Management. The outcome and further insights into success and failure of IT projects in healthcare are described. In particular lessons are identified related to the business need for the project and the successful achievement of the required benefits and business change.

  14. Successful grant writing.

    Science.gov (United States)

    Koppelman, Gerard H; Holloway, John W

    2012-03-01

    Obtaining research funding is central to the research process. However many (clinician-) scientists receive little, or no, training in the process of writing a successful grant application. In an era of reductions in research budgets and application success rates, the ability to construct a well presented, clear, articulate proposal is becoming more important than ever. Obtaining grants is a method to achieve your long term research goals. If you are able to formulate these long term goals, it is relevant to explore the market and investigate all potential grant opportunities. Finally, we will provide an outline of key elements of successful research grants.

  15. Software Maintenance Success Recipes

    CERN Document Server

    Reifer, Donald J

    2011-01-01

    Dispelling much of the folklore surrounding software maintenance, Software Maintenance Success Recipes identifies actionable formulas for success based on in-depth analysis of more than 200 real-world maintenance projects. It details the set of factors that are usually present when effective software maintenance teams do their work and instructs on the methods required to achieve success. Donald J. Reifer--an award winner for his contributions to the field of software engineering and whose experience includes managing the DoD Software Initiatives Office--provides step-by-step guidance on how t

  16. No surprises, please!

    Science.gov (United States)

    Davis, Dena S

    2013-01-01

    This narrative symposium examines the relationship of bioethics practice to personal experiences of illness. A call for stories was developed by Tod Chambers, the symposium editor, and editorial staff and was sent to several commonly used bioethics listservs and posted on the Narrative Inquiry in Bioethics website. The call asked authors to relate a personal story of being ill or caring for a person who is ill, and to describe how this affected how they think about bioethical questions and the practice of medicine. Eighteen individuals were invited to submit full stories based on review of their proposals. Twelve stories are published in this symposium, and six supplemental stories are published online only through Project MUSE. Authors explore themes of vulnerability, suffering, communication, voluntariness, cultural barriers, and flaws in local healthcare systems through stories about their own illnesses or about caring for children, partners, parents and grandparents. Commentary articles by Arthur Frank, Bradley Lewis, and Carol Taylor follow the collection of personal narratives.

  17. Tohoku earthquake: a surprise?

    CERN Document Server

    Kagan, Yan Y

    2011-01-01

    We consider three issues related to the 2011 Tohoku mega-earthquake: (1) how to evaluate the earthquake maximum size in subduction zones, (2) what is the repeat time for the largest earthquakes in Tohoku area, and (3) what are the possibilities of short-term forecasts during the 2011 sequence. There are two quantitative methods which can be applied to estimate the maximum earthquake size: a statistical analysis of the available earthquake record and the moment conservation principle. The latter technique studies how much of the tectonic deformation rate is released by earthquakes. For the subduction zones, the seismic or historical record is not sufficient to provide a reliable statistical measure of the maximum earthquake. The moment conservation principle yields consistent estimates of maximum earthquake size: for all the subduction zones the magnitude is of the order 9.0--9.7, and for major subduction zones the maximum earthquake size is statistically indistinguishable. Starting in 1999 we have carried out...

  18. A Pleasant Goat Surprise

    Institute of Scientific and Technical Information of China (English)

    JING XIAOLEI

    2010-01-01

    For decades, Chinese children have sat in front of television sets mesmerized as American cartoon cat Tom chased his mouse rival Jerry, or as the Japanese manga robot feline Doraemon helped his schoolboy companion Nobita Nobi. Now, the animated tables are turning and foreign kids are able to enjoy Chinese cartoons from the comfort of their couches.

  19. More statistics, less surprise

    CERN Multimedia

    Antonella Del Rosso & the LHCb collaboration

    2013-01-01

    The LHCb collaboration has recently announced new results for a parameter that measures the CP violation effect in particles containing charm quarks. The new values, obtained with a larger data set and with a new independent method, show that the effect is smaller than previous measurements had suggested. The parameter is back within the Standard Model picture. CP violation in particles containing charm quarks, such as the D0 particle, is a powerful probe of new physics. Indeed, such effects could result in unexpected values of parameters whose expectation values in the Standard Model are known. Although less precise than similar approaches used in particles made of b quarks, the investigation of the charm system has proven to be intriguing. The LHCb collaboration has reported new measurements of ΔACP, the difference in CP violation between the D0→K+K– and D0→π+π– decays. The results are ob...

  20. Monotony and Surprise

    Science.gov (United States)

    Apostolico, Alberto

    This paper reviews models and tools emerged in recent years in the author’s work in connection with the discovery of interesting or anomalous patterns in sequences. Whereas customary approaches to pattern discovery proceed from either a statistical or a syntactic characterization alone, the approaches described here present the unifying feature of combining these two descriptors in a solidly intertwined, composite paradigm, whereby both syntactic structure and occurrence lists concur to define and identify a pattern in a subject. In turn, this supports a natural notion of pattern saturation, which enables one to partition patterns into equivalence classes over intervals of monotonicity of commonly adopted scores, in such a way that the subset of class representatives, consisting solely of saturated patterns, suffices to account for all patterns in the subject. The benefits at the outset consist not only of an increased descriptive power, but especially of a mitigation of the often unmanageable roster of candidates unearthed in a discovery attempt, and of the daunting computational burden that goes with it.

  1. Surprised by selectivity

    NARCIS (Netherlands)

    de Jong, Krijn P

    2016-01-01

    Lower olefins, particularly ethylene (C2H4), propylene (C3H6), and butylene (C4H8), are important intermediates in the manufacture of products such as plastics, solvents, paints, and medicines. They are produced worldwide in amounts exceeding 200 million tons per year (see the photo) (1), mostly

  2. The surprising superconductor

    Directory of Open Access Journals (Sweden)

    Taner Yildirim

    2002-04-01

    The serendipitous discovery by Akimitsu's group [1] of the superconductivity of MgB2 at Tc=39 K, almost twice the temperature of other simple intermetallic compounds, has sparked a race to uncover its basic properties and to find other related diborides with even higher Tcs. After the first announcement, the number of preprints appearing on the Los Alamos preprint server (Fig. 1) grew almost exponentially, reaching a maximum of about 60 studies in March (two papers a day), then decreasing linearly down to a paper every other day in August, and staying steady at about this rate until now. During the first year of the MgB2 era, more than 300 studies were published, exploring both fundamental and practical issues, such as the mechanism of the superconductivity; synthesis of MgB2 in the form of powder, thin films, wires, and tapes; and the effect on Tc of substitution with various elements and on critical current and fields.

  3. Surprises in aperiodic diffraction

    CERN Document Server

    Baake, Michael

    2009-01-01

    Mathematical diffraction theory is concerned with the diffraction image of a given structure and the corresponding inverse problem of structure determination. In recent years, the understanding of systems with continuous and mixed spectra has improved considerably. Moreover, the phenomenon of homometry shows various unexpected new facets. Here, we report on some of the recent results in an exemplary and informal fashion.

  4. Jordan: Surprisingly Stable

    OpenAIRE

    Ådnegard, Elisabeth

    2014-01-01

    Over the years, research has demonstrated that conflict spreads to the host country as a consequence of massive influx of refugees. Most studies gathered empirical evidence from African countries and focused on cases where conflict had already spread. In contrast to this literature, the main objective of this thesis is to examine the absence of conflict in Jordan after receiving Syrian refugees that amount to about 10 percent of Jordan's original population over the past three years, 2011-201...

  5. Surprising quantum bounces

    CERN Document Server

    Nesvizhevsky, Valery

    2015-01-01

    This unique book demonstrates the undivided unity and infinite diversity of quantum mechanics using a single phenomenon: quantum bounces of ultra-cold particles. Various examples of such "quantum bounces" are: gravitational quantum states of ultra-cold neutrons (the first observed quantum states of matter in a gravitational field), the neutron whispering gallery (an observed matter-wave analog of the whispering gallery effect well known in acoustics and for electromagnetic waves), and gravitational and whispering gallery states for anti-matter atoms that remain to be observed. These quantum states are an invaluable tool in the search for additional fundamental short-range forces, for exploring the gravitational interaction and quantum effects of gravity, for probing physics beyond the standard model, and for furthering studies into the foundations of quantum mechanics, quantum optics, and surface science.

  6. No More Surprises.

    Science.gov (United States)

    Sorrel, Amy

    2016-05-01

    Texas Medical Association research shows that health plans' shrinking networks, caps on payments for medical care, inaccurate directories, and other tactics - not physician billing - are bearing down on patients in the form of unexpected, out-of-network balance bills.

  7. Neurotransmitter Switching? No Surprise.

    Science.gov (United States)

    Spitzer, Nicholas C

    2015-06-03

    Among the many forms of brain plasticity, changes in synaptic strength and changes in synapse number are particularly prominent. However, evidence for neurotransmitter respecification or switching has been accumulating steadily, both in the developing nervous system and in the adult brain, with observations of transmitter addition, loss, or replacement of one transmitter with another. Natural stimuli can drive these changes in transmitter identity, with matching changes in postsynaptic transmitter receptors. Strikingly, they often convert the synapse from excitatory to inhibitory or vice versa, providing a basis for changes in behavior in those cases in which it has been examined. Progress has been made in identifying the factors that induce transmitter switching and in understanding the molecular mechanisms by which it is achieved. There are many intriguing questions to be addressed.

  8. Surprise Trips 

    DEFF Research Database (Denmark)

    Korn, Matthias; Kawash, Raghid; Andersen, Lisbet Møller

    2010-01-01

    Little treasures in nature often go unnoticed by visitors when roaming about in a national park. Ubiquitous technology with its less intrusive character may be apt to enhance this natural experience of exploration. In this paper, we report on a system that augments this experience. It builds on t...

  9. Successful project management

    CERN Document Server

    Young, Trevor L

    2016-01-01

    Successful Project Management, 5th edition, is an essential guide for anyone who wants to improve the success rate of their projects. It will help managers to maintain a balance between the demands of the customer, the project, the team and the organization. Covering the more technical aspects of a project from start to completion it contains practised and tested techniques, covering project conception and start-up, how to manage stake holders, effective risk management, project planning and launch and execution. Also including a brand new glossary of key terms, it provides help with evaluating your project as well as practical checklists and templates to ensure success for any ambitious project manager. With over one million copies sold, the hugely popular Creating Success series covers a wide variety of topic, with the latest editions including new chapters such as Tough Conversations and Treating People Right. This indispensable business skills collection is suited to a variety of roles, from someone look...

  10. Tribes Communities Success Stories

    Data.gov (United States)

    U.S. Environmental Protection Agency — This data describes successful EPA projects and partnerships which are restoring local communities and watersheds within the San Francisco Bay Delta Watershed.

  11. Goodbye Career, Hello Success.

    Science.gov (United States)

    Komisar, Randy

    2000-01-01

    Success in today's economy means throwing out the old career rules. The "noncareer" career is driven by passion for the work and has the fluidity and flexibility needed in the contemporary workplace. (JOW)

  12. Human Resource Outsourcing Success

    Directory of Open Access Journals (Sweden)

    Hasliza Abdul-Halim

    2014-07-01

    The existing literature on partnership seems to take the relationship between partnership quality and outsourcing success for granted. Therefore, this article aims at examining the role of service quality in strengthening the relationship between partnership quality and human resource (HR) outsourcing success. The samples were obtained from 96 manufacturing organizations in Penang, Malaysia. The results showed that partnership quality variables such as trust, business understanding, and communication have a significant positive impact on HR outsourcing success, whereas, in general, service quality was found to partially moderate these relationships. Therefore, comprehending the HR outsourcing relationship in the context of service quality may assist organizations in accomplishing HR outsourcing success by identifying areas of expected benefits and improvements.

  14. Research into Success

    Directory of Open Access Journals (Sweden)

    Bogomir Novak

    1997-12-01

    As competition is becoming ever more fierce, research into the prerequisites for success is gaining ground. By most people, success is perceived as an external phenomenon, but it is in fact the consequence of a person's readiness to perform in the world (of business). In the paper, Novak distinguishes between internal, external and group success. The essence of internal success, which is the condition for the other two types of success, is assuming responsibility for, and exercising self-control over, one's psychic phenomena. This in fact means that one needs to "reprogramme" the old patterns of behaviour and substitute them with new ones, which leads to personality changes based on the understanding and acceptance of the self and others as they are. In realizing personal abilities, motives and goals, mental guiding laws must also be taken into account. Nowadays, the overall success of an organization is an important indicator of the quality of group work. The working patterns of individuals comply with the patterns used by his or her colleagues. When we do something for ourselves, we do it for others. In certain organizations, through accepted ways of communication all people become successful, and nobody needs to be paid off. Employees wholly identify themselves with their organization, and vice versa. This three-part paradigm (I-Others-Community) is the basis for various models of practical training for success, which are often idealized, but are primarily aimed at abolishing passivity and flaws in the system and its wider environment.

  15. Reproductive success of bromadiolone-resistant rats in absence of anticoagulant pressure

    DEFF Research Database (Denmark)

    Heiberg, Ann-Charlotte; Leirs, Herwig; Siegismund, Hans Redlef

    2006-01-01

    that, for both males and females, surprisingly few individuals contributed to the next generation with numerous offspring, and most breeders contributed with none or a single offspring. The expected higher reproductive success and consequent increase in proportional numbers of sensitive rats...

  16. Microhydrodynamics of deformable particles: surprising responses of drops and vesicles to uniform electric field or shear flow

    Science.gov (United States)

    Vlahovska, Petia

    2015-11-01

    Particle motion in a viscous fluid is a classic problem that continues to surprise researchers. In this talk, I will discuss some intriguing, experimentally-observed behaviors of droplets and giant vesicles (cell-size lipid membrane sacs) in electric or flow fields. In a uniform electric field, a droplet deforms into an ellipsoid that can either be steadily tilted relative to the applied field direction or undergo unsteady motions (periodic shape oscillations or irregular flipping); a spherical vesicle can adopt a transient square shape or reversibly porate. In a steady shear flow, a vesicle can tank-tread, tumble or swing. Theoretical models show that the nonlinear drop dynamics originates from the interplay of Quincke rotation and interface deformation, while the vesicle dynamics stems from the membrane inextensibility. The practical motivation for this research lies in an improved understanding of technologies that rely on the manipulation of drops and cells by flow or electric fields.

  17. Successful ageing for psychiatrists.

    Science.gov (United States)

    Peisah, Carmelle

    2016-04-01

    This paper aims to explore the concept and determinants of successful ageing as they apply to psychiatrists as a group, and as they can be applied specifically to individuals. Successful ageing is a heterogeneous, inclusive concept that is subjectively defined. No longer constrained by the notion of "super-ageing", successful ageing can still be achieved in the face of physical and/or mental illness. Accordingly, it remains within the reach of most of us. It can, and should be, person-specific and individually defined, specific to one's bio-psycho-social and occupational circumstances, and importantly, reserves. Successful professional ageing is predicated upon insight into signature strengths, with selection of realistic goal setting and substitution of new goals, given the dynamic nature of these constructs as we age. Other essential elements are generativity and self-care. Given that insight is key, taking a regular stock or inventory of our reserves across bio-psycho-social domains might be helpful. Importantly, for successful ageing, this needs to be suitably matched to the professional task and load. This lends itself to a renewable personal ageing plan, which should be systemically adopted with routine expectations of self-care and professional responsibility. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  18. Untangling Performance from Success

    CERN Document Server

    Yucesoy, Burcu

    2015-01-01

    Fame, popularity and celebrity status, frequently used tokens of success, are often loosely related to, or even divorced from, professional performance. This dichotomy is partly rooted in the difficulty of distinguishing performance, an individual measure that captures the actions of a performer, from success, a collective measure that captures a community's reactions to these actions. Yet, finding the relationship between the two measures is essential for all areas that aim to objectively reward excellence, from science to business. Here we quantify the relationship between performance and success by focusing on tennis, an individual sport where the two quantities can be independently measured. We show that a predictive model, relying only on a tennis player's performance in tournaments, can accurately predict an athlete's popularity, both during a player's active years and after retirement. Hence the model establishes a direct link between performance and momentary popularity. The agreement between the performa...

  19. Succeeding with succession planning.

    Science.gov (United States)

    McConnell, C R

    1996-12-01

    Succession planning is the process of identifying people who could presently move into key positions or could do so after specifically targeted development occurs. The process identifies the better people in the organization and takes a consistent approach to assembling, analyzing, and retaining information about potential leaders and planning for their further development. At its simplest level, it is the development of a backup and potential successor to each manager; at its most formal, it is a documented plan for management succession at all levels in the organization. Strongly supportive of a policy of development and promotion from within the organization, succession planning also represents a proactive posture with respect to inevitable management turnover. In these days of rapid change in health care, no modern organization that expects to keep up with increasing competition can afford to drift--or even to let a single department drift--while replacements are recruited for managers who resign, retire, or otherwise leave.

  20. Planning for physician succession: no longer a luxury.

    Science.gov (United States)

    Stanton, J

    1995-01-01

    Many physicians are opting for early retirement rather than contend with the increasing loss of control within the changing health care environment. Unfortunately, these retirements are often abrupt--the physician is unaware until it's too late that quick transitions to new successors can yield disastrous financial results. Physicians sometimes overestimate the intrinsic value of their practices. Sometimes they are emotionally unprepared for the change, and bail out at the last minute. In either case, unplanned successions often result in a serious loss of practice value for the physician and a loss of market share for affiliated hospitals and groups. Physicians and administrators must work together to prepare a succession plan allowing adequate time to phase in the new successor. This helps maintain practice value, ensuring there are no unpleasant surprises when the physician retires. In this article, the author also provides a checklist containing succession plan guidelines.

  1. Successful aging at work

    NARCIS (Netherlands)

    Zacher, Hannes

    2015-01-01

    The expression successful aging at work and related terms such as active, healthy, and productive aging at work are frequently used by organizational researchers and practitioners. However, there are no concrete definitions or theoretical frameworks that explain their meaning, assumptions, and

  2. Successful School Composting.

    Science.gov (United States)

    Mahar, Rhea Dawn

    2001-01-01

    School composting programs that have met the challenges inherent in long-term composting have several traits in common: a supportive educational program, schoolwide participation, and a consistent maintenance program. Examines the elements of success, offers examples of incorporating composting into the curriculum, and describes three methods of…

  3. Ensuring Students' Success

    Science.gov (United States)

    Oblinger, James L.

    2006-01-01

    James L. Oblinger, Chancellor of North Carolina State University, argues that higher education must continually evolve new methods of teaching and learning to support students' lifelong skills and impending careers. Part of ensuring students' success lies in finding alternative learning models, such as the Student-Centered Activities for Large…

  4. Success in Entrepreneurship

    DEFF Research Database (Denmark)

    Iversen, Jens; Malchow-Møller, Nikolaj; Sørensen, Anders

    2016-01-01

    What makes a successful entrepreneur? Using Danish register data, we find strong support for the hypothesis that theoretical skills from schooling and practical skills acquired through wage-work are complementary inputs in the human capital earnings function of entrepreneurs. In fact, we find tha...

  5. Pathways to School Success

    Science.gov (United States)

    University of Pittsburgh Office of Child Development, 2012

    2012-01-01

    In 2006, the University of Pittsburgh Office of Child Development began implementing a multi-year school readiness project in several area schools. Evidence from both research and the field point to several key elements that foster school readiness and create pathways to school success for all children. This paper presents components of a…

  6. Models of Success

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Wu Renbao became a national celebrity for his commitment to achieving common prosperity among his fellow villagers in Huaxi Village, Jiangsu Province. Wu's recipe for success was to take advantage of collective strength by encouraging mutual assistance between villages and households.

  7. International Student Success

    Science.gov (United States)

    Smith, Clayton

    2016-01-01

    This article, with a focus on North American postsecondary education, identifies international students as a strategic enrollment management institutional priority; presents themes in the international student retention, satisfaction, and success research literature; and describes related best practices. It also presents the findings from an…

  8. Mindfulness and Student Success

    Science.gov (United States)

    Leland, Matt

    2015-01-01

    Mindfulness has long been practiced in Eastern spiritual traditions for personal improvement, and educators and educational institutions have recently begun to explore its usefulness in schools. Mindfulness training can be valuable for helping students be more successful learners and more connected members of an educational community. To determine…

  9. FOCUS: Sustainable Mathematics Successes

    Science.gov (United States)

    Mireles, Selina V.; Acee, Taylor W.; Gerber, Lindsey N.

    2014-01-01

    The FOCUS (Fundamentals of Conceptual Understanding and Success) Co-Requisite Model Intervention (FOCUS Intervention) for College Algebra was developed as part of the Developmental Education Demonstration Projects (DEDP) in Texas. The program was designed to use multiple services, courses, and best practices to support student completion of a…

  11. Ramada, A Successful Example

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    After entry into the WTO, China's hotels are being challenged by the global economy and overseas counterparts. They must pursue innovation and development to withstand the fierce competition. Ramada Pudong Hotel, managed by the Shanghai Airport Group Civil Aviation Property Company Ltd, has become a successful example of creative management.

  12. Characteristics of Successful Entrepreneurs.

    Science.gov (United States)

    McClelland, David C.

    1987-01-01

    Comparison of characteristics of 12 average and 12 superior small business people in three developing nations (India, Malawi, and Ecuador) found proactive qualities such as initiative and assertiveness, achievement orientation, and commitment to others characteristic of successful entrepreneurs. Other expected qualities (self-confidence,…

  15. Successful Bilingual Education Programs

    Science.gov (United States)

    Montecel, Maria Robledo; Cortez, Josie

    2004-01-01

    This article describes a research project carried out by the Intercultural Development Research Association (IDRA). IDRA's primary research question for this study was, "What contributed to the success of a bilingual education classroom as evidenced by LEP student academic achievement?" In addition to the student data, qualitative and contextual…

  16. The plant host can affect the encapsidation of brome mosaic virus (BMV) RNA: BMV virions are surprisingly heterogeneous.

    Science.gov (United States)

    Ni, Peng; Vaughan, Robert C; Tragesser, Brady; Hoover, Haley; Kao, C Cheng

    2014-03-01

    Brome mosaic virus (BMV) packages its genomic and subgenomic RNAs into three separate viral particles. BMV purified from barley, wheat, and tobacco have distinct relative abundances of the encapsidated RNAs. We seek to identify the basis for the host-dependent differences in viral RNA encapsidation. Sequencing of the viral RNAs revealed recombination events in the 3' untranslated region of RNA1 of BMV purified from barley and wheat, but not from tobacco. However, the relative amounts of the BMV RNAs that accumulated in barley and wheat are similar and RNA accumulation is not sufficient to account for the difference in RNA encapsidation. Virions purified from barley and wheat were found to differ in their isoelectric points, resistance to proteolysis, and contacts between the capsid residues and the RNA. Mass spectrometric analyses revealed that virions from the three hosts had different post-translational modifications that should impact the physiochemical properties of the virions. Another major source of variation in RNA encapsidation was due to the purification of BMV particles to homogeneity. Highly enriched BMV present in lysates had a surprising range of sizes, buoyant densities, and distinct relative amounts of encapsidated RNAs. These results show that the encapsidated BMV RNAs reflect a combination of host effects on the physiochemical properties of the viral capsids and the enrichment of a subset of virions. The previously unexpected heterogeneity in BMV should influence the timing of the infection and also the host innate immune responses.

  17. Benford's law predicted digit distribution of aggregated income taxes: the surprising conformity of Italian cities and regions

    Science.gov (United States)

    Mir, Tariq Ahmad; Ausloos, Marcel; Cerqueti, Roy

    2014-11-01

    The yearly aggregated tax income data of all of Italy's more than 8,000 municipalities are analyzed for a period of five years, from 2007 to 2011, to test for conformity with Benford's law, a counter-intuitive phenomenon observed in large tabulated data sets in which numbers with smaller initial digits occur more often than those with larger ones. This is done in anticipation that large deviations from Benford's law would be found, given that tax evasion is supposedly widespread across Italy. Contrary to expectations, we show that the overall tax income data for all these years are in excellent agreement with Benford's law. Furthermore, we also analyze the data of Calabria, Campania and Sicily, the three Italian regions known for a strong mafia presence, to see if there are any marked deviations from Benford's law. Again, we find that all yearly data sets for Calabria and Sicily agree with Benford's law, whereas only the 2007 and 2008 yearly data show departures from the law for Campania. These results are again surprising in view of the underground and illegal nature of the mafia's economic activities, which contribute significantly to tax evasion. Some hypotheses for the observed conformity are presented.
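    The first-digit distribution described in the abstract above is easy to state and check numerically: under Benford's law, the leading digit d (1-9) appears with probability log10(1 + 1/d), so 1 leads about 30.1% of the time while 9 leads only about 4.6%. A minimal illustrative sketch in Python (not the authors' analysis; the sample data here are powers of 2, a sequence known to conform closely to the law, standing in for a real income data set):

    ```python
    import math
    from collections import Counter

    def benford_expected(d: int) -> float:
        """Expected frequency of leading digit d (1-9) under Benford's law."""
        return math.log10(1 + 1 / d)

    def leading_digit(x: float) -> int:
        """First significant digit of a nonzero number."""
        s = f"{abs(x):.15e}"  # scientific notation: first character is the digit
        return int(s[0])

    def first_digit_freqs(values):
        """Observed relative frequencies of leading digits 1-9."""
        counts = Counter(leading_digit(v) for v in values if v != 0)
        total = sum(counts.values())
        return {d: counts.get(d, 0) / total for d in range(1, 10)}

    # Stand-in sample: powers of 2 are a classic Benford-conforming sequence.
    sample = [2.0 ** n for n in range(1, 1000)]
    obs = first_digit_freqs(sample)
    for d in range(1, 10):
        print(d, round(obs[d], 3), round(benford_expected(d), 3))
    ```

    A conformity test like the one in the abstract would then compare the observed and expected columns with a goodness-of-fit statistic (e.g. chi-squared) rather than by eye.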

  18. Medial superior olivary neurons receive surprisingly few excitatory and inhibitory inputs with balanced strength and short-term dynamics.

    Science.gov (United States)

    Couchman, Kiri; Grothe, Benedikt; Felmy, Felix

    2010-12-15

    Neurons in the medial superior olive (MSO) process microsecond interaural time differences, the major cue for localizing low-frequency sounds, by comparing the relative arrival time of binaural, glutamatergic excitatory inputs. This coincidence detection mechanism is additionally shaped by highly specialized glycinergic inhibition. Traditionally, it is assumed that the binaural inputs are conveyed by many independent fibers, but such an anatomical arrangement may decrease temporal precision. Short-term depression on the other hand might enhance temporal fidelity during ongoing activity. For the first time we show that binaural coincidence detection in MSO neurons may require surprisingly few but strong inputs, challenging long-held assumptions about mammalian coincidence detection. This study exclusively uses adult gerbils for in vitro electrophysiology, single-cell electroporation and immunohistochemistry to characterize the size and short-term plasticity of inputs to the MSO. We find that the excitatory and inhibitory inputs to the MSO are well balanced both in strength and short-term dynamics, redefining this fastest of all mammalian coincidence detector circuits.

  19. Small(pox) success?

    Science.gov (United States)

    Birn, Anne-Emanuelle

    2011-02-01

    The 30th anniversary of the World Health Organization's (WHO) official certification of smallpox eradication was marked by a slew of events hailing the campaign's dramatic tale of technological and organizational triumph against an ancient scourge. Yet commemorations also serve as moments of critical reflection. This article questions the acclaim showered upon smallpox eradication as the single greatest public health success in history. It examines how and why smallpox eradication and WHO's concurrent social justice-oriented primary health care approach (following from the Declaration of Alma-Ata) became competing paradigms. It synthesizes critiques of eradication's shortcomings and debunks some of the myths surrounding the global eradication campaign as a public health priority and necessity, and as a Cold War victory of cooperation. The article concludes with thoughts on integrating technical and social-political aspects of health within the context of welfare states as the means to achieving widespread and enduring global public health success.

20. Small(pox) success?

    Directory of Open Access Journals (Sweden)

    Anne-Emanuelle Birn

Full Text Available The 30th anniversary of the World Health Organization's (WHO) official certification of smallpox eradication was marked by a slew of events hailing the campaign's dramatic tale of technological and organizational triumph against an ancient scourge. Yet commemorations also serve as moments of critical reflection. This article questions the acclaim showered upon smallpox eradication as the single greatest public health success in history. It examines how and why smallpox eradication and WHO's concurrent social justice-oriented primary health care approach (following from the Declaration of Alma-Ata) became competing paradigms. It synthesizes critiques of eradication's shortcomings and debunks some of the myths surrounding the global eradication campaign as a public health priority and necessity, and as a Cold War victory of cooperation. The article concludes with thoughts on integrating technical and social-political aspects of health within the context of welfare states as the means to achieving widespread and enduring global public health success.

  1. Styles of success

    DEFF Research Database (Denmark)

    Dahlgaard, Jens Jørn; Nørgaard, Anders; Jakobsen, Søren

    1997-01-01

Corporate success stories tend to emphasize the "great men" theory of history. But now a European research project established the managerial attributes that can turn an ordinary leader into one ideal for the pursuit of business excellence. The emergence of five leadership styles as crucial drivers of business excellence points to a clear agenda for success. Setting clear strategic goals and the ability to take a long-term view of an organization's direction, combined with other leadership attributes such as creativity, teambuilding and learning, are principal keys to creating an excellent organization. Crucially, European employees place a markedly lower value on the teambuilding and strategic competencies than is required for business excellence. By contrast, their "ideal" leader is heavily characterized by being a creative, inspiring, and active problem-solver.

  2. Styles of success

    DEFF Research Database (Denmark)

    Dahlgaard, Jens Jørn; Nørgaard, Anders; Jakobsen, Søren

    1997-01-01

Corporate success stories tend to emphasize the "great men" theory of history. But now a European research project established the managerial attributes that can turn an ordinary leader into one ideal for the pursuit of business excellence. The emergence of five leadership styles as crucial drivers of business excellence points to a clear agenda for success. Setting clear strategic goals and the ability to take a long-term view of an organization's direction, combined with other leadership attributes such as creativity, teambuilding and learning, are principal keys to creating an excellent organization. Leaders seeking to achieve business excellence must view the high-level attainment of these sets of leadership competencies as their paramount objective. In striving for business excellence, European leaders may encounter resistance among their employees. Crucially, European employees place a markedly lower value on the teambuilding and strategic competencies than is required for business excellence.

  3. Profile of success

    DEFF Research Database (Denmark)

    Dahlgaard, Jens Jørn; Nørgaard, Anders; Jakobsen, Søren

    1998-01-01

What management skills must Europe's business leaders improve to achieve business excellence? Which country's leaders are best placed for success? Does the next generation have what it takes to compete? In the second half of their study of the leadership styles that drive business excellence, Jens Dahlgaard, Anders Nørgaard and Søren Jakobsen describe an excellent leadership profile that provides the answers.

  4. Successful time management

    CERN Document Server

    Forsyth, Patrick

    2016-01-01

    Packed with tips and techniques, Successful Time Management serves as a guide to reviewing and assessing new work practices to improve time management. It includes great time-saving ideas, practical solutions, checklists, and advice on controlling paperwork, delegating and working with others, prioritizing to focus on key issues, and getting and staying organized. This new third edition contains new practical tips on using email in a time effective manner and dealing with other internet-based tools and apps to help productivity.

  5. Principles of successful partnerships.

    Science.gov (United States)

    Mangold, Kara; Denke, Nancy J; Gorombei, Deb; Ostroski, Tammy L; Root, Lynda

    2014-01-01

    Health care providers must understand and value the unique contributions of all interdisciplinary professionals, with the goal of optimizing the wellness or illness needs of each patient. Work cannot be done in silos, and the ability to develop and sustain effective professional partnerships is essential. Health care teams must work within a complex environment that depends on the shared efforts of multiple professionals to successfully provide care in a fragmented, highly stressed system. Implementing partnerships that foster relationships through shared interests, vision, and values can aid in the coordination of resources to provide a more positive patient experience and outcome. The development of partnerships requires time and acceptance of shared risks and responsibilities. In return, involved parties will be able to build trust, share rewards, and expand the possibilities of what can be accomplished. The purpose of this review is to describe results-oriented partnerships, which include the attributes of collaboration, coordination, and communication. Essential concepts and practical tools for success are reviewed to offer new and existing partnerships a lens through which to view interdisciplinary interactions that can contribute to organizational success and longevity. Potential pitfalls that may impact patient services and organizational health are also discussed.

  6. Successful Succession in Family Businesses : Individual Level Factors and Succession Planning Models.

    OpenAIRE

    Aleem, Majid; Islam, Md. Shariful

    2009-01-01

Individual level factors related to the successor have a central role to play in the succession process of the business. When viewed in relation to succession planning models, these factors relate directly to the success or failure of the succession process. The major contributing factor to the success or failure of the succession process is the leadership provided to the organization by the predecessor. These leadership qualities...

  7. Food insecurity and mental health: surprising trends among community health volunteers in Addis Ababa, Ethiopia during the 2008 food crisis.

    Science.gov (United States)

    Maes, Kenneth C; Hadley, Craig; Tesfaye, Fikru; Shifferaw, Selamawit

    2010-05-01

The 2008 food crisis may have increased household food insecurity and caused distress among impoverished populations in low-income countries. Policy researchers have attempted to quantify the impact that a sharp rise in food prices might have on population wellbeing by asking what proportion of households would drop below conventional poverty lines given a set increase in prices. Our understanding of the impact of food crises can be extended by conducting micro-level ethnographic studies. This study examined self-reported household food insecurity (FI) and common mental disorders (CMD) among 110 community health AIDS care volunteers living in Addis Ababa, Ethiopia during the height of the 2008 food crisis. We used generalized estimating equations that account for associations between responses given by the same participants over 3 survey rounds during 2008, to model the longitudinal response profiles of FI, CMD symptoms, and socio-behavioral and micro-economic covariates. To help explain the patterns observed in the response profiles and regression results, we examine qualitative data that contextualize the cognition and reporting behavior of AIDS care volunteers, as well as potential observation biases inherent in longitudinal, community-based research. Our data show that food insecurity is highly prevalent, that it is associated with household economic factors, and that it is linked to mental health. Surprisingly, the volunteers in this urban sample did not report increasingly severe FI or CMD during the peak of the 2008 food crisis. This is a counter-intuitive result that would not be predicted in analyses of population-level data such as those used in econometrics simulations. But when these results are linked to real people in specific urban ecologies, they can improve our understanding of the psychosocial consequences of food price shocks.

  8. Surprisingly low compliance to local guidelines for risk factor based screening for gestational diabetes mellitus - A population-based study

    Directory of Open Access Journals (Sweden)

    Winkvist Anna

    2009-11-01

Full Text Available Abstract Background Screening for gestational diabetes mellitus (GDM) is routine during pregnancy in many countries in the world. The screening programs are either based on general screening offered to all pregnant women or risk factor based screening stipulated in local clinical guidelines. The aims of this study were to investigate: 1) the compliance with local guidelines of screening for GDM and 2) the outcomes of pregnancy and birth in relation to risk factors of GDM and whether or not exposed to oral glucose tolerance test (OGTT). Methods This study design was a population-based retrospective cross-sectional study of 822 women. A combination of questionnaire data and data collected from medical records was applied. Compliance with the local guidelines of risk factor based screening for GDM was examined and a comparison of outcomes of pregnancy and delivery in relation to risk factor groups for GDM was performed. Results Of the 822 participants, 257 (31.3%) women fulfilled at least one criterion for being exposed to screening for GDM according to the local clinical guidelines. However, only 79 (30.7%) of these women were actually exposed to OGTT, and of those correctly exposed for screening, seven women were diagnosed with GDM. Women developing risk factors for GDM during pregnancy had a substantially increased risk of giving birth to an infant with macrosomia. Conclusion Surprisingly low compliance with the local clinical guidelines for screening for GDM during pregnancy was found. Furthermore, the prevalence of the risk factors of GDM in our study was almost double that of previous Swedish studies. Pregnant women developing risk factors of GDM during pregnancy were found to be at substantially increased risk of giving birth to an infant with macrosomia. There is a need for actions improving compliance with the local guidelines.

  9. A new in vivo model of pantothenate kinase-associated neurodegeneration reveals a surprising role for transcriptional regulation in pathogenesis.

    Directory of Open Access Journals (Sweden)

    Varun ePandey

    2013-09-01

Full Text Available Pantothenate Kinase-Associated Neurodegeneration (PKAN) is a neurodegenerative disorder with a poorly understood molecular mechanism. It is caused by mutations in Pantothenate Kinase, the first enzyme in the Coenzyme A (CoA) biosynthetic pathway. Here, we developed a Drosophila model of PKAN (tim-fbl flies) that allows us to continuously monitor the modeled disease in the brain. In tim-fbl flies, downregulation of fumble, the Drosophila PanK homologue, in the cells containing a circadian clock results in characteristic features of PKAN such as developmental lethality, hypersensitivity to oxidative stress, and diminished life span. Despite quasi-normal circadian transcriptional rhythms, tim-fbl flies display brain-specific aberrant circadian locomotor rhythms and a unique transcriptional signature. Comparison with expression data from flies exposed to paraquat demonstrates that, as previously suggested, pathways other than oxidative stress are affected by PanK downregulation. Surprisingly, we found a significant decrease in the expression of key components of the photoreceptor recycling pathways, which could lead to retinal degeneration, a hallmark of PKAN. Importantly, these defects are not accompanied by changes in structural components of eye genes, suggesting that changes in gene expression in the eye precede and may cause the retinal degeneration. Indeed, tim-fbl flies have a diminished response to light transitions, and their altered day/night patterns of activity demonstrate defects in light perception. This suggests that retinal lesions are not solely due to oxidative stress and demonstrates a role for the transcriptional response to CoA deficiency in the defects observed in dPanK-deficient flies. Moreover, in the present study we developed a new fly model that can be applied to other diseases and that allows the assessment of neurodegeneration in the brains of living flies.

  10. Meta-Model of Resilient information System

    OpenAIRE

    Ahmed, Adnan; Hussain, Syed Shahram

    2007-01-01

The role of information systems has become very important in today’s world. It is not only business organizations that use information systems; governments also possess very critical information systems. The need is to make information systems available at all times under any situation. Information systems must have the capabilities to resist dangers to their services, performance & existence, and recover to their normal working state with the available resources in catas...

  11. Kriging metamodels and global opimization in simulation

    NARCIS (Netherlands)

    Mehdad, E.

    2015-01-01

    Simulation is a popular tool for analyzing complex systems. However, simulation models are often difficult to build and require significant time to run. We often need to invest much money and time to use a simulation model of a complex system. To benefit more from a simulation investment, we may use

  12. Control Variates and Optimal Designs in Metamodeling

    Science.gov (United States)

    2013-03-01

and a series of tasks. Henderson and Kim (2004) apply CVs to a discrete time-finite state space Markov chain. Adewunmi and Aickelin (2012) describe...going into further detail with an example in machine repair, an inventory system, and an M/M/1 queue. Fort and Moulines (2008) apply CVs to financial...are not always effective. Because of the long queue length these control variates are ineffective; additional iterations of the model will show that
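As a refresher on the technique these citations apply, a control variate reduces Monte Carlo variance by subtracting a correlated quantity whose mean is known exactly. A minimal sketch with a toy integrand (not the report's queueing or repair models):

```python
import math
import random

random.seed(1)
n = 50_000
u = [random.random() for _ in range(n)]

y = [math.exp(x) for x in u]   # target: E[e^U] = e - 1 for U ~ Uniform(0, 1)
c = [x - 0.5 for x in u]       # control variate: known mean E[C] = 0

# Optimal coefficient b* = Cov(Y, C) / Var(C), estimated from the sample
mean_y = sum(y) / n
mean_c = sum(c) / n
cov = sum((yi - mean_y) * (ci - mean_c) for yi, ci in zip(y, c)) / n
var_c = sum((ci - mean_c) ** 2 for ci in c) / n
b = cov / var_c

# Controlled estimator: subtract b * (C - E[C]) from each observation
controlled = sum(yi - b * ci for yi, ci in zip(y, c)) / n
plain = mean_y
print(f"plain = {plain:.4f}, controlled = {controlled:.4f}, true = {math.e - 1:.4f}")
```

The controlled estimate has the same expectation as the plain sample mean but a much smaller variance, because e^U and U are strongly correlated; as the snippet above notes, the reduction vanishes when the control is weakly correlated with the output.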

  13. Negotiating Successfully in Asia

    Directory of Open Access Journals (Sweden)

    Michael Benoliel

    2013-08-01

    Full Text Available Cross-cultural negotiations are complex, challenging, and difficult to navigate because much of the Asian culture is unstated, implicit, and internalized in subtle behavioral patterns. It is like an iceberg; more is invisible and less is visible. To understand how the Asian negotiation values and practices are different from those in the West, I describe briefly the Asian cultural roots, highlight the major dimensions that differentiate cultures, explore the factors that influence the Asian negotiation processes and outcomes, and provide a list of practical suggestions for negotiating successful deals with Asian negotiators.

  14. Successful product realization strategies

    Science.gov (United States)

    Peeples, John; Boulton, William R.

    1995-02-01

    Product realization is the process of defining, designing, developing, and delivering products to the market. While the main thrust of this JTEC panel was to conduct a complete investigation of the state of Japanese low-cost electronic packaging technologies, it is very difficult to totally separate the development of technology and products from the product realization process. Japan's electronics firms adhere to a product realization strategy based on a strong customer focus, a consistent commitment to excellence in design, and a cost-effective approach to technology commercialization. The Japanese product-pull strategy has been a successful driver and influencing factor in every aspect of the product development cycle.

  15. Successful innovation by motivation

    Directory of Open Access Journals (Sweden)

    Petra Koudelková

    2015-10-01

Full Text Available Innovation is one of the most important factors for business growth. Human capital plays a significant role in the success of the innovation process. This article deals with employee motivation in the innovation process, and the main scientific aim of this study is to present results of research undertaken in the Czech Republic at the beginning of 2013. Questionnaires were used for the survey, and statistical analyses such as the Chi-square test and hierarchical cluster analysis were used for data processing. This study also provides a theoretical and practical overview of business innovation in the Czech Republic.

  16. Implant success!!!.....simplified

    Directory of Open Access Journals (Sweden)

    Luthra Kaushal

    2009-01-01

Full Text Available The endeavor towards life-like restoration has helped nurture new vistas in the art and science of implant dentistry. The protocol of "restoration-driven implant placement" ensures that the implant is an apical extension of the ideal future restoration and not the opposite. Meticulous pre-implant evaluation of soft and hard tissues, diagnostic casts, and the use of an aesthetic wax-up and a radiographic template combined with a surgical template can simplify the intricate roadmap for appropriate implant treatment. By applying the harmony of artistic skill, scientific knowledge and clinical expertise, we can master outstanding implant success in the requisites of aesthetics, phonetics and function.

  17. SUCCESS OUTSIDE THE CITY

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

Coming as a much-needed boon for the countryside, this year's No.1 Document, jointly released by the Central Committee of the Communist Party of China and the State Council, pledges a series of concrete measures to propel improvements in the rural economy and the livelihood of farmers. Nevertheless, rural prosperity in line with the success of urban areas will prove to be a challenge. The foundation of the agriculture sector remains fluid, and farmers lack a stable source of income and a reliable social safety net.

  18. The surprisingly small but increasing role of international agricultural trade on the European Union’s dependence on mineral phosphorus fertiliser

    Science.gov (United States)

    Nesme, Thomas; Roques, Solène; Metson, Geneviève S.; Bennett, Elena M.

    2016-02-01

    Phosphorus (P) is subject to global management challenges due to its importance to both food security and water quality. The European Union (EU) has promoted policies to limit fertiliser over-application and protect water quality for more than 20 years, helping to reduce European P use. Over this time period, the EU has, however, become more reliant on imported agricultural products. These imported products require fertiliser to be used in distant countries to grow crops that will ultimately feed European people and livestock. As such, these imports represent a displacement of European P demand, possibly allowing Europe to decrease its apparent P footprint by moving P use to locations outside the EU. We investigated the effect of EU imports on the European P fertiliser footprint to better understand whether the EU’s decrease in fertiliser use over time resulted from P demand being ‘outsourced’ to other countries or whether it truly represented a decline in P demand. To do this, we quantified the ‘virtual P flow’ defined as the amount of mineral P fertiliser applied to agricultural soils in non-EU countries to support agricultural product imports to the EU. We found that the EU imported a virtual P flow of 0.55 Tg P/yr in 1995 that, surprisingly, decreased to 0.50 Tg P/yr in 2009. These results were contrary to our hypothesis that trade increases would be used to help the EU reduce its domestic P fertiliser use by outsourcing its P footprint abroad. Still, the contribution of virtual P flows to the total P footprint of the EU has increased by 40% from 1995 to 2009 due to a dramatic decrease in domestic P fertiliser use in Europe: in 1995, virtual P was equivalent to 32% of the P used as fertiliser domestically to support domestic consumption but jumped to 53% in 2009. Soybean and palm tree products from South America and South East Asia contributed most to the virtual P flow. These results demonstrate that, although policies in the EU have successfully
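The percentages above can be cross-checked from the reported flows; a small arithmetic sketch (the virtual-flow and ratio figures come from the abstract, while the implied domestic use is a back-calculation, not a number the study reports directly):

```python
virtual = {1995: 0.55, 2009: 0.50}            # virtual P flow, Tg P/yr (from the abstract)
ratio_to_domestic = {1995: 0.32, 2009: 0.53}  # virtual P as a fraction of domestic fertiliser P

shares = {}
for year in (1995, 2009):
    # Implied domestic fertiliser use, Tg P/yr, back-calculated from the ratio
    domestic = virtual[year] / ratio_to_domestic[year]
    # Virtual flow as a share of the total (virtual + domestic) P footprint
    shares[year] = virtual[year] / (virtual[year] + domestic)
    print(f"{year}: implied domestic use = {domestic:.2f} Tg P/yr, "
          f"virtual share of footprint = {shares[year]:.1%}")

growth = shares[2009] / shares[1995] - 1
print(f"growth in virtual share: {growth:.0%}")  # close to the ~40% increase reported
```

The share rises from roughly a quarter to over a third of the total footprint even though the absolute virtual flow fell, which is the point the abstract makes: domestic fertiliser use dropped faster than the outsourced demand.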

  19. LEIR commissioning successfully completed

    CERN Multimedia

    2006-01-01

An important milestone has been passed in the preparation of the injector complex to supply ions to the LHC experiments. The LEIR lead-ion beam, seen on one of the control screens just before the PS injection region. The Low-Energy Ion Ring - LEIR for short - has passed its first tests with flying colours. On 12 May, the ring that will accumulate lead ions for the LHC was shut down after seven months of tests (see Bulletin 44/2005). 'The commissioning phase was a resounding success,' enthuses a satisfied Michel Chanel, head of the LEIR construction project. After several months of fine-tuning, the LEIR team has achieved its aim of producing the kind of beam required for first lead-ion collisions in the LHC in 2008. This involved creating bunches containing 230 million ions, in line with the specifications for those first beams. This success can be put down to the machine's outstanding design and components. 'It's a great achievement by all the teams involved in the machine's construction,' underlines Christian...

  20. A surprise at the bottom of the main sequence: Rapid rotation and NO H-alpha emission

    Science.gov (United States)

    Basri, Gibor; Marcy, Geoffrey W.

    1995-01-01

We report Keck Observatory high-resolution echelle spectra from 640-850 nm for eight stars near the faint end of the main sequence. These spectra are the highest resolution spectra of such late-type stars, and clearly resolve the TiO, VO, and atomic lines. The sample includes the field brown-dwarf candidate, BRI 0021-0214 (M9.5+). Very unexpectedly, it shows the most rapid rotation in the entire sample, v sin i approximately 40 km/s, which is 20x faster than typical field nonemission M stars. Equally surprising is that BRI 0021 exhibits no emission or absorption at H-alpha. We argue that this absence is not simply due to its cool photosphere, but that stellar activity declines in a fundamental way at the end of the main sequence. As it is the first very late M dwarf observed at high spectral resolution, BRI 0021 may be signaling a qualitative change in the angular momentum loss rate among the lowest mass stars. Conventionally, its rapid rotation would have marked BRI 0021 as very young, consistent with the selection effect which arises if the latest-type dwarfs are really brown dwarfs on cooling curves. In any case, it is unprecedented to find no sign of stellar activity in such a rapidly rotating convective star. We also discuss the possible conflict between this observation and the extremely strong H-alpha seen in another very cool star, PC 0025+0447. Extrapolation of M-L relations for BRI 0021 yields M approximately 0.065 solar mass, and the other sample objects have expected masses near the H-burning limit. These include two Pleiades brown-dwarf candidates, four field M6 dwarfs and one late-type T Tauri star. The two Pleiades M6 dwarfs have v sin i of 26 and 37 km/s, H-alpha in emission, and radial velocities consistent with Pleiades membership.
Similarly, the late-type T Tauri star has v sin i approximately 30 km/s and H-alpha emission indicative of its

  1. LHC synchronization test successful

    CERN Multimedia

    The synchronization of the LHC's clockwise beam transfer system and the rest of CERN's accelerator chain was successfully achieved last weekend. Tests began on Friday 8 August when a single bunch of a few particles was taken down the transfer line from the SPS accelerator to the LHC. After a period of optimization, one bunch was kicked up from the transfer line into the LHC beam pipe and steered about 3 kilometres around the LHC itself on the first attempt. On Saturday, the test was repeated several times to optimize the transfer before the operations group handed the machine back for hardware commissioning to resume on Sunday. The anti-clockwise synchronization systems will be tested over the weekend of 22 August.Picture:http://lhc-injection-test.web.cern.ch/lhc-injection-test/

  2. Iridium: failures & successes

    Science.gov (United States)

Christensen, Carissa Bryce; Beard, Suzette

    2001-03-01

    This paper will provide an overview of the Iridium business venture in terms of the challenges faced, the successes achieved, and the causes of the ultimate failure of the venture — bankruptcy and system de-orbit. The paper will address technical, business, and policy issues. The intent of the paper is to provide a balanced and accurate overview of the Iridium experience, to aid future decision-making by policy makers, the business community, and technical experts. Key topics will include the history of the program, the objectives and decision-making of Motorola, the market research and analysis conducted, partnering strategies and their impact, consumer equipment availability, and technical issues — target performance, performance achieved, technical accomplishments, and expected and unexpected technical challenges. The paper will use as sources trade media and business articles on the Iridium program, technical papers and conference presentations, Wall Street analyst's reports, and, where possible, interviews with participants and close observers.

  3. SUCCESS OUTSIDE THE CITY

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

Coming as a much-needed boon for the countryside, this year’s No.1 Document, jointly released by the Central Committee of the Communist Party of China and the State Council, pledges a series of concrete measures to propel improvements in the rural economy and the livelihood of farmers. Nevertheless, rural prosperity in line with the success of urban areas will prove to be a challenge. The foundation of the agriculture sector remains fluid, and farmers lack a stable source of income and a reliable social safety net. So how can China promote the sustained development of its agriculture sector and social improvements in the countryside? Experts and economists provide suggestions for achieving these goals.

  4. Fluoridation: strategies for success.

    Science.gov (United States)

    Isman, R

    1981-07-01

    Of 19 referenda on community water fluoridation held in the first six months of 1980, 17 were defeated. Among the postulated reasons are a growing distrust of government and the health establishment. The public remains largely ignorant of the purpose and benefits of fluoridation. The emotionalism surrounding the issue has made it difficult to generate public support outside of the health professions. Opponents have also learned to fight fluoridation with increasingly sophisticated techniques. Some of the strategies used in recent successful campaigns in Oakland, California, and Portland, Oregon are described; recommendations that can be applied to communities considering fluoridation include careful wording of ballot measures so they are unequivocally clear and simple; timing ballot measures with elections likely to draw the largest voter turnout; broadening the base of political and financial support; using a figurehead if possible; and making maximum use of the media.

  5. A metric for success

    Science.gov (United States)

    Carver, Gary P.

    1994-05-01

    The federal agencies are working with industry to ease adoption of the metric system. The goal is to help U.S. industry compete more successfully in the global marketplace, increase exports, and create new jobs. The strategy is to use federal procurement, financial assistance, and other business-related activities to encourage voluntary conversion. Based upon the positive experiences of firms and industries that have converted, federal agencies have concluded that metric use will yield long-term benefits that are beyond any one-time costs or inconveniences. It may be time for additional steps to move the Nation out of its dual-system comfort zone and continue to progress toward metrication. This report includes 'Metric Highlights in U.S. History'.

  6. Overcome barriers to career success

    Energy Technology Data Exchange (ETDEWEB)

    Raudsepp, E.

    1983-04-01

A test is given to determine if an engineer suffers from one of the three barriers to technical success: fear of success, fear of failure, or perfectionism. As in most such tests, the middle way is best. Successful engineers know that perfection cannot be attained, that they don't have time to worry about failure or success, and that by aiming and persevering in doing things well, success can be achieved.

  7. Surprising Impact of Remote Groups on the Folding-Unfolding and Dimer-Chain Equilibria of Bifunctional H-Bonding Unimers

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Rui; Cheng, Shuang; Baker, Erin Shammel; Smith, Richard D.; Zeng, Xiao Cheng; Gong, Bing

    2016-01-28

    Oligoamide 1, consisting of two H-bonding units linked by a trimethylene linker, was previously found to form a very stable, folded dimer. In this work, replacing the side chains and end groups of 1 led to derivatives that show the surprising impact of end groups on the folding and dimer-chain equilibria of the resultant molecules.

  8. Success and adaptation

    CERN Multimedia

    2013-01-01

    Yesterday morning, the last colliding proton beams of 2013 were extracted from the LHC, heralding the start of the machine’s first long shutdown (LS1) and crowning its first three glorious years of running. I hardly need to tell the CERN community what a fabulous performance all the people running the machine, the experiments, the computing and all supporting infrastructures put in. Those people are you, and you all know very well what a great job everyone did.   Nevertheless, I would like to express my thanks to all the people who made this first LHC run such a success. Re-measuring the whole Standard Model in such a short period, and then completing it with the discovery of what looks increasingly like the Higgs boson, is no mean feat. What I’d like to focus on today is another aspect of our field: its remarkable ability to adapt. When I started out in research, experiments involved a handful of people and lasted a few years at most. The timescale for the development of ...

  9. Assessing call centers’ success:

    Directory of Open Access Journals (Sweden)

    Hesham A. Baraka

    2013-07-01

    This paper introduces a model to evaluate the performance of call centers based on the DeLone and McLean Information Systems success model. A number of indicators are identified to track the call center's performance. A mapping of the proposed indicators to the six dimensions of the D&M model is presented. A Weighted Call Center Performance Index is proposed to assess call center performance, and the index is used to analyze the effect of the identified indicators. A policy-weighted approach was used to assign the weights, with an analysis of different weights for each dimension. The analysis of the different weighting cases gave priority to the user satisfaction and net benefits dimensions as the two outcomes of the system; among the input dimensions, higher priority was given to the system quality and service quality dimensions. Call center decision makers can use the tool to tune the different weights in order to reach the objectives set by the organization. Multiple linear regression analysis was used to provide linear formulas for the user satisfaction and net benefits dimensions, so that the values of these two dimensions can be forecast as functions of the other dimensions.
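
    As a hedged illustration of how such a weighted index can be computed, the sketch below combines six dimension scores with policy weights. The dimension scores, the weight values, and the `weighted_index` helper are all hypothetical, not taken from the paper.

```python
# Hypothetical sketch of a weighted performance index over the six
# DeLone & McLean dimensions; scores and weights are illustrative,
# not the values used in the paper.

# Normalized dimension scores on a 0-1 scale (illustrative values).
scores = {
    "system_quality": 0.82,
    "information_quality": 0.75,
    "service_quality": 0.88,
    "use": 0.70,
    "user_satisfaction": 0.91,
    "net_benefits": 0.85,
}

# Policy-weighted scheme: the outcome dimensions (user satisfaction,
# net benefits) and the system/service quality inputs get higher
# weights, mirroring the priorities reported in the abstract.
weights = {
    "system_quality": 0.20,
    "information_quality": 0.10,
    "service_quality": 0.20,
    "use": 0.10,
    "user_satisfaction": 0.20,
    "net_benefits": 0.20,
}

def weighted_index(scores, weights):
    """Weighted sum of dimension scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in scores)

print(round(weighted_index(scores, weights), 3))
```

    Decision makers would then re-run the index under alternative weight vectors to see which dimensions dominate the overall score.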

  10. Anatomy of a hostage rescue: what makes hostage rescue operations successful?

    OpenAIRE

    Perez, Carlos M.

    2004-01-01

    Approved for public release; distribution is unlimited. This thesis develops a theory to determine the best execution time to conduct a hostage rescue attempt. It does so by explaining the phenomenon of a hostage crisis biorhythm and proposing four principles essential for success. The principles of hostage rescue operations presented in this thesis and used in the biorhythm model - surprise, intelligence, operator's skill, and deception - are derived from looking at numerous planning models fr...

  11. Verbal Behavior and Courtroom Success.

    Science.gov (United States)

    Parkinson, Michael G.

    1981-01-01

    Identifies characteristics of successful courtroom speech for prosecuting attorneys, defense attorneys, and accuseds using computer-based content analysis and rater judgments of verbal behaviors. Demonstrates that verbal aggression is an important factor for successful prosecutors, equivocation is important to success for defense attorneys, and…

  12. Building Successful Information Systems – a Key for Successful Organization

    Directory of Open Access Journals (Sweden)

    Doina ROSCA

    2010-12-01

    Full Text Available An Information System (IS) can have a major impact on corporate strategy and organizational success. The involvement of managers and decision makers in all aspects of information systems is a major factor for organizational success, including higher profits and lower costs. Some of the benefits business organizations seek to achieve through information systems include: better safety, competitive advantage, fewer errors, greater accuracy, higher-quality products, improved communications, increased efficiency and productivity, more efficient administration, and superior financial and managerial decision making.

  13. Implementing Successful Geoscience Education and Outreach Efforts

    Science.gov (United States)

    Braile, L. W.

    2004-12-01

    Successful geoscience Education and Outreach (E&O) efforts associated with a research program benefit from effective planning and a commitment by scientists/researchers to become more knowledgeable about and involved in education. Several suggested strategies have evolved based on experience in Earth science E&O with K-16 educators and students during the past 10 years. E&O programs and materials should be developed at appropriate levels ("start from where they're at") and utilize information, skills and topics that are most relevant to students and teachers. Hands-on and inquiry-based activities that teach or reinforce fundamental science understanding and skills, while introducing new topics, results and discoveries, are particularly effective. It is useful to design materials that can provide for a range of time commitment, level of technical skills, and effort, so that introductory to in-depth curriculum units can be implemented. Use of the Internet and working with teachers can be effective methods for dissemination and taking advantage of a "multiplying factor". Obtaining feedback and evaluation of the programs and developed materials, and connecting the materials to national or state education standards are also highly recommended. Most importantly, scientists should become more involved in the science education community. Attending and presenting papers at appropriate science education sessions or workshops, or state or national science teacher meetings (the annual National Science Teachers Association convention is an excellent place to start) can be a significant educational experience for the scientist/researcher. Effective geoscience E&O programs have significant potential for enhancing K-16 education and scientific literacy, and can help attract students to the sciences. Perhaps surprisingly, these efforts have substantial positive impact on the scientist/researcher as well.

  14. Defining and Measuring Academic Success

    Directory of Open Access Journals (Sweden)

    Travis T. York

    2015-03-01

    Full Text Available Despite, and perhaps because of, its amorphous nature, the term 'academic success' is one of the most widely used constructs in educational research and assessment within higher education. This paper conducts an analytic literature review to examine the use and operationalization of the term in multiple academic fields. Dominant definitions of the term are conceptually evaluated using Astin's I-E-O model, resulting in the proposition of a revised definition and new conceptual model of academic success. Measurements of academic success found throughout the literature are presented in accordance with the presented model of academic success. These measurements are provided with details in a user-friendly table (Appendix B). Results also indicate that grades and GPA are the most commonly used measures of academic success. Finally, recommendations are given for future research and practice to increase effective assessment of academic success.

  15. Successful cognitive and emotional aging

    OpenAIRE

    Jeste, Dilip V.; Depp, Colin A.; Vahia, Ipsit V.

    2010-01-01

    We review the definitions, determinants, and ways of enhancing successful cognitive and emotional aging. Objective definitions of successful aging based on physical health emphasize outcomes including freedom from disability and disease, whereas subjective definitions center on well-being, social connectedness, and adaptation. Most older people do not meet objective criteria for successful aging, while a majority meet the subjective criteria. Older people with severe mental ...

  16. Succession planning and individual development.

    Science.gov (United States)

    Goudreau, Kelly A; Hardy, Jacalyn

    2006-06-01

    The authors present a framework for a succession planning and individual development initiative implemented in a Veterans Health Administration facility. Foundational strategic goals and a conceptual framework in the Veterans Affairs system provide the structure for the 3 facility-level succession planning and individual development programs. Outcomes of the programs are promising with 2 of 3 programs demonstrating clear succession planning outcomes and the other one showing positive preliminary results.

  17. SUCCESS@Seneca: Facilitating Student and Staff Success

    Science.gov (United States)

    Fishman, Steve; Decandia, Lisa

    2006-01-01

    SUCCESS@Seneca has teamed up with the General Arts and Science programs at Seneca's Newnham campus. The design of an integrated service delivery model addresses numerous student success and retention related activities by providing the essential connection between academics and college resources. The program focuses on the promotion and support of…

  18. Evolution, Appearance, and Occupational Success

    National Research Council Canada - National Science Library

    Little, Anthony C; Roberts, S. Craig

    2012-01-01

    .... In this article, we review evidence linking physical appearance to occupational success and evaluate the hypothesis that appearance based biases are consistent with predictions based on evolutionary...

  19. Discuss on Approximate Optimization Strategies Using Design of Computer Experiments and Metamodels for Flight Vehicle Design

    Institute of Scientific and Technical Information of China (English)

    龙腾; 刘建; WANG G Gary; 刘莉; 史人赫; 郭晓松

    2016-01-01

    High-fidelity analysis models are widely used in modern flight vehicle design optimization to improve design credibility and overall performance, but they also introduce severe computational complexity. To alleviate the computational burden, approximate optimization strategies based on design of computer experiments (DoCE) and metamodels have become a research focus for flight vehicles. These strategies construct suitable approximation models that guide the optimization process to converge rapidly to the optimum, thereby reducing computational cost and shortening the design cycle. Based on an extensive literature survey, the state of the art of approximate optimization strategies for flight vehicles is discussed in detail. The definition, solution procedure, characteristics, and key enabling techniques of approximate optimization strategies are presented, and methods for design of computer experiments, metamodeling, accuracy validation, and metamodel selection are reviewed. For the two classes of strategies, static and adaptive, typical metamodel management and updating schemes and convergence criteria are examined in depth. For multidisciplinary design optimization of flight vehicles, the technical characteristics of approximate optimization strategies and decomposition strategies are compared in terms of solution efficiency and convergence. The key techniques are compared and summarized in detail through numerical examples, the overall performance of approximate optimization strategies is assessed on engineering flight vehicle design cases, and the applicable scope of the different strategies is identified. The results show that approximate optimization strategies for flight vehicles offer significant advantages in optimization efficiency, global convergence, and robustness, and thus have promising engineering applications. Future research directions for approximate optimization strategies are also indicated.
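
    The workflow surveyed above (design of computer experiments, metamodel fitting, adaptive updating) can be sketched in a few lines. The test function, the one-dimensional Latin-hypercube sampler, the quadratic surrogate, and the update rule below are illustrative assumptions, not the specific methods reviewed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive(x):
    # Stand-in for a costly simulation (e.g., a CFD run): 1-D test function.
    return (x - 0.3) ** 2 + 0.1 * np.sin(20 * x)

def lhs_1d(n):
    # Latin hypercube sample on [0, 1]: one random point per stratum.
    return (np.arange(n) + rng.random(n)) / n

def fit_quadratic(X, y):
    # Least-squares quadratic surrogate y ~ a*x^2 + b*x + c.
    A = np.vstack([X**2, X, np.ones_like(X)]).T
    return np.linalg.lstsq(A, y, rcond=None)[0]

# Static step: sample the design space and evaluate the expensive model.
X = lhs_1d(8)
y = expensive(X)

# Adaptive loop: refit the surrogate, then add its minimizer as a new
# sample point (a simple metamodel management/update strategy).
for _ in range(5):
    a, b, c = fit_quadratic(X, y)
    grid = np.linspace(0.0, 1.0, 201)
    x_new = grid[np.argmin(a * grid**2 + b * grid + c)]
    X = np.append(X, x_new)
    y = np.append(y, expensive(x_new))

best = X[np.argmin(y)]
print(f"best x = {best:.3f}")
```

    Kriging or radial-basis surrogates would replace the quadratic fit in practice; the point of the sketch is only the sample-fit-update cycle that drives convergence at a fraction of the full-model cost.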

  20. Succession Planning for Library Instruction

    Science.gov (United States)

    Sobel, Karen; Drewry, Josiah

    2015-01-01

    Detailed succession planning helps libraries pass information from one employee to the next. This is crucial in preparing for hiring, turnover, retirements, training of graduate teaching assistants in academic libraries, and other common situations. The authors of this article discuss succession planning for instruction programs in academic…

  1. RE 05: engineering successful products

    NARCIS (Netherlands)

    Atlee, Joanne M.; Wieringa, Roel J.

    2006-01-01

    At the Requirements Engineering conference series, researchers and practitioners exchange experiences, discuss problems, and propose solutions. The theme of RE 05--Engineering Successful Products--reflects the understanding that high-quality requirements are at the heart of successful products. To b

  3. Succession planning: securing the future.

    Science.gov (United States)

    Bolton, Julia; Roy, Wendy

    2004-12-01

    Succession planning is an essential business strategy for healthcare organizations, given the impending retirement of the huge Baby Boomer cohort of experienced nurses. It ensures that there will be qualified candidates when key vacancies occur. The authors describe the critical elements of a succession plan and suggest ways to implement them. The model can be applied to leadership and clinical positions.

  4. Success Analysis for School Administrators.

    Science.gov (United States)

    Carranza, Elihu

    The success analysis approach consists of individual and group exercises through which participants, in sharing meaning-centered experiences, identify their strengths and needs and clarify values operative in both past and expected successes. This approach constitutes a rational and practical perspective whereby students, teachers, staff, and…

  5. Success and Women's Career Adjustment.

    Science.gov (United States)

    Russell, Joyce E. A.; Burgess, Jennifer R. D.

    1998-01-01

    Women still face barriers to career success and satisfaction: stereotypes, assumptions, organizational culture, human resource practices, and lack of opportunities. Despite individual and organizational strategies, many women leave to become entrepreneurs. There is a need to investigate how women define career success. (SK)

  7. Moderate rates of late Quaternary slip along the northwestern margin of the Basin and Range Province, Surprise Valley fault, northeastern California

    Science.gov (United States)

    Personius, Stephen F.; Crone, Anthony J.; Machette, Michael N.; Mahan, Shannon; Lidke, David J.

    2009-01-01

    The 86-km-long Surprise Valley normal fault forms part of the active northwestern margin of the Basin and Range province in northeastern California. We use trench mapping and radiocarbon, luminescence, and tephra dating to estimate displacements and timing of the past five surface-rupturing earthquakes on the central part of the fault near Cedarville. A Bayesian OxCal analysis of timing constraints indicates earthquake times of 18.2 ± 2.6, 10.9 ± 3.2, 8.5 ± 0.5, 5.8 ± 1.5, and 1.2 ± 0.1 ka. These data yield recurrence intervals of 7.3 ± 4.1, 2.5 ± 3.2, 2.7 ± 1.6, and 4.5 ± 1.5 ka and an elapsed time of 1.2 ± 0.1 ka since the latest surface-rupturing earthquake. Our best estimate of the latest Quaternary vertical slip rate is 0.6 ± 0.1 mm/a. This late Quaternary rate is remarkably similar to long-term (8-14 Ma) minimum vertical slip rates (>0.4-0.5 ± 0.3 mm/a) calculated from recently acquired seismic reflection and chronologic and structural data in Surprise Valley and the adjacent Warner Mountains. However, our slip rate yields estimates of extension that are lower than recent campaign GPS determinations by factors of 1.5-4 unless the fault has an unusually shallow (30°-35°) dip, as suggested by recently acquired seismic reflection data. Coseismic displacements of 2-4.5 ± 1 m documented in the trench and probable rupture lengths of 53-65 km indicate a history of latest Quaternary earthquakes of M 6.8-7.3 on the central part of the Surprise Valley fault.
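
    The arithmetic behind the quoted recurrence intervals can be sketched as successive differences of the mean event ages. The cumulative-slip figure used for the rate below is an assumed illustrative value (not from the abstract), and the published intervals also propagate the dating uncertainties, so they differ slightly from these raw differences.

```python
# Back-of-the-envelope check of the trench chronology. Event ages are
# the mean estimates quoted in the abstract; the 10 m cumulative slip
# is a hypothetical illustrative value.
event_times_ka = [18.2, 10.9, 8.5, 5.8, 1.2]  # mean event ages, ka

# Recurrence intervals: differences between successive mean ages.
intervals = [a - b for a, b in zip(event_times_ka, event_times_ka[1:])]
print([round(i, 1) for i in intervals])

# Mean vertical slip rate: total displacement over elapsed time.
total_slip_m = 10.0  # assumed cumulative displacement (illustrative)
elapsed_ka = event_times_ka[0] - event_times_ka[-1]
rate_mm_per_a = total_slip_m * 1000 / (elapsed_ka * 1000)
print(round(rate_mm_per_a, 2))
```

    With ~10 m of slip over ~17 ka the rate comes out near the abstract's 0.6 ± 0.1 mm/a; the Bayesian treatment in the paper shifts two of the raw intervals by about 0.1 ka.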

  8. Surprisal analysis of transcripts expression levels in the presence of noise: a reliable determination of the onset of a tumor phenotype.

    Directory of Open Access Journals (Sweden)

    Ayelet Gross

    Full Text Available Towards a reliable identification of the onset in time of a cancer phenotype, changes in transcription levels in cell models were tested. Surprisal analysis, an information-theoretic approach grounded in thermodynamics, was used to characterize the expression levels of mRNAs as they changed over time. Surprisal analysis provides a very compact representation of the measured expression levels of many thousands of mRNAs in terms of very few (three or four) transcription patterns. The patterns, each a collection of transcripts that respond together, can be assigned definite biological phenotypic roles. We identify a transcription pattern that is a clear marker of eventual malignancy. The weight of each transcription pattern is determined by surprisal analysis. The weight of this pattern changes with time; it is never strictly zero, but it is very low at early times and then rises rather suddenly. We suggest that the low weights at early time points are primarily due to experimental noise, and we develop the necessary formalism to determine at what point in time the value of that pattern becomes reliable. Beyond the point in time when a pattern is deemed reliable, the data show that the pattern remains reliable. We suggest that this allows a determination of the presence of a cancer forewarning. We apply the same formalism to the weights of the transcription patterns that account for healthy cell pathways, such as apoptosis, that need to be switched off in cancer cells, and we show that their weights eventually fall below the threshold. Lastly, we discuss patient heterogeneity as an additional source of fluctuation and show how to incorporate it within the developed formalism.
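
    A minimal sketch of the surprisal-style decomposition is shown below, using a plain SVD of a synthetic log-expression matrix as the pattern-extraction step. The matrix sizes, the weight profile, and the noise level are all invented for illustration; the paper's formalism adds maximum-entropy constraints and a rigorous noise threshold on top of this kind of decomposition.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log-expression matrix: 200 transcripts x 6 time points.
# Signal = one dominant baseline pattern plus a weak second pattern
# whose weight rises sharply at late times (mimicking an emerging
# phenotype), plus small measurement noise. All values illustrative.
n_genes, n_times = 200, 6
baseline = rng.normal(5.0, 1.0, n_genes)
pattern = rng.normal(0.0, 1.0, n_genes)
weight = np.array([0.0, 0.0, 0.1, 0.5, 1.5, 3.0])  # rises "suddenly"
lnX = (baseline[:, None]
       + pattern[:, None] * weight[None, :]
       + rng.normal(0.0, 0.2, (n_genes, n_times)))

# Surprisal-style decomposition via SVD: columns of U are transcript
# patterns G_alpha, and lam[alpha, t] = s[alpha] * Vt[alpha, t] is the
# time-dependent weight of pattern alpha.
U, s, Vt = np.linalg.svd(lnX, full_matrices=False)
lam = s[:, None] * Vt

# "Very few patterns" claim: in this construction the first two
# components capture nearly all of the variance; the remaining
# components are essentially the noise floor.
frac = (s[:2] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by 2 patterns: {frac:.4f}")
```

    The reliability question in the paper then reduces to deciding when a pattern's weight `lam[alpha, t]` rises clearly above the noise-dominated components.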

  9. SUCCESSION MANAGEMENT: UPAYA HUMAN RESOURCE PLANNING MENUJU SUCCESS CORPORATE

    Directory of Open Access Journals (Sweden)

    Rini Kuswati

    2010-06-01

    Succession management creates a more flexible and dynamic approach to preparing future executives and having the necessary leadership ready to meet the business challenges of the remainder of the decade and beyond. It allows corporate leadership to instill a more dynamic process that is easier to integrate with the firm’s strategic initiatives. It better aligns organizational thinking with the external environment, where discontinuities make it necessary to anticipate the full spectrum of change that a corporation will confront. It is a leadership and succession philosophy that focuses on developing the creativity and flexibility that allow for a more rapid response to change. Succession management is thus one way to become a successful corporation.

  10. Succession Planning in Australian Farming

    Directory of Open Access Journals (Sweden)

    John Hicks

    2012-11-01

    Full Text Available The theme of this paper is that succession planning in Australian farming is under-developed. It may be linked to economic and social change, which suggests that farmers need to adapt to generational change, but this is being resisted or ignored. The implications of this are the slow decline of family farming, a poor transfer of skills and knowledge to subsequent generations of farmers in some parts of the agricultural sector, and the potential for an extension of the financial services industry to develop a more effective raft of succession planning measures to mitigate the effects of a traditional approach to succession in agriculture.

  11. Aesthetic Surprise of Artistic Expression in Chinese Ancient Literature

    Institute of Scientific and Technical Information of China (English)

    李潇云

    2015-01-01

    Aesthetic surprise is an appreciative response to the wondrous beauty of literature and art, and even of extraordinary natural beauty. It is related to the ideological content being expressed but leans toward artistic form and artistic expression, and it is also closely bound up with aesthetic psychology and aesthetic experience. Its principal markers are aesthetic-psychological effects such as being "surprising," "astonishing," or "soul-stirring," and its causes can be analyzed broadly from four aspects: language, rhetoric, imagery, and artistic conception.

  12. Predicting Success in Elementary Algebra

    Science.gov (United States)

    Mogull, R. G.; Rosengarten, W., Jr.

    1974-01-01

    The purpose of this study was to develop a device for predicting student success in a high school Elementary Algebra course. It was intended to assist guidance counselors in advising students in selecting the most appropriate mathematics course. (Editor)

  13. Leadership for Successful School Desegregation.

    Science.gov (United States)

    Sommerville, Joseph C.

    1980-01-01

    A review of the literature and questionnaire responses shows that superintendents who successfully meet the challenge of school desegregation assess the situation of each school, provide for communication, focus on potential student benefits, and demonstrate personal commitment. (Author/MLF)

  14. Human capital and career success

    DEFF Research Database (Denmark)

    Frederiksen, Anders; Kato, Takao

    Denmark’s registry data provide accurate and complete career history data along with detailed personal characteristics (e.g., education, gender, work experience, tenure and others) for the population of Danish workers longitudinally. By using such data from 1992 to 2002, we provide rigorous evidence for the first time for the population of workers in an entire economy (as opposed to case study evidence) on the effects of the nature and scope of human capital on career success (measured by appointments to top management). First, we confirm the beneficial effect of acquiring general human capital formally through schooling for career success, as well as the gender gap in career success rates. Second, broadening the scope of human capital by experiencing various occupations (becoming a generalist) is found to be advantageous for career success. Third, initial human capital earned through...

  15. Success Factors of PLM Projects

    Institute of Scientific and Technical Information of China (English)

    谭月梅; 李莉敏; 胡庆夕; 方明伦

    2004-01-01

    Product Lifecycle Management (PLM) is a business strategy beginning to gain wide acceptance. Concerning the implementation of PLM projects, six success factors are presented and analyzed in details in this paper.

  16. The neurobiology of successful abstinence.

    Science.gov (United States)

    Garavan, H; Brennan, K L; Hester, R; Whelan, R

    2013-08-01

    This review focuses on the neurobiological processes involved in achieving successful abstinence from drugs of abuse. While there is clinical and public health value in knowing if the deficits associated with drug use correct with abstinence, studying the neurobiology that underlies successful abstinence can also illuminate the processes that enable drug-dependent individuals to successfully quit. Here, we review studies on human addicts that assess the neurobiological changes that arise with abstinence and the neurobiological predictors of successfully avoiding relapse. The literature, while modest in size, suggests that abstinence is associated with improvement in prefrontal structure and function, which may underscore the importance of prefrontally mediated cognitive control processes in avoiding relapse. Given the implication that the prefrontal cortex may be an important target for therapeutic interventions, we also review evidence indicating the efficacy of cognitive control training for abstinence.

  17. Success Teaching Spelling with Music.

    Science.gov (United States)

    Martin, Mariellen

    1983-01-01

    A spelling approach which incorporates music on a cassette with spelling, pronunciation, and definition of specific words was successful in improving junior high learning disabled students' spelling performance, self-esteem, and sequential memories. (CL)

  18. The characteristics of successful entrepreneurs

    National Research Council Canada - National Science Library

    Pokrajcic, Dragana

    2004-01-01

    .... The psychological characteristic theory of entrepreneur argues that successful entrepreneurs possess certain personality traits that mark them out as special, and tries to determine and to evaluate these special traits...

  20. Organizational Climate for Successful Aging.

    Science.gov (United States)

    Zacher, Hannes; Yang, Jie

    2016-01-01

    Research on successful aging at work has neglected contextual resources such as organizational climate, which refers to employees' shared perceptions of their work environment. We introduce the construct of organizational climate for successful aging (OCSA) and examine it as a buffer of the negative relationship between employee age and focus on opportunities (i.e., beliefs about future goals and possibilities at work). Moreover, we expected that focus on opportunities, in turn, positively predicts job satisfaction, organizational commitment, and motivation to continue working after official retirement age. Data came from 649 employees working in 120 companies (M age = 44 years, SD = 13). We controlled for organizational tenure, psychological climate for successful aging (i.e., individuals' perceptions), and psychological and organizational age discrimination climate. Results of multilevel analyses supported our hypotheses. Overall, our findings suggest that OCSA is an important contextual resource for successful aging at work.

  1. Building a Successful Technology Cluster

    Science.gov (United States)

    Silicon Valley is the iconic cluster—a dense regional network of companies, universities, research institutions, and other stakeholders involved in a single industry. Many regions have sought to replicate the success of Silicon Valley, which has produced technological innov...

  2. Biosphere reserves: Attributes for success.

    Science.gov (United States)

    Van Cuong, Chu; Dart, Peter; Hockings, Marc

    2017-03-01

    Biosphere reserves established under the UNESCO Man and the Biosphere Program aim to harmonise biodiversity conservation and sustainable development. Concerns over the extent to which the reserve network was living up to this ideal led to the development of a new strategy in 1995 (the Seville Strategy) to enhance the operation of the network of reserves. An evaluation of effectiveness of management of the biosphere reserve network was called for as part of this strategy. Expert opinion was assembled through a Delphi Process to identify successful and less successful reserves and investigate common factors influencing success or failure. Ninety biosphere reserves including sixty successful and thirty less successful reserves in 42 countries across all five Man and the Biosphere Program regions were identified. Most successful sites are the post-Seville generation while the majority of unsuccessful sites are pre-Seville that are managed as national parks and have not been amended to conform to the characteristics that are meant to define a biosphere reserve. Stakeholder participation and collaboration, governance, finance and resources, management, and awareness and communication are the most influential factors in the success or failure of the biosphere reserves. For success, the biosphere reserve concept needs to be clearly understood and applied through landscape zoning. Designated reserves then need a management system with inclusive good governance, strong participation and collaboration, adequate finance and human resource allocation and stable and responsible management and implementation. All rather obvious but it is difficult to achieve without commitment to the biosphere reserve concept by the governance authorities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Enterprise architecture for business success

    CERN Document Server

    Wijegunaratne, Inji; Evans-Greenwood, Peter

    2014-01-01

    Enterprise Architecture (EA) has evolved to become a prominent presence in today's information systems and technology landscape. The EA discipline is rich in frameworks, methodologies, and the like. However, the question of 'value' for business professionals remains largely unanswered - that is, how best can Enterprise Architecture and Enterprise Architects deliver value to the enterprise? Enterprise Architecture for Business Success answers this question. Enterprise Architecture for Business Success is primarily intended for IT professionals working in the area of Enterprise Architectu

  4. Geometry success in 20 mins

    CERN Document Server

    Editors, LearningExpress

    2010-01-01

    Whether you're new to geometry or just looking for a refresher, this completely revised and updated third edition of Geometry Success in 20 Minutes a Day offers a 20-step lesson plan that provides quick and thorough instruction in practical, critical skills. Stripped of unnecessary math jargon but bursting with geometry essentials, Geometry Success in 20 Minutes a Day is an invaluable resource for both students and adults.

  5. The characteristics of successful entrepreneurs

    Directory of Open Access Journals (Sweden)

    Pokrajčić Dragana M.

    2004-01-01

    Full Text Available This paper examines the economic, psychological and social-behavioral theories of the entrepreneur in order to determine the characteristics of a successful entrepreneur. The major contribution of economic theories of the entrepreneur is a better understanding of the entrepreneur and his/her role in economic development. The psychological characteristic theory of the entrepreneur argues that successful entrepreneurs possess certain personality traits that mark them out as special, and tries to determine and evaluate these special traits. The social-behavioral theories stress the influence of experience, knowledge, social environment and the ability to learn on the entrepreneur’s success, as well as his/her personality traits. None of the examined theories gives a satisfactory explanation of the entrepreneur’s success on its own, but taken together they can explain its key factors. The entrepreneur’s success comes about as a result of his/her personality traits, ability to learn from experience and ability to adjust to his/her environment.

  6. Small bowel obstruction - a surprise.

    Science.gov (United States)

    Mathew, Jeffrey Daniel; Cp, Ganesh Babu; M, Balachandar; M, Ramanathan

    2015-01-01

    Trans-omental hernia is very rare, accounting for 1-4% of all internal hernias, and is an unusual cause of small bowel obstruction. Here we present a case report of a small bowel obstruction in a female due to trans-omental hernia, presenting with central abdominal pain, distension and bilious vomiting. She had no previous history of trauma or surgery. Plain X-ray abdomen erect showed multiple air fluid levels with dilated small bowel loops. Emergency laparotomy revealed a segment of congested small bowel loop (ileum) through a defect in the greater omentum. On table, the herniated bowel loop was reduced and the defect in the greater omentum was closed primarily. There was no necessity for bowel resection as it regained normal colour after reduction. The postoperative period was uneventful with complete resolution of symptoms. This case is presented for its rarity and its importance in the clinical differential diagnosis of acute abdomen due to small bowel obstruction.

  7. Surprising Beauty in Technical Photography

    Science.gov (United States)

    Davidhazy, Andrew

    2009-01-01

    The Imaging and Photographic Technology area, in which the author teaches, is an applications- and technology-oriented photography program designed to prepare students for work in technical, corporate, industrial, and scientific environments. One day, the author received an e-mail message from an editor who had found his Web site and thought he…

  8. Surprised by the Parimutuel Odds?

    DEFF Research Database (Denmark)

    Ottaviani, Marco; Sørensen, Peter Norman

    2009-01-01

    Empirical analyses of parimutuel betting markets have documented that market probabilities of favorites (longshots) tend to underestimate (overestimate) the corresponding empirical probabilities. We argue that this favorite-longshot bias is consistent with bettors taking simultaneous positions on the basis of private information about the likelihood of different outcomes. The ex post realization of a high market probability indicates favorable information about the occurrence of an outcome -- and the opposite is true for longshots. This explanation for the bias relies on the bettors' inability...

  9. Conversation Simulation and Sensible Surprises

    Science.gov (United States)

    Hutchens, Jason L.

    I have entered the Loebner Prize five times, winning the "most humanlike program" category in 1996 with a surly ELIZA-clone named HeX, but failed to repeat the performance in subsequent years with more sophisticated techniques. Whether this is indicative of an unanticipated improvement in "conversation simulation" technology, or whether it highlights the strengths of ELIZA-style trickery, is left as an exercise for the reader. In 2000, I was invited to assume the role of Chief Scientist at Artificial Intelligence Ltd. (Ai) on a project inspired by the advice given by Alan Turing in the final section of his classic paper - our quest was to build a "child machine" that could learn and use language from scratch. In this chapter, I will discuss both of these experiences, presenting my thoughts regarding the Chinese Room argument and Artificial Intelligence (AI) in between.

  10. Some new surprises in chaos

    Energy Technology Data Exchange (ETDEWEB)

    Bunimovich, Leonid A., E-mail: bunimovh@math.gatech.edu [ABC Program, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); School of Mathematics, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); Vela-Arevalo, Luz V., E-mail: luzvela@math.gatech.edu [School of Mathematics, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States)

    2015-09-15

    A brief review is presented of some recent findings in the theory of chaotic dynamics. We also prove a statement that can naturally be considered dual to the Poincaré theorem on recurrences. Numerical results demonstrate that some parts of the phase space of chaotic systems are more likely to be visited earlier than other parts. A new class of chaotic focusing billiards is discussed that clearly violates the main condition considered to be necessary for chaos in focusing billiards.

  11. Surprising results from cosmic rays

    Energy Technology Data Exchange (ETDEWEB)

    Wilk, G. [Soltan Institute for Nuclear Studies, Warsaw (Poland); Wlodarczyk, Z. [Institute for Physics, Pedagogical University, Kielce (Poland)

    1996-10-01

    A number of seemingly exotic phenomena seen in the highest-energy cosmic-ray experiments are briefly discussed. We argue that they indicate the existence of non-statistical fluctuations and strong correlations in the fragmentation region of multiparticle production processes inaccessible to the present accelerators. (author) 12 refs, 3 figs

  12. Education in Japan: Surprising Lessons.

    Science.gov (United States)

    Deasy, Richard J.

    1986-01-01

    Reports on an in-depth probe of the Japanese educational system by American educators. This study shows that the United States cannot expect to achieve economic growth by copying the Japanese system, which has been shaped largely by Japan's geography, culture, and society. (MD)

  13. What is the mechanism of Ketamine's rapid-onset antidepressant effect? A concise overview of the surprisingly large number of possibilities.

    Science.gov (United States)

    Strasburger, S E; Bhimani, P M; Kaabe, J H; Krysiak, J T; Nanchanatt, D L; Nguyen, T N; Pough, K A; Prince, T A; Ramsey, N S; Savsani, K H; Scandlen, L; Cavaretta, M J; Raffa, R B

    2017-04-01

    Abundant clinical data now confirm that ketamine produces a remarkable rapid-onset antidepressant effect - hours or days - in contrast to the delayed onset (typically weeks) of current antidepressant drugs. This surprising and revolutionary finding may lead to the development of life-saving pharmacotherapy for depressive illness by reducing the high suicide risk associated with the delayed onset of effect of current drugs. As ketamine has serious self-limiting drawbacks that restrict its widespread use for this purpose, a safer alternative is needed. Our objective is to review the proposed mechanism(s) of ketamine's rapid-onset antidepressant action for new insights into the physiological basis of depressive illness that may lead to new and novel targets for antidepressant drug discovery. A search was conducted on published literature (e.g. PubMed) and Internet sources to identify information relevant to ketamine's rapid-acting antidepressant action and, specifically, to the possible mechanism(s) of this action. Key search words included 'ketamine', 'antidepressant', 'mechanism of action', 'depression' and 'rapid acting', either individually or in combination. Information was sought that would include less well-known, as well as well-known, basic pharmacologic properties of ketamine and that identified and evaluated the several hypotheses about ketamine's mechanism of antidepressant action. Whether the mechanistic explanation for ketamine's rapid-onset antidepressant action is related to its well-known antagonism of the NMDA (N-Methyl-d-aspartate) subtype of glutamate receptor or to something else has not yet been fully elucidated. The evidence from pharmacologic, medicinal chemistry, animal model and drug-discovery sources reveals a wide variety of postulated mechanisms. The surprising discovery of ketamine's rapid-onset antidepressant effect is a game-changer for the understanding and treatment of depressive illness. There is some convergence on NMDA receptor

  14. Leadership Succession and Successful Leadership: The Case of Laura Martinez

    Science.gov (United States)

    Garza, Encarnacion, Jr.; Murakami-Ramalho, Elizabeth; Merchant, Betty

    2011-01-01

    This case study follows the work of principal Laura Martinez as she moves from leading Stevens Elementary for 9 years to opening a new P-8th grade academy in the same south Texas urban, inner-city district. The purpose of this case study was to observe successful leadership and the principal's strategies both in her previous and present school,…

  15. The International Successful School Principalship Project: Success Sustained?

    Science.gov (United States)

    Moos, Lejf; Johansson, Olof

    2009-01-01

    Purpose: The purpose of this paper is to synthesize the findings of the follow-up studies of successful school principals in six countries: Australia, Denmark, England, Norway, Sweden, and the USA. Design/methodology/approach: Data were categorized according to stakeholder expectations, the concept and practice of leadership, and the…

  17. Succession planning. A strategy for taking charge.

    Science.gov (United States)

    Bower, F L

    2000-01-01

    This article on succession planning includes a definition of succession planning, the reasons for and components of succession planning, and why succession planning is important to the leadership of an organization, unit, division or department. Each component of succession planning, that is, vision, networking, and mentorship, is described, with examples that are intended to guide the reader to initiate and evaluate succession planning.

  18. Drilling successful from ROV Ventana

    Science.gov (United States)

    Stakes, Debra S.; McFarlane, James A. R.; Holloway, G. Leon; Greene, H. Gary

    Cores of granite and deformed sediment from the walls of Monterey Canyon were successfully recovered from December 30 to 31, 1992, by Monterey Bay Aquarium Research Institute's (MBARI) Remotely Operated Vehicle (ROV) Ventana using a small-diameter, double-barrel drill with a diamond bit. This HSTR (Holloway-Stakes-Tengdin-Rajcula) drill was developed to drill cores horizontally from sulfide/sulfate walls of active black smokers. The drill was first successfully used by the submersible Alvin in October 1991 to drill into massive sulfide chimneys, on the Juan de Fuca Ridge (Eos, June 30, 1992, p. 273), and it was subsequently used with equal success on the chalcopyrite-rich chimneys from 21°N and 9°N on the East Pacific Rise. The recent December dives, however, marked the first time that drilling has ever been attempted from the smaller ROV and the first time coring into the harder igneous rock substrate has been attempted.

  19. Success factors in technology development

    Science.gov (United States)

    Preston, John T.

    1995-01-01

    Universities in the U.S. have a significant impact on business through the transfer of technology. This paper describes the goals and philosophy of the Technology Licensing Office at the Massachusetts Institute of Technology. This paper also relates the critical factors for successful technology transfer, particularly relating to new business formation. These critical factors include the quality of the technology, the quality of the management, the quality of the investor, the passion for success, and the image of the company. Descriptions of three different levels of investment are also given and the most successful level of investment for starting a new company is reviewed. Licensing to large companies is also briefly reviewed, as this type of licensing requires some different strategies than that of licensing to start-up companies. High-quality critical factors and intelligent investment create rewards for the parties and successful ventures.

  20. Mission Success for Combustion Science

    Science.gov (United States)

    Weiland, Karen J.

    2004-01-01

    This presentation describes how mission success for combustion experiments has been obtained in previous spaceflight experiments and how it will be obtained for future International Space Station (ISS) experiments. The Fluids and Combustion Facility is a payload planned for the ISS. It is composed of two racks: the Fluids Integrated Rack and the Combustion Integrated Rack (CIR). Requirements for the CIR were obtained from a set of combustion basis experiments that served as surrogates for later experiments. The process for experiments that fly on the ISS includes proposal selection, requirements and success criteria definition, science and engineering reviews, mission operations, and postflight operations. By following this process, the microgravity combustion science program has attained success in 41 out of 42 experiments.

  1. Fast Success and Slow Failure

    DEFF Research Database (Denmark)

    Mors, Louise; Waguespack, David

    Full Abstract: Do the benefits of cross boundary collaborations outweigh the costs? We seek to answer this question by examining 5079 collaborations in the Internet Engineering Task Force (IETF). Our findings suggest that crossing formal boundaries is positively related to success and efficiency of the collaboration. Yet there are high costs associated with cross boundary collaborations for unsuccessful projects, as these take longer to fail, and therefore hold up resources that could be reallocated to other projects. We find that even with a boost in success, the costs are higher than the likelihood of success, suggesting that firms are better off investing in nondiverse projects. This finding has important implications for how we think about the benefits of seeking novelty.

  2. Crystal structure of di-μ-chlorido-bis[dichloridobis(methanol-κO)iridium(III)] dihydrate: a surprisingly simple chloridoiridium(III) dinuclear complex with methanol ligands

    Directory of Open Access Journals (Sweden)

    Joseph S. Merola

    2015-05-01

    The reaction of IrCl3·xH2O in methanol led to the formation of small amounts of the title compound, [Ir2Cl6(CH3OH)4]·2H2O, which consists of two IrCl4O2 octahedra sharing an edge via chloride bridges. The molecule lies across an inversion center. Each octahedron can be envisioned as being comprised of four chloride ligands in the equatorial plane with methanol ligands in the axial positions. A lattice water molecule is strongly hydrogen-bonded to the coordinating methanol ligands, and weak interactions with coordinating chloride ligands lead to the formation of a three-dimensional network. This is a surprising structure given that, while many reactions of iridium chloride hydrate are carried out in alcoholic solvents, especially methanol and ethanol, this is the first structure of a chloridoiridium compound with only methanol ligands.

  3. Developing a Successful HIV Vaccine.

    Science.gov (United States)

    Gallo, Robert C

    2015-07-15

    Human immunodeficiency virus (HIV) genome integration indicates that persistent sterilizing immunity will be needed for a successful vaccine candidate. This suggests a need for broad antibodies targeting the Env protein. Immunogens targeting gp120 have been developed that block infection in monkeys and mimic the modest success of the RV144 clinical trial in that protection is short-lived following a decline in antibody-dependent cell-mediated cytotoxicity-like antibodies. Attempts to induce antibody persistence have been complicated by a loss of efficacy, presumably by increasing the number of HIV-target cells. The key seems to be achieving an immune balance.

  4. Successive quadratic programming multiuser detector

    Institute of Scientific and Technical Information of China (English)

    Mu Xuewen; Zhang Yaling; Liu Sanyang

    2007-01-01

    Based on the semidefinite programming relaxation of the CDMA maximum likelihood multiuser detection problem, a detection strategy using the successive quadratic programming algorithm is presented. Coupled with the randomized cut generation scheme, the suboptimal solution of the multiuser detection problem is obtained. Compared to the interior point methods previously reported based on semidefinite programming, simulations demonstrate that the successive quadratic programming algorithm often yields similar BER performance on the multiuser detection problem. But the average CPU time of this approach is significantly reduced.

  5. Staggering successes amid controversy in California water management

    Science.gov (United States)

    Lund, J. R.

    2012-12-01

    Water in California has always been important and controversial, and it probably always will be. California has a large, growing economy and population in a semi-arid climate. But California's aridity, hydrologic variability, and water controversies have not precluded considerable economic successes. The successes of California's water system have stemmed from the decentralization of water management with historically punctuated periods of more centralized strategic decision-making. Decentralized management has allowed California's water users to efficiently explore incremental solutions to water problems, ranging from early local development of water systems (such as Hetch Hetchy, Owens Valley, and numerous local irrigation projects) to more contemporary efforts at water conservation, water markets, wastewater reuse, and conjunctive use of surface and groundwater. In the cacophony of local and stakeholder interests, strategic decisions have been more difficult, and consequently occur less frequently. California state water projects and Sacramento Valley flood control are examples where decades of effort, crises, floods and droughts were needed to mobilize local interests to agree to major strategic decisions. Currently, the state is faced with making strategic environmental and water management decisions regarding its deteriorating Sacramento-San Joaquin Delta. Not surprisingly, human uncertainties and physical and fiscal non-stationarities dominate this process.

  6. Maternal Cohabitation and Educational Success

    Science.gov (United States)

    Raley, R. Kelly; Frisco, Michelle L.; Wildsmith, Elizabeth

    2005-01-01

    Despite the dramatic increase in children's experiences in cohabiting families, little is known about how living in such families affects children's academic success. Extrapolating from two theoretical frameworks that have been commonly used to explain the association between parental divorce and educational outcomes, the authors constructed…

  7. Academic Success at Selective Institutions.

    Science.gov (United States)

    Felix, Oscar

    2003-01-01

    Examined success factors of eight under-prepared students at Colorado State University. Four major themes emerged from the data: pre-college experiences, struggles, positive campus experiences and support, and student growth. Findings were conceptualized within a student resiliency framework. (EV)

  8. Lobomycosis Successfully Treated with Posaconazole

    Science.gov (United States)

    Bustamante, Beatriz; Seas, Carlos; Salomon, Martín; Bravo, Francisco

    2013-01-01

    Lobomycosis is a chronic subcutaneous mycosis for which no standard treatment is available to date. We describe a patient in Peru with lobomycosis on the left earlobe that was successfully treated with posaconazole for 27 months. No evidence of recurrence was observed after five years of follow-up. PMID:23546805

  9. Parents, School Success, and Health

    Centers for Disease Control (CDC) Podcasts

    2009-08-03

    In this podcast, Dr. William Jeynes, CDC Parenting Speaker Series guest, discusses the importance of parental involvement in children's academic success and lifelong health.  Created: 8/3/2009 by National Center on Birth Defects and Developmental Disabilities (NCBDDD).   Date Released: 8/3/2009.

  10. Trusted advisers build business success

    NARCIS (Netherlands)

    V.M. Strike (Vanessa)

    2010-01-01

    textabstractThe role of the trusted and most loyal adviser has been a crucial one to family concerns throughout the ages. During the Korean Joseon Dynasty, Kim Cheo-Seon, a eunuch, advised successive kings wisely. Sir Francis Walsingham, Principal Secretary to Queen Elizabeth I of England, is rememb

  11. Successful I.D. Encounters.

    Science.gov (United States)

    Poorman, Margaret J.

    Instructional Development (I.D.) encounters are dependent for success on such variables as power, politics, promotion, and organizational placement. I.D. consultants must be aware of power bases or orientation of other personnel and clients, e.g., these four "power personalities" which affect their efforts in managing I.D. encounters: the gate…

  12. Measuring Success: Evaluating Educational Programs

    Science.gov (United States)

    Fisher, Yael

    2010-01-01

    This paper reveals a new evaluation model, which enables educational program and project managers to evaluate their programs with a simple and easy to understand approach. The "index of success model" is comprised of five parameters that enable managers to focus on and evaluate both the implementation and results of an educational program. The integration…

  13. of the Intestate Succession Act

    African Journals Online (AJOL)

    10332324

    substitution in terms of the Intestate Succession Act 81 of 1987. See the ... the Court explained the reasons for the difference between its .... (b) Each surviving spouse should inherit a child's share of the intestate estate or so ... the Black Administration Act 38 of 1927 and the Births and Deaths Registration Act 51 of 1992. 36.

  14. Practices of Successful School Leaders

    Science.gov (United States)

    Dolph, Dave; Grant, Stephen

    2010-01-01

    Leadership is a discipline that has been studied to improve organizational effectiveness and efficiency. Owing to the attention and emphasis that education has been subject to in recent decades, the study of successful school leadership presents an area worth examining (Smith & Piele, 1997). Therefore, the research underpinning this article…

  15. The Passion of Successful Leadership

    Science.gov (United States)

    Day, Christopher

    2004-01-01

    This paper reports multiperspective research on 10 successful, experienced headteachers working in a range of urban and suburban schools of different sizes (with different school populations and free school meals indices of between 20% and 62%). All had raised the levels of measurable pupil attainments in their schools and all were highly regarded…

  16. Archery: Success through Classroom Instruction.

    Science.gov (United States)

    Hensley, Ralph W.

    1982-01-01

    For maximum early success in mastering the sport of archery, the first few days of instruction should be taken in the classroom. Two positions, the grip and the anchor, which can be taught and rehearsed in the classroom, are described. (JN)

  17. Components of Successful Magnet Schools.

    Science.gov (United States)

    Bryant, Faye B.

    This paper identifies and discusses components of successful magnet programs. It is based on a review of existing research literature and information gathered directly from school districts. First, the paper discusses separately the following elements, which are considered "core components": (1) leadership; (2) organizational structure; (3)…

  19. Defining and Measuring Academic Success

    Science.gov (United States)

    York, Travis T.; Gibson, Charles; Rankin, Susan

    2015-01-01

    Despite, and perhaps because of its amorphous nature, the term "academic success" is one of the most widely used constructs in educational research and assessment within higher education. This paper conducts an analytic literature review to examine the use and operationalization of the term in multiple academic fields. Dominant…

  20. Successful Teachers Practice Perpetual Learning

    Science.gov (United States)

    Main, Marisa

    2007-01-01

    Successful teaching involves continuous learning, stimulation, motivation, and networking with other art educators. To help art teachers improve themselves, SchoolArts magazine recently organized the Folk Art Traditions and Beyond Seminar at Ghost Ranch in Santa Fe. In this article, the author describes the highlights of the Folk Art Traditions…