WorldWideScience

Sample records for surprisingly successful metamodels

  1. Schizoanalysis as Metamodeling

    Directory of Open Access Journals (Sweden)

    Janell Watson

    2008-01-01

Full Text Available Félix Guattari, writing both on his own and with philosopher Gilles Deleuze, developed the notion of schizoanalysis out of his frustration with what he saw as the shortcomings of Freudian and Lacanian psychoanalysis, namely their orientation toward neurosis, emphasis on language, and lack of socio-political engagement. Guattari was analyzed by Lacan, attended Lacan's seminars from the beginning, and remained a member of Lacan's school until his own death in 1992. His unorthodox Lacanianism grew out of his clinical work with schizophrenics and his involvement in militant politics. Paradoxically, even as he rebelled theoretically and practically against Lacan's 'mathemes of the unconscious' and topology of knots, Guattari ceaselessly drew diagrams and models. Deleuze once said of him that 'His ideas are drawings, or even diagrams.' Guattari's single-authored books are filled with strange figures, which borrow from fields as diverse as linguistics, cultural anthropology, chaos theory, energetics, and non-equilibrium thermodynamics. Guattari himself declared schizoanalysis a 'metamodeling,' but at the same time insisted that his models were constructed aesthetically, not scientifically, despite his liberal borrowing of scientific terminology. The practice of schizoanalytic metamodeling is complicated by his and Deleuze's concept of the diagram, which they define as a way of thinking that bypasses language, as for example in musical notation or mathematical formulas. This article will explore Guattari's models in relation to Freud, Lacan, C.S. Peirce, Louis Hjelmslev, Noam Chomsky, and Ilya Prigogine. I will also situate his drawings in relation to his work as a practicing clinician, political activist, and co-author of Anti-Oedipus and A Thousand Plateaus.

  2. Towards the formalisation of the TOGAF Content Metamodel using ontologies

    CSIR Research Space (South Africa)

    Gerber, A

    2010-06-01

    Full Text Available Metamodels are abstractions that are used to specify characteristics of models. Such metamodels are generally included in specifications or framework descriptions. A metamodel is for instance used to inform the generation of enterprise architecture...

  3. Ontological Surprises

    DEFF Research Database (Denmark)

    Leahu, Lucian

    2016-01-01

This paper investigates how we might rethink design as the technological crafting of human-machine relations in the context of a machine learning technique called neural networks. It analyzes Google’s Inceptionism project, which uses neural networks for image recognition. The surprising output... a hybrid approach where machine learning algorithms are used to identify objects as well as connections between them; finally, it argues for remaining open to ontological surprises in machine learning as they may enable the crafting of different relations with and through technologies.

  4. Surprise Trips

    DEFF Research Database (Denmark)

    Korn, Matthias; Kawash, Raghid; Andersen, Lisbet Møller

    2010-01-01

We report on a platform that augments the natural experience of exploration in diverse indoor and outdoor environments. The system builds on the theme of surprises in terms of user expectations and finding points of interest. It utilizes physical icons as representations of users' interests and as notification tokens to alert users when they are within proximity of a surprise. To evaluate the concept, we developed mock-ups and a video prototype and conducted a wizard-of-oz user test for a national park in Denmark.

  5. University Students' Meta-Modelling Knowledge

    Science.gov (United States)

    Krell, Moritz; Krüger, Dirk

    2017-01-01

    Background: As one part of scientific meta-knowledge, students' meta-modelling knowledge should be promoted on different educational levels such as primary school, secondary school and university. This study focuses on the assessment of university students' meta-modelling knowledge using a paper-pencil questionnaire. Purpose: The general purpose…

  6. METAMODELS OF INFORMATION TECHNOLOGY BEST PRACTICES FRAMEWORKS

    Directory of Open Access Journals (Sweden)

    Arthur Nunes Ferreira Neto

    2011-12-01

Full Text Available This article deals with the generation and application of ontological metamodels of frameworks of best practices in IT. The ontological metamodels represent the logical structures and fundamental semantics of framework models and constitute adequate tools for the analysis, adaptation, comparison and integration of the frameworks of best practices in IT. The MetaFrame methodology for the construction of the metamodels, founded on the discipline of conceptual metamodelling and on the extended Entity/Relationship methodology, is described herein, as well as the metamodels of the best practices for the outsourcing of IT, the eSCM-SP v2.01 (eSourcing Capability Model for Service Providers) and the eSCM-CL v1.1 (eSourcing Capability Model for Client Organizations), constructed according to the MetaFrame methodology.

  7. Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries

    Science.gov (United States)

    Reeves, H. W.; Fienen, M. N.; Feinstein, D.

    2015-12-01

Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding, numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may fulfill a role often accomplished by application of analytical solutions. The major challenge in transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that it is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that the spatial scale of the numerical model must be appropriately accounted for to adequately represent different settings. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.

  8. Charming surprise

    CERN Multimedia

    Antonella Del Rosso

    2011-01-01

    The CP violation in charm quarks has always been thought to be extremely small. So, looking at particle decays involving matter and antimatter, the LHCb experiment has recently been surprised to observe that things might be different. Theorists are on the case. The study of the physics of the charm quark was not in the initial plans of the LHCb experiment, whose letter “b” stands for “beauty quark”. However, already one year ago, the Collaboration decided to look into a wider spectrum of processes that involve charm quarks among other things. The LHCb trigger allows a lot of these processes to be selected, and, among them, one has recently shown interesting features. Other experiments at b-factories have already performed the same measurement but this is the first time that it has been possible to achieve such high precision, thanks to the huge amount of data provided by the very high luminosity of the LHC. “We have observed the decay modes of the D0, a pa...

  10. Metamodeling of Semantic Web Enabled Multiagent Systems

    NARCIS (Netherlands)

    Kardas, G.; Göknil, Arda; Dikenelli, O.; Topaloglu, N.Y.; Weyns, D.; Holvoet, T.

    2006-01-01

Several agent researchers are currently studying agent modeling and they propose different architectural metamodels for developing Multiagent Systems (MAS) according to specific agent development methodologies. When support for Semantic Web technology and its related constructs are considered, agent

  11. Metamodel of the IT Governance Framework COBIT

    Directory of Open Access Journals (Sweden)

    João Souza Neto

    2013-10-01

Full Text Available This paper addresses the generation and analysis of the COBIT 4.1 ontological metamodel of the IT Governance framework. Ontological metamodels represent the logical structures and fundamental semantics of framework models and constitute adequate tools for the analysis, adaptation, comparison and integration of IT best practice frameworks. The MetaFrame methodology used for the construction of the COBIT metamodel is based on the discipline of conceptual metamodelling and on the extended Entity/Relationship methodology. It has an iterative process of construction of the metamodel’s components, using techniques of modeling and documentation of information systems. In the COBIT 4.1 metamodel, the central entity type is the IT Process. The entity type IT Domain represents the four domains that group one or more IT processes of COBIT 4.1. In turn, these domains are divided into one or more Activities that are carried through by one or more Roles, which are consulted, informed, accountable or responsible for each Activity. The COBIT 4.1 metamodel may suggest the adaptation or implementation of a new process within the framework or even contribute to the integration of frameworks when, after the processes of analysis and comparison, there are connection points between the components and the logical structures of their relationships.

  12. Surprise, Recipes for Surprise, and Social Influence.

    Science.gov (United States)

    Loewenstein, Jeffrey

    2018-02-07

Surprising people can provide an opening for influencing them. Surprises garner attention, are arousing, are memorable, and can prompt shifts in understanding. Less noted is that, as a result, surprises can serve to persuade others by leading them to shift their attitudes. Furthermore, because stories, pictures, and music can generate surprises and those can be widely shared, surprise can have broad social influence. People also tend to share surprising items with others, as anyone on social media has discovered. This means that in addition to broadcasting surprising information, surprising items can also spread through networks. The joint result is that surprise not only has individual effects on beliefs and attitudes but also collective effects on the content of culture. Items that generate surprise need not be random or accidental. There are predictable methods, or recipes, for generating surprise. One such recipe, the repetition-break plot structure, is discussed to explore the psychological and social possibilities of examining surprise. Recipes for surprise offer a useful means for understanding how surprise works and offer prospects for harnessing surprise to a wide array of ends. Copyright © 2017 Cognitive Science Society, Inc.

  13. SPEM: Software Process Engineering Metamodel

    Directory of Open Access Journals (Sweden)

    Víctor Hugo Menéndez Domínguez

    2015-05-01

Full Text Available All organizations involved in software development need to establish, manage and support the development work. The term "software development process" tends to unify all the activities and practices that cover these needs. Modelling the software process is one way to improve development and the quality of the resulting applications. Among all existing process-modelling languages, those based on work products are the most suitable. One such language is SPEM (Software Process Engineering Metamodel). SPEM was created by the OMG (Object Management Group) as a high-level standard that is based on MOF (MetaObject Facility) and is a UML (Unified Modeling Language) metamodel. It constitutes a kind of ontology of software development processes. This article offers a general description of the SPEM standard. It also highlights the changes between version 1.1 and version 2.0, presenting both the advantages and the disadvantages found between the two versions.

  14. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  15. Introducing Meta-models for a More Efficient Hazard Mitigation Strategy with Rockfall Protection Barriers

    Science.gov (United States)

    Toe, David; Mentani, Alessio; Govoni, Laura; Bourrier, Franck; Gottardi, Guido; Lambert, Stéphane

    2018-04-01

The paper presents a new approach to assess the effectiveness of rockfall protection barriers, accounting for the wide variety of impact conditions observed on natural sites. This approach makes use of meta-models, considering a widely used rockfall barrier type, and was developed from FE simulation results. Six input parameters relevant to the block impact conditions have been considered. Two meta-models were developed, concerning the barrier's capability either of stopping the block or of reducing its kinetic energy. The influence of the parameter ranges on the meta-model accuracy has also been investigated. The results of the study reveal that the meta-models reproduce with accuracy the response of the barrier to any impact condition, providing a powerful tool to support the design of these structures. Furthermore, by accommodating the effects of the impact conditions on the prediction of the block-barrier interaction, the approach can be successfully used in combination with rockfall trajectory simulation tools to improve rockfall quantitative hazard assessment and optimise rockfall mitigation strategies.

  16. A precategorical spatial-data metamodel

    OpenAIRE

    Steven A Roberts; G Brent Hall; Paul H Calamai

    2006-01-01

    Increasing recognition of the extent and speed of habitat fragmentation and loss, particularly in the urban fringe, is driving the need to analyze qualitatively and quantitatively regional landscape structure for decision support in land-use planning and environmental-policy implementation. The spatial analysis required in this area is not well served by existing spatial-data models. In this paper a new theoretical spatial-data metamodel is introduced as a tool for addressing such needs and a...

  17. Unifying approach for model transformations in the MOF metamodeling architecture

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas

    2004-01-01

    In the Meta Object Facility (MOF) metamodeling architecture a number of model transformation scenarios can be identified. It could be expected that a metamodeling architecture will be accompanied by a transformation technology supporting the model transformation scenarios in a uniform way. Despite

  18. Certified metamodels for sensitivity indices estimation

    Directory of Open Access Journals (Sweden)

    Prieur Clémentine

    2012-04-01

    Full Text Available Global sensitivity analysis of a numerical code, more specifically estimation of Sobol indices associated with input variables, generally requires a large number of model runs. When those demand too much computation time, it is necessary to use a reduced model (metamodel to perform sensitivity analysis, whose outputs are numerically close to the ones of the original model, while being much faster to run. In this case, estimated indices are subject to two kinds of errors: sampling error, caused by the computation of the integrals appearing in the definition of the Sobol indices by a Monte-Carlo method, and metamodel error, caused by the replacement of the original model by the metamodel. In cases where we have certified bounds for the metamodel error, we propose a method to quantify both types of error, and we compute confidence intervals for first-order Sobol indices. L’analyse de sensibilité globale d’un modèle numérique, plus précisément l’estimation des indices de Sobol associés aux variables d’entrée, nécessite généralement un nombre important d’exécutions du modèle à analyser. Lorsque celles-ci requièrent un temps de calcul important, il est judicieux d’effectuer l’analyse de sensibilité sur un modèle réduit (ou métamodèle, fournissant des sorties numériquement proches du modèle original mais pour un coût nettement inférieur. Les indices estimés sont alors entâchés de deux sortes d’erreur : l’erreur d’échantillonnage, causée par l’estimation des intégrales définissant les indices de Sobol par une méthode de Monte-Carlo, et l’erreur de métamodèle, liée au remplacement du modèle original par le métamodèle. Lorsque nous disposons de bornes d’erreurs certifiées pour le métamodèle, nous proposons une méthode pour quantifier les deux types d’erreurs et fournir des intervalles de confiance pour les indices de Sobol du premier ordre.
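
    As a rough illustration of the sampling-error part of the abstract above, the following is a minimal sketch assuming an analytic stand-in for the reduced model, independent uniform inputs, and a plain pick-freeze Monte-Carlo estimator with a normal-approximation confidence interval; the certified metamodel-error bounds discussed in the paper are not reproduced.

```python
# Sketch of first-order Sobol' index estimation on a cheap surrogate, with a
# confidence interval for the *sampling* error only (the certified metamodel-error
# bounds of the paper are not reproduced here).
import numpy as np

rng = np.random.default_rng(0)

def metamodel(x):
    # stand-in for the reduced model: an Ishigami-like function on [0, 1]^3
    z = np.pi * (2.0 * x - 1.0)
    return np.sin(z[:, 0]) + 7.0 * np.sin(z[:, 1])**2 + 0.1 * z[:, 2]**4 * np.sin(z[:, 0])

def first_order_sobol(f, i, n=20_000, d=3):
    """Pick-freeze Monte-Carlo estimate of S_i with a normal-approximation CI."""
    a, b = rng.random((n, d)), rng.random((n, d))
    b_i = b.copy()
    b_i[:, i] = a[:, i]                      # "freeze" input i
    ya, yb, ybi = f(a), f(b), f(b_i)
    var = np.var(np.concatenate([ya, yb]), ddof=1)
    prod = ya * (ybi - yb)                   # Saltelli-style estimator of V_i
    s_i = prod.mean() / var
    half = 1.96 * prod.std(ddof=1) / (np.sqrt(n) * var)
    return s_i, (s_i - half, s_i + half)

for i in range(3):
    s, ci = first_order_sobol(metamodel, i)
    print(f"S_{i+1} ~ {s:.3f}, 95% sampling-error CI ({ci[0]:.3f}, {ci[1]:.3f})")
```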

  19. An organizational metamodel for hospital emergency departments.

    Science.gov (United States)

    Kaptan, Kubilay

    2014-10-01

I introduce an organizational model describing the response of the hospital emergency department. The hybrid simulation/analytical model (called a "metamodel") can estimate a hospital's capacity and dynamic response in real time and incorporate the influence of damage to structural and nonstructural components on the organizational ones. The waiting time is the main response parameter and is used to evaluate the disaster resilience of health care facilities. Waiting time behavior is described by using a double exponential function, and its parameters are calibrated based on simulated data. The metamodel covers a large range of hospital configurations and takes into account hospital resources in terms of staff and infrastructure, operational efficiency, and the possible existence of an emergency plan; maximum capacity; and behavior in both saturated and over-capacity conditions. The sensitivity of the model to different arrival rates, hospital configurations, and capacities, and to the technical and organizational policies applied before and during a disaster, was investigated. This model becomes an important tool in the decision process both for the engineering profession and for policy makers.
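
    A minimal sketch of the waiting-time calibration step is given below, assuming synthetic "simulated" data and an illustrative sum-of-two-exponentials form for the double exponential function; the paper's exact functional form, data, and parameter values are not reproduced.

```python
# Sketch of calibrating a waiting-time curve against simulated data; the
# "double exponential" form below (a*exp(b*r) + c*exp(d*r)) and all numbers are
# illustrative assumptions, not the paper's calibrated metamodel.
import numpy as np
from scipy.optimize import curve_fit

def waiting_time(rate, a, b, c, d):
    return a * np.exp(b * rate) + c * np.exp(d * rate)

rng = np.random.default_rng(1)
rate = np.linspace(0.1, 5.0, 40)                           # patient arrival rate
true_wt = waiting_time(rate, 5.0, 0.05, 0.5, 0.9)          # hypothetical ground truth
sim_wt = true_wt * rng.lognormal(sigma=0.05, size=rate.size)  # noisy "simulation" output

params, _ = curve_fit(waiting_time, rate, sim_wt, p0=[1.0, 0.1, 1.0, 0.8], maxfev=20000)
print("calibrated parameters:", params)
print("predicted waiting time at rate 4.0:", waiting_time(4.0, *params))
```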

  20. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    Energy Technology Data Exchange (ETDEWEB)

    Horsey, Henry; Fleming, Katherine; Ball, Brian; Long, Nicholas

    2016-08-26

Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path for reducing the time and difficulty. Many use cases remain, however, where information regarding whole-building energy usage is valuable, but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option to accurately estimate building energy consumption in real time is to pre-compute large datasets of potential building energy models, and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response (DR) capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but can also be combined with public data to estimate the aggregate DR potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of top-down actuarial evaluations. This paper ends by exploring the benefits of using metamodels and then examines the cost-effectiveness of this approach.

  1. Optimization Using Metamodeling in the Context of Integrated Computational Materials Engineering (ICME)

    Energy Technology Data Exchange (ETDEWEB)

    Hammi, Youssef; Horstemeyer, Mark F; Wang, Paul; David, Francis; Carino, Ricolindo

    2013-11-18

Predictive Design Technologies, LLC (PDT) proposed to employ Integrated Computational Materials Engineering (ICME) tools to help the manufacturing industry in the United States regain the competitive advantage in the global economy. ICME uses computational materials science tools within a holistic system in order to accelerate materials development, improve design optimization, and unify design and manufacturing. With the advent of accurate modeling and simulation along with significant increases in high performance computing (HPC) power, virtual design and manufacturing using ICME tools provide the means to reduce product development time and cost by alleviating costly trial-and-error physical design iterations while improving overall quality and manufacturing efficiency. To reduce the computational cost necessary for the large-scale HPC simulations and to make the methodology accessible for small and medium-sized manufacturers (SMMs), metamodels are employed. Metamodels are approximate models (functional relationships between input and output variables) that can reduce the simulation times by one to two orders of magnitude. In Phase I, PDT, partnered with Mississippi State University (MSU), demonstrated the feasibility of the proposed methodology by employing MSU's internal state variable (ISV) plasticity-damage model with the help of metamodels to optimize the microstructure-process-property-cost for tube manufacturing processes used by Plymouth Tube Company (PTC), which involve complicated temperature and mechanical loading histories. PDT quantified the microstructure-property relationships for PTC's SAE J525 electric resistance-welded cold drawn low carbon hydraulic 1010 steel tube manufacturing processes at seven different material states and calibrated the ISV plasticity material parameters to fit experimental tensile stress-strain curves. PDT successfully performed large scale finite element (FE) simulations in an HPC environment using the ISV plasticity

  2. Breeding ecology of the southern shrike, Lanius meridionalis, in an agrosystem of south–eastern Spain: the surprisingly excellent breeding success in a declining population

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Rueda, G.; Abril-Colon, I.; Lopez-Orta, A.; Alvarez-Benito, I.; Castillo-Gomez, C.; Comas, M.; Rivas, J.M.

    2016-07-01

The southern shrike, Lanius meridionalis, is declining at the Spanish and European level. One cause of this decline could be low reproductive success due to low availability of prey in agricultural environments. To investigate this possibility, we analysed the breeding ecology of a population of southern shrike in an agrosystem in Lomas de Padul (SE Spain). Our results suggest the population is declining in this area. However, contrary to expectations, the population showed the highest reproductive success (% of nests in which at least one egg produces a fledgling) reported for this species to date (83.3%), with a productivity of 4.04 fledglings per nest. Reproductive success varied throughout the years, ranging from 75% in the worst year to 92.9% in the best year. Similarly, productivity ranged from 3.25 to 5.0 fledglings per nest depending on the year. Other aspects of reproductive biology, such as clutch size, brood size, and nestling diet, were similar to those reported in other studies. Based on these results, we hypothesise that the determinant of population decline acts on the juvenile fraction, drastically reducing the recruitment rate, or affects the dispersion of adults and recruits. Nevertheless, the exact factor or factors remain unknown. This study shows that a high reproductive success does not guarantee good health status of the population. (Author)

  3. Statistical metamodeling for revealing synergistic antimicrobial interactions.

    Directory of Open Access Journals (Sweden)

    Hsiang Chia Chen

    2010-11-01

Full Text Available Many bacterial pathogens are becoming drug resistant faster than we can develop new antimicrobials. To address this threat to public health, a metamodel antimicrobial cocktail optimization (MACO) scheme is demonstrated for rapid screening of potent antibiotic cocktails, using uropathogenic clinical isolates as model systems. With the MACO scheme, only 18 parallel trials were required to determine a potent antimicrobial cocktail out of hundreds of possible combinations. In particular, trimethoprim and gentamicin were identified to work synergistically for inhibiting bacterial growth. Sensitivity analysis indicated that gentamicin functions as a synergist for trimethoprim, and reduces its minimum inhibitory concentration by 40-fold. A validation study also confirmed that the trimethoprim-gentamicin synergistic cocktail effectively inhibited the growth of multiple strains of uropathogenic clinical isolates. With its effectiveness and simplicity, the MACO scheme possesses the potential to serve as a generic platform for identifying synergistic antimicrobial cocktails toward management of bacterial infection in the future.

  4. Mean-Variance-Validation Technique for Sequential Kriging Metamodels

    International Nuclear Information System (INIS)

    Lee, Tae Hee; Kim, Ho Sung

    2010-01-01

The rigorous validation of the accuracy of metamodels is an important topic in research on metamodel techniques. The leave-k-out cross-validation technique not only involves a considerably high computational cost but also cannot measure the fidelity of metamodels. Recently, the mean 0 validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, the use of the mean 0 validation criterion may lead to premature termination of a sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique because, instead of performing numerical integration, the kriging model is explicitly integrated to accurately evaluate the mean and variance of the response. The error in the proposed validation technique resembles a root mean squared error, thus it can be used to determine a stop criterion for sequential sampling of metamodels.

  5. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    Science.gov (United States)

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling method. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, the extremum points of the metamodel and the minimum points of a density function. In this way, increasingly accurate metamodels are constructed by the procedure above. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
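
    A minimal sketch of the sequential-refinement idea on a one-dimensional test function follows, assuming scipy's RBFInterpolator as the metamodel and new samples placed at the current metamodel minimiser; the paper's density-function criterion is not reproduced.

```python
# Sketch of sequential metamodel refinement in 1-D: refit an RBF metamodel and add a
# sample at its current minimiser each iteration. The test function is the standard
# Forrester function, standing in for an expensive simulation.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive(x):                            # stand-in for the costly simulation
    return (6.0 * x - 2.0)**2 * np.sin(12.0 * x - 4.0)

grid = np.linspace(0.0, 1.0, 501).reshape(-1, 1)
X = np.linspace(0.0, 1.0, 4).reshape(-1, 1)  # initial space-filling samples
y = expensive(X).ravel()

for it in range(6):                          # sequential refinement loop
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
    x_new = grid[np.argmin(surrogate(grid))]          # extremum of the metamodel
    if np.min(np.abs(X - x_new)) < 1e-3:              # stop if the point already exists
        break
    X = np.vstack([X, x_new.reshape(1, -1)])
    y = np.append(y, expensive(x_new)[0])
    print(f"iteration {it}: added x = {x_new[0]:.3f}, best f = {y.min():.3f}")
```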

  6. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  7. Metamodel comparison and model comparison for safety assurance

    NARCIS (Netherlands)

    Luo, Y.; Engelen, L.J.P.; Brand, van den M.G.J.; Bondavelli, A.; Ceccarelli, A.; Ortmeier, F.

    2014-01-01

    In safety-critical domains, conceptual models are created in the form of metamodels using different concepts from possibly overlapping domains. Comparison between those conceptual models can facilitate the reuse of models from one domain to another. This paper describes the mappings detected when

  8. Property preservation and quality measures in meta-models

    NARCIS (Netherlands)

    Siem, A.Y.D.

    2008-01-01

    This thesis consists of three parts. Each part considers different sorts of meta-models. In the first part so-called Sandwich models are considered. In the second part Kriging models are considered. Finally, in the third part, (trigonometric) Polynomials and Rational models are studied.

  9. SWAT meta-modeling as support of the management scenario analysis in large watersheds.

    Science.gov (United States)

    Azzellino, A; Çevirgen, S; Giupponi, C; Parati, P; Ragusa, F; Salvetti, R

    2015-01-01

In the last two decades, numerous models and modeling techniques have been developed to simulate nonpoint source pollution effects. Most models simulate the hydrological, chemical, and physical processes involved in the entrainment and transport of sediment, nutrients, and pesticides. Very often these models require a distributed modeling approach and are limited in scope by the requirement of homogeneity and by the need to manipulate extensive data sets. Physically based models are extensively used in this field as decision support for managing nonpoint source emissions. A common characteristic of this type of model is the demanding input of several state variables, which makes calibration more difficult and increases the effort and cost of implementing any simulation scenario. In this study the USDA Soil and Water Assessment Tool (SWAT) was used to model the Venice Lagoon Watershed (VLW), Northern Italy. A Multi-Layer Perceptron (MLP) network was trained on SWAT simulations and used as a meta-model for scenario analysis. The MLP meta-model was successfully trained and showed an overall accuracy higher than 70% both on the training and on the evaluation set, allowing a significant simplification in conducting scenario analysis.
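
    A minimal sketch of an MLP meta-model for scenario screening is shown below, assuming a synthetic table in place of actual SWAT runs (hypothetical management inputs and a binary nutrient-load exceedance target).

```python
# Sketch of an MLP meta-model for scenario screening; the table below is synthetic and
# merely stands in for a database of SWAT scenario runs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 500
X = rng.random((n, 4))                                   # e.g. fertiliser rate, tillage, ...
load = 3*X[:, 0] - 2*X[:, 1] + X[:, 2]*X[:, 3] + rng.normal(0.0, 0.2, n)
y = (load > np.median(load)).astype(int)                 # exceedance yes/no

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
meta = make_pipeline(StandardScaler(),
                     MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                                   random_state=0))
meta.fit(X_tr, y_tr)
print("training accuracy:  ", meta.score(X_tr, y_tr))
print("evaluation accuracy:", meta.score(X_te, y_te))    # the study reports > 70 %
```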

  10. Laser Welding Process Parameters Optimization Using Variable-Fidelity Metamodel and NSGA-II

    Directory of Open Access Journals (Sweden)

    Wang Chaochao

    2017-01-01

Full Text Available An optimization methodology based on variable-fidelity (VF) metamodels and the nondominated sorting genetic algorithm II (NSGA-II) for laser bead-on-plate welding of stainless steel 316L is presented. The relationships between the input process parameters (laser power, welding speed and laser focal position) and the output responses (weld width and weld depth) are constructed by VF metamodels. In the VF metamodels, information from models of two fidelity levels is integrated: the low-fidelity (LF) model is a finite element simulation model that is used to capture the general trend of the metamodels, and the high-fidelity (HF) model, built from physical experiments, is used to ensure the accuracy of the metamodels. The accuracy of the VF metamodel is verified by actual experiments. To solve the optimization problem, NSGA-II is used to search for multi-objective Pareto optimal solutions. The results of verification experiments show that the obtained optimal parameters are effective and reliable.
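
    A minimal sketch of an additive-correction variable-fidelity surrogate follows, assuming a cheap analytic stand-in for the finite element (low-fidelity) model and a few synthetic "experimental" (high-fidelity) points; the NSGA-II search described in the paper is omitted.

```python
# Sketch of a variable-fidelity surrogate of the form VF(x) = LF(x) + GP(x), where the
# GP is fitted to the HF-LF discrepancy. Both the "FE" trend and the "experiments"
# below are invented stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def lf(power):                      # low-fidelity: FE-like trend of weld depth (mm)
    return 0.8 + 0.0010 * power

def hf(power):                      # high-fidelity: "experiments" (unknown in practice)
    return 0.6 + 0.0012 * power + 0.15 * np.sin(power / 300.0)

power_hf = np.array([[1000.0], [1500.0], [2000.0], [2500.0], [3000.0]])
residual = hf(power_hf).ravel() - lf(power_hf).ravel()

correction = GaussianProcessRegressor(kernel=RBF(length_scale=500.0), normalize_y=True)
correction.fit(power_hf, residual)

def vf(power):                      # variable-fidelity metamodel
    power = np.atleast_2d(power)
    return lf(power).ravel() + correction.predict(power)

test = np.array([[1250.0], [2750.0]])
print("VF prediction:", vf(test), " vs HF reference:", hf(test).ravel())
```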

  11. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model called the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a metamodel which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practices models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  12. Application of Metamodels to Identification of Metallic Materials Models

    OpenAIRE

    Pietrzyk, Maciej; Kusiak, Jan; Szeliga, Danuta; Rauch, Łukasz; Sztangret, Łukasz; Górecki, Grzegorz

    2016-01-01

    Improvement of the efficiency of the inverse analysis (IA) for various material tests was the objective of the paper. Flow stress models and microstructure evolution models of various complexity of mathematical formulation were considered. Different types of experiments were performed and the results were used for the identification of models. Sensitivity analysis was performed for all the models and the importance of parameters in these models was evaluated. Metamodels based on artificial ne...

  13. Global sensitivity analysis using a Gaussian Radial Basis Function metamodel

    International Nuclear Information System (INIS)

    Wu, Zeping; Wang, Donghui; Okolo N, Patrick; Hu, Fan; Zhang, Weihua

    2016-01-01

Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on response variables. Amongst the wide range of documented studies on sensitivity measures and analysis, Sobol' indices have received a greater portion of attention due to the fact that they can provide accurate information for most models. In this paper, a novel analytical expression to compute the Sobol' indices is derived by introducing a method which uses the Gaussian Radial Basis Function to build metamodels of computationally expensive computer codes. Performance of the proposed method is validated against various analytical functions and also a structural simulation scenario. Results demonstrate that the proposed method is an efficient approach, requiring a computational cost one to two orders of magnitude lower than the traditional quasi-Monte Carlo-based evaluation of Sobol' indices. - Highlights: • An RBF-based sensitivity analysis method is proposed. • The Sobol' decomposition of the Gaussian RBF metamodel is obtained. • The Sobol' indices of the Gaussian RBF metamodel are derived based on the decomposition. • The efficiency of the proposed method is validated by some numerical examples.
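
    A minimal sketch of constructing a Gaussian RBF metamodel of an expensive code with scipy is given below, assuming a Latin hypercube design, an Ishigami-like stand-in function, and an arbitrary shape parameter; the paper's analytical Sobol' derivation from the surrogate is not reproduced, only the surrogate construction and a simple accuracy check.

```python
# Sketch of building a Gaussian-kernel RBF metamodel and checking its accuracy; Sobol'
# indices would subsequently be evaluated on the cheap surrogate.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.stats import qmc

def expensive(x):                                 # stand-in for the computer code
    return np.sin(x[:, 0]) + 7.0*np.sin(x[:, 1])**2 + 0.1*x[:, 2]**4*np.sin(x[:, 0])

lo, hi = [-np.pi] * 3, [np.pi] * 3
X = qmc.scale(qmc.LatinHypercube(d=3, seed=0).random(120), lo, hi)   # design of experiments
metamodel = RBFInterpolator(X, expensive(X), kernel="gaussian", epsilon=1.0)

X_test = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(1000), lo, hi)
err = metamodel(X_test) - expensive(X_test)
print("RMSE of the Gaussian RBF metamodel:", np.sqrt(np.mean(err**2)))
# Sobol' indices can then be estimated on `metamodel` at negligible cost, e.g. by the
# pick-freeze sampling sketched earlier in this list, or analytically as in the paper.
```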

  14. Application of Metamodels to Identification of Metallic Materials Models

    Directory of Open Access Journals (Sweden)

    Maciej Pietrzyk

    2016-01-01

Full Text Available Improvement of the efficiency of the inverse analysis (IA) for various material tests was the objective of the paper. Flow stress models and microstructure evolution models of various complexity of mathematical formulation were considered. Different types of experiments were performed and the results were used for the identification of models. Sensitivity analysis was performed for all the models and the importance of parameters in these models was evaluated. Metamodels based on artificial neural networks were proposed to simulate experiments in the inverse solution. The analysis performed has shown that a significant decrease of the computing times could be achieved when metamodels substitute the finite element model in the inverse analysis, which is the case in the identification of flow stress models. Application of metamodels gave good results for flow stress models based on closed-form equations accounting for the influence of temperature, strain, and strain rate (4 coefficients), additionally for softening due to recrystallization (5 coefficients), and for softening and saturation (7 coefficients). Good accuracy and high efficiency of the IA were confirmed. On the contrary, identification of microstructure evolution models, including phase transformation models, did not give a noticeable reduction of the computing time.

  15. Surprising radiation detectors

    CERN Document Server

    Fleischer, Robert

    2003-01-01

Radiation doses received by the human body can be measured indirectly and retrospectively by counting the tracks left by particles in ordinary objects like pairs of spectacles, glassware, compact disks... This method has been successfully applied to determine the neutron radiation doses received 50 years ago on the Hiroshima site. Neutrons themselves do not leave tracks in bulk matter, but glass contains atoms of uranium that may fission when hit by a neutron; the recoil of the fission fragments generates a track that is detectable. The most difficult part is to find suitable glass items and to evaluate the radiation shielding they benefited from in their original location. The same method has been used to determine the radiation dose due to the accumulation of radon in houses. In that case the tracks left by alpha particles from the radioactive decay of polonium-210 have been counted in the surface layer of the window panes. Other materials like polycarbonate plastics have been used to determine the radiation dose due to heavy io...

  16. idSpace D2.3 – Semantic meta-model integration and transformations v2

    DEFF Research Database (Denmark)

    Dolog, Peter; Grube, Pascal; Schmid, Klaus

    2009-01-01

This deliverable discusses an extended set of requirements for transformations and for a metamodel for creativity techniques. Based on the requirements, the deliverable provides a refined meta-model. The metamodel allows for more advanced transformation concepts besides the previously delivered graph tr... oriented implementation with portlets and widgets in the Liferay portal.

  17. Surprise... Surprise..., An Empirical Investigation on How Surprise is Connected to Customer Satisfaction

    NARCIS (Netherlands)

    J. Vanhamme (Joëlle)

    2003-01-01

    textabstractThis research investigates the specific influence of the emotion of surprise on customer transaction-specific satisfaction. Four empirical studies-two field studies (a diary study and a cross section survey) and two experiments-were conducted. The results show that surprise positively

  18. MISTRAL : A Language for Model Transformations in the MOF Meta-modeling Architecture

    NARCIS (Netherlands)

    Kurtev, Ivan; van den Berg, Klaas; Aßmann, Uwe; Aksit, Mehmet; Rensink, Arend

    2005-01-01

In the Meta Object Facility (MOF) meta-modeling architecture a number of model transformation scenarios can be identified. It could be expected that a meta-modeling architecture will be accompanied by a transformation technology supporting the model transformation scenarios in a uniform way. Despite

  19. A MOF Metamodel for the Development of Context-Aware Mobile Applications

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Leite, M.M.; Calvi, C.Z.; Mantovaneli Pessoa, Rodrigo; Pereira Filho, J.G.; Pereira Filho, J.

    Context-aware mobile applications are increasingly attracting interest of the research community. To facilitate the development of this class of applications, it is necessary that both applications and support platforms share a common context metamodel. This paper presents a metamodel defined using

  20. SM4AM: A Semantic Metamodel for Analytical Metadata

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    Next generation BI systems emerge as platforms where traditional BI tools meet semi-structured and unstructured data coming from the Web. In these settings, the user-centric orientation represents a key characteristic for the acceptance and wide usage by numerous and diverse end users in their data....... We present SM4AM, a Semantic Metamodel for Analytical Metadata created as an RDF formalization of the Analytical Metadata artifacts needed for user assistance exploitation purposes in next generation BI systems. We consider the Linked Data initiative and its relevance for user assistance...

  1. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
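
    As a minimal sketch of one of the reviewed metamodel families, the following fits a second-order polynomial response surface with scikit-learn; the "analysis code" is an invented stand-in for an expensive simulation.

```python
# Sketch of a quadratic response-surface metamodel fitted to samples of a stand-in
# analysis code and used for cheap predictions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def analysis_code(x):                     # stand-in for the expensive analysis
    return 10.0 + 2.0*x[:, 0] - 3.0*x[:, 1] + 1.5*x[:, 0]*x[:, 1] + x[:, 1]**2

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, size=(30, 2))  # design of experiments (e.g. LHS or factorial)
y = analysis_code(X)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, y)                             # fit the response surface metamodel

X_new = rng.uniform(-1.0, 1.0, size=(5, 2))
print("metamodel :", rsm.predict(X_new))
print("true code :", analysis_code(X_new))
```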

  2. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Directory of Open Access Journals (Sweden)

    Yaping Ju

    2016-05-01

    Full Text Available The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels, that is, artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.
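
    A minimal sketch of the surrogate-plus-Monte-Carlo workflow follows; scikit-learn's grid search stands in for the genetic-algorithm tuning used in the paper, and both the "CFD" response and the roughness distribution are invented assumptions.

```python
# Sketch of tuning an SVR surrogate and propagating an input uncertainty through it
# with Monte Carlo sampling.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def cfd_like(rough):                       # stand-in for the expensive CFD evaluation
    return 1.2 - 0.8*rough - 2.0*rough**2  # e.g. lift coefficient vs roughness height

rng = np.random.default_rng(4)
X = rng.uniform(0.0, 0.3, size=(60, 1))    # training designs (roughness heights)
y = cfd_like(X[:, 0]) + rng.normal(0.0, 0.005, 60)

pipe = Pipeline([("scale", StandardScaler()), ("svr", SVR(kernel="rbf"))])
search = GridSearchCV(pipe, {"svr__C": [1, 10, 100], "svr__gamma": [0.1, 1, 10],
                             "svr__epsilon": [1e-3, 1e-2]}, cv=5)
search.fit(X, y)                           # surrogate tuning (a GA in the paper)

rough_mc = rng.uniform(0.0, 0.3, size=(20_000, 1))   # uncertain roughness samples
perf_mc = search.predict(rough_mc)                   # cheap surrogate evaluations
print("mean / std of predicted performance:", perf_mc.mean(), perf_mc.std())
```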

  3. A meta-model for computer executable dynamic clinical safety checklists.

    Science.gov (United States)

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

A safety checklist is a type of cognitive tool that reinforces the short-term memory of medical workers with the purpose of reducing medical errors caused by oversight and ignorance. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to patient context are increasingly developed. However, the current hard-coded approach of implementing checklists in these systems increases the cognitive efforts of clinical experts and the coding efforts of informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The result shows that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We proposed a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model is validated by implementing a use case in the system.

  4. Surprise as a design strategy

    NARCIS (Netherlands)

    Ludden, G.D.S.; Schifferstein, H.N.J.; Hekkert, P.P.M.

    2008-01-01

    Imagine yourself queuing for the cashier’s desk in a supermarket. Naturally, you have picked the wrong line, the one that does not seem to move at all. Soon, you get tired of waiting. Now, how would you feel if the cashier suddenly started to sing? Many of us would be surprised and, regardless of

  5. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples of complex models are the computational fluid dynamics... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented at ISTSS 2016. In general, a metamodel quickly interpolates the consequences of a few scenarios simulated with the complex models to a large number of arbitrary scenarios... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...

  6. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

Lee, Seunggyu [Korea Aerospace Research Institute, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

The kernel density was determined based on sampling points obtained in a Markov chain simulation and was used as the importance sampling function. A Kriging metamodel was constructed with greater detail in the vicinity of the limit state. The failure probability was calculated based on importance sampling, which was performed on the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for the kernel density in the vicinity of the limit state. A stable numerical method was proposed to find a parameter of the kernel density. To assess the completeness of the Kriging metamodel, the possible change in the calculated failure probability due to the uncertainty of the Kriging metamodel was evaluated.
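
    A minimal sketch of metamodel-based importance sampling is given below, assuming standard-normal inputs, a simple linear limit state as the "expensive" model, a scikit-learn Gaussian-process (Kriging) surrogate, and a Gaussian importance density centred at an approximate design point found on the surrogate; the paper's Markov-chain/kernel-density construction is not reproduced.

```python
# Sketch: fit a Kriging surrogate of the limit-state function, locate an approximate
# design point on the surrogate, and estimate the failure probability by importance
# sampling on the surrogate.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                                     # limit state: failure when g(x) < 0
    return 3.0 - (x[:, 0] + x[:, 1]) / np.sqrt(2.0)

rng = np.random.default_rng(5)
X_train = rng.normal(0.0, 2.0, size=(80, 2))  # training points spread toward the tails
kriging = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), normalize_y=True)
kriging.fit(X_train, g(X_train))

# approximate design point: point closest to the origin that lies on the surrogate's
# limit state (penalised search over random candidates)
cand = rng.normal(0.0, 3.0, size=(20_000, 2))
score = np.linalg.norm(cand, axis=1) + 10.0 * np.abs(kriging.predict(cand))
x_star = cand[np.argmin(score)]

p = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2))   # nominal input density
q = multivariate_normal(mean=x_star, cov=np.eye(2))       # importance density
xs = q.rvs(size=50_000, random_state=0)
w = p.pdf(xs) / q.pdf(xs)                                  # importance weights
pf = np.mean((kriging.predict(xs) < 0.0) * w)
print(f"estimated failure probability: {pf:.2e}  (exact ~1.35e-3 for this limit state)")
```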

  7. Meta-model of EPortfolio Usage in Different Environments

    Directory of Open Access Journals (Sweden)

    Igor Balaban

    2011-09-01

Full Text Available EPortfolio offers a new philosophy of teaching and learning, giving learners an opportunity to express themselves and to show their past work and experience to all interested parties, ranging from teachers to potential employers. However, an integral model for ePortfolio implementation in academic institutions that would take into account three different levels of stakeholders: 1. Individual (student and teacher); 2. Institution; and 3. Employer, currently does not exist. In this paper the role of ePortfolio in the academic environment, as well as the context in which ePortfolio operates, is analyzed in detail. As a result of the comprehensive analysis that takes into account the individual, the academic institution and the employer, a meta-model of ePortfolio usage in Lifelong Learning is proposed.

  8. Modeling Enterprise Authorization: A Unified Metamodel and Initial Validation

    Directory of Open Access Journals (Sweden)

    Matus Korman

    2016-07-01

Full Text Available Authorization and its enforcement, access control, have stood at the beginning of the art and science of information security, and remain a crucial pillar of security in information technology (IT) and enterprise operations. Dozens of different models of access control have been proposed. Although Enterprise Architecture as a discipline strives to support the management of IT, support for modeling access policies in enterprises is often lacking, both in terms of supporting the variety of individual models of access control in use today, and in terms of providing a unified ontology capable of flexibly expressing access policies for all or most of the models. This study summarizes a number of existing models of access control, proposes a unified metamodel mapped to ArchiMate, and illustrates its use on a selection of example scenarios and two business cases.

  9. Foundations of Meta-Pyramids: Languages vs. Metamodels -- Episode II: Story of Thotus the Baboon

    OpenAIRE

    Favre, Jean-Marie

    2005-01-01

Despite the recent interest in Model Driven Engineering approaches, the so-called four-layer metamodelling architecture is subject to a lot of debate. The relationship that exists between a model and a metamodel is often called instanceOf, but this terminology, which comes directly from object-oriented technology, is not appropriate for the modelling of similar meta-pyramids in other domains. The goal of this paper is to study which are the foundations of the meta-pyra...

  10. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

Full Text Available We describe results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists has worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing the diverse bodies of expertise and information that are necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.

  11. Metamodel-based inverse method for parameter identification: elastic-plastic damage model

    Science.gov (United States)

    Huang, Changwu; El Hami, Abdelkhalak; Radi, Bouchaïb

    2017-04-01

    This article proposes a metamodel-based inverse method for material parameter identification and applies it to elastic-plastic damage model parameter identification. An elastic-plastic damage model is presented and implemented in numerical simulation. The metamodel-based inverse method is proposed in order to overcome the high computational cost of the conventional inverse method. In the metamodel-based inverse method, a Kriging metamodel is constructed based on the experimental design in order to model the relationship between material parameters and the objective function values in the inverse problem, and the optimization procedure is then executed using the metamodel. The application of the presented material model and the proposed parameter identification method to the standard A 2017-T4 tensile test proves that the presented elastic-plastic damage model is adequate to describe the material's mechanical behaviour and that the proposed metamodel-based inverse method not only enhances the efficiency of parameter identification but also gives reliable results.
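
    A minimal sketch of the general surrogate-based identification loop follows, assuming a scikit-learn Gaussian process (Kriging-type) regressor and SciPy; the objective function is a cheap placeholder standing in for the misfit between simulated and measured tensile responses, so this shows the generic pattern rather than the authors' implementation.

      # Sketch of a Kriging-surrogate inverse identification loop; the
      # "expensive_objective" is an invented placeholder for a simulation misfit.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import qmc
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def expensive_objective(p):
          # placeholder misfit between simulated and measured responses
          return (p[0] - 0.7) ** 2 + 2.0 * (p[1] - 0.3) ** 2

      # 1) design of experiments over the (normalized) parameter space
      X = qmc.LatinHypercube(d=2, seed=0).random(30)
      y = np.array([expensive_objective(x) for x in X])

      # 2) fit the Kriging (Gaussian process) metamodel of the objective
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
      gp.fit(X, y)

      # 3) optimize the cheap surrogate instead of the simulator
      res = minimize(lambda p: gp.predict(p.reshape(1, -1))[0],
                     x0=np.array([0.5, 0.5]), bounds=[(0.0, 1.0), (0.0, 1.0)])
      print("identified parameters (surrogate optimum):", res.x)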

  12. Coordination of Project and Current Activities on the Basis of the Strategy Alignment Metamodel in the Oil and Gas Company

    Directory of Open Access Journals (Sweden)

    R. Yu. Dashkov

    2017-01-01

    Full Text Available Purpose: the purpose of this article is to describe the Strategy Alignment Metamodel of project and current activities, which allows the Goals and Strategies for the Phases of a project to be connected with the Goals and Strategies of the company at all levels of the organization through targeted measurement and the application of Interpretive Models. By building Networks of Goals and Strategies and adopting organizational solutions, the interaction of the Project Office and the departments of the company is coordinated. The methodology is based on a Logical Rationale of the Contexts and Assumptions for establishing Goals and Strategies both for the project and for the company, and on the preparation of Contexts and Assumptions, Goals and Strategies Alignment Matrices, which provides flexible adaptation to the internal and external environment in the process of selecting the most successful Strategies to achieve the Goals. Methods: this article is based on the concept of Goals-Questions-Metrics+Strategies, which is adapted here as a strategic monitoring and control system for projects: Goals-Phases-Metrics+Strategies. These concepts have formed the basis of the Strategy Alignment Metamodel, where the technology of Phases Earned Value Management is used as the measurement system for project activity, and the Balanced Scorecard is applied to current operations. Results: a Strategy Alignment Metamodel of the project and current activities of the company is proposed. It uses modern strategic monitoring and control systems for projects (Goals-Phases-Metrics+Strategies) and for the company (Goals-Questions-Metrics+Strategies). The interaction between these systems is based on Contexts and Assumptions, Goals and Strategies Alignment Matrices. The existence of such matrices greatly simplifies management decisions and prevents the risk of delays in the execution of project Phases based on rational participation and coordination of the company

  13. Effectiveness of meta-models for multi-objective optimization of centrifugal impeller

    International Nuclear Information System (INIS)

    Bellary, Sayed Ahmed Imran; Samad, Abdus; Husain, Afzal

    2014-01-01

    A major issue in multiple-fidelity-based analysis and optimization of fluid machinery systems is the proper construction of the low-fidelity model or meta-model. A low-fidelity model uses responses obtained from a high-fidelity model, and the meta-model is then used to produce the population of solutions required by an evolutionary algorithm for multi-objective optimization. The Pareto-optimal front, which shows the functional relationships among the multiple objectives, can be erroneous if the low-fidelity models are not well constructed. In the present research, response surface approximation and Kriging meta-models were evaluated for their effectiveness in turbomachinery design and optimization. A high-fidelity model (CFD) together with the metamodels was used to obtain the Pareto-optimal front via a multi-objective genetic algorithm. A centrifugal impeller was considered as a case study to find the relationship between two conflicting objectives, viz. hydraulic efficiency and head. Design variables from the impeller geometry were chosen, and the responses of the objective functions were evaluated through CFD analysis. The fidelity of each metamodel is discussed in the context of its predictions over the entire design space in general and near the optimal region in particular. Exploitation of multiple meta-models enhances the quality of multi-objective optimization and provides information on the fidelity of the optimization model. The Kriging meta-model was observed to be better suited to this type of problem, as it involved less approximation error in the Pareto-optimal front.

  14. Effectiveness of meta-models for multi-objective optimization of centrifugal impeller

    Energy Technology Data Exchange (ETDEWEB)

    Bellary, Sayed Ahmed Imran; Samad, Abdus [Indian Institute of Technology Madras, Chennai (India); Husain, Afzal [Sultan Qaboos University, Al-Khoudh (Oman)

    2014-12-15

    A major issue in multiple-fidelity-based analysis and optimization of fluid machinery systems is the proper construction of the low-fidelity model or meta-model. A low-fidelity model uses responses obtained from a high-fidelity model, and the meta-model is then used to produce the population of solutions required by an evolutionary algorithm for multi-objective optimization. The Pareto-optimal front, which shows the functional relationships among the multiple objectives, can be erroneous if the low-fidelity models are not well constructed. In the present research, response surface approximation and Kriging meta-models were evaluated for their effectiveness in turbomachinery design and optimization. A high-fidelity model (CFD) together with the metamodels was used to obtain the Pareto-optimal front via a multi-objective genetic algorithm. A centrifugal impeller was considered as a case study to find the relationship between two conflicting objectives, viz. hydraulic efficiency and head. Design variables from the impeller geometry were chosen, and the responses of the objective functions were evaluated through CFD analysis. The fidelity of each metamodel is discussed in the context of its predictions over the entire design space in general and near the optimal region in particular. Exploitation of multiple meta-models enhances the quality of multi-objective optimization and provides information on the fidelity of the optimization model. The Kriging meta-model was observed to be better suited to this type of problem, as it involved less approximation error in the Pareto-optimal front.
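
    The generic pattern behind the two records above (cheap surrogates of expensive objectives feeding a Pareto-front search) can be sketched as follows. The two objective functions are analytic placeholders rather than CFD responses, and simple non-dominated filtering of surrogate predictions stands in for the multi-objective genetic algorithm.

      # Two surrogates of placeholder objectives feed a simple Pareto filter.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(1)
      X = rng.uniform(0.0, 1.0, size=(40, 2))                  # sampled design variables
      eff = 1.0 - (X[:, 0] - 0.6) ** 2 - 0.3 * X[:, 1]         # placeholder "efficiency"
      head = 0.5 * X[:, 0] + X[:, 1]                           # placeholder "head"

      surr_eff = GaussianProcessRegressor(normalize_y=True).fit(X, eff)
      surr_head = GaussianProcessRegressor(normalize_y=True).fit(X, head)

      # dense candidate set evaluated on the cheap surrogates only
      cand = rng.uniform(0.0, 1.0, size=(2000, 2))
      obj = np.column_stack([surr_eff.predict(cand), surr_head.predict(cand)])

      # keep non-dominated designs (both objectives are maximized)
      pareto = [i for i, oi in enumerate(obj)
                if not np.any(np.all(obj >= oi, axis=1) & np.any(obj > oi, axis=1))]
      print(f"{len(pareto)} non-dominated designs out of {len(cand)}")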

  15. Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA

    Science.gov (United States)

    Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.

    2018-01-01

    Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3−) input functions by characterizing unsaturated zone NO3− transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous “vertical flux method” (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3− source concentration factor (which determines the local NO3− input concentration); unsaturated zone travel time; NO3− concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3− “extinction depth”, the eventual steady state depth of the NO3− front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52 – 0.86 and 0.22 – 0.38, respectively, and predictions were compiled as maps of the above response variables.

  16. Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA

    Science.gov (United States)

    Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.

    2018-04-01

    Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3-) input functions by characterizing unsaturated zone NO3- transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous "vertical flux method" (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3- source concentration factor (which determines the local NO3- input concentration); unsaturated zone travel time; NO3- concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3- "extinction depth", the eventual steady state depth of the NO3- front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52 - 0.86 and 0.22 - 0.38, respectively, and predictions were compiled as maps of the above response variables. Testing
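
    A miniature, hedged sketch of the boosted-regression-tree metamodelling step (cross-validated hyperparameter selection, then prediction over the mapped predictors) is shown below using scikit-learn; the predictor table and response are synthetic stand-ins for the GIS variables and VFM outputs described in the records above.

      # Boosted-regression-tree metamodel with cross-validated hyperparameters;
      # X and y are synthetic stand-ins for GIS predictors and VFM outputs.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import GridSearchCV

      rng = np.random.default_rng(0)
      X = rng.normal(size=(129, 10))                                       # mappable predictors at wells
      y = X[:, 0] - 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.3, size=129)   # e.g. travel time

      search = GridSearchCV(
          GradientBoostingRegressor(random_state=0),
          param_grid={"n_estimators": [100, 300],
                      "max_depth": [2, 3],
                      "learning_rate": [0.05, 0.1]},
          cv=5, scoring="r2")
      search.fit(X, y)

      print("best hyperparameters:", search.best_params_)
      print("cross-validated R^2 :", round(search.best_score_, 2))
      # search.best_estimator_ can then be applied to the full predictor grid
      # to produce a map of the response variable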

  17. Calculations of Sobol indices for the Gaussian process metamodel

    Energy Technology Data Exchange (ETDEWEB)

    Marrel, Amandine [CEA, DEN, DTN/SMTM/LMTE, F-13108 Saint Paul lez Durance (France)], E-mail: amandine.marrel@cea.fr; Iooss, Bertrand [CEA, DEN, DER/SESI/LCFR, F-13108 Saint Paul lez Durance (France); Laurent, Beatrice [Institut de Mathematiques, Universite de Toulouse (UMR 5219) (France); Roustant, Olivier [Ecole des Mines de Saint-Etienne (France)

    2009-03-15

    Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations and are often impractical for time-expensive computer codes. A well-known and widely used approach consists in replacing the computer code by a metamodel, which predicts the model responses with a negligible computation time and makes the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions of the Sobol indices. Two approaches are studied to compute the Sobol indices: the first is based on the predictor of the Gaussian process model and the second on the global stochastic process model. Comparisons between the two estimates, made on analytical examples, show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach makes it possible to integrate the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.

  18. Calculations of Sobol indices for the Gaussian process metamodel

    International Nuclear Information System (INIS)

    Marrel, Amandine; Iooss, Bertrand; Laurent, Beatrice; Roustant, Olivier

    2009-01-01

    Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations and are often impractical for time-expensive computer codes. A well-known and widely used approach consists in replacing the computer code by a metamodel, which predicts the model responses with a negligible computation time and makes the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions of the Sobol indices. Two approaches are studied to compute the Sobol indices: the first is based on the predictor of the Gaussian process model and the second on the global stochastic process model. Comparisons between the two estimates, made on analytical examples, show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach makes it possible to integrate the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
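
    The underlying idea (replace the expensive code by a Gaussian process metamodel and estimate Sobol indices from the cheap predictor) can be illustrated with a basic Monte Carlo pick-and-freeze estimator; the analytical GP-based expressions discussed in the paper are not reproduced here, and the simulator is a placeholder function.

      # First-order Sobol indices estimated on a Gaussian process surrogate of a
      # placeholder "expensive" code, via a basic pick-and-freeze estimator.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def expensive_code(x):                      # placeholder simulator
          return np.sin(x[:, 0]) + 0.7 * x[:, 1] ** 2 + 0.1 * x[:, 2]

      rng = np.random.default_rng(0)
      X_train = rng.uniform(-1.0, 1.0, size=(80, 3))
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
      gp.fit(X_train, expensive_code(X_train))

      N, d = 10000, 3
      A = rng.uniform(-1.0, 1.0, size=(N, d))
      B = rng.uniform(-1.0, 1.0, size=(N, d))
      yA, yB = gp.predict(A), gp.predict(B)
      V = yA.var()

      for i in range(d):
          Ci = B.copy()
          Ci[:, i] = A[:, i]                      # freeze input i at its A values
          Si = (np.mean(yA * gp.predict(Ci)) - yA.mean() * yB.mean()) / V
          print(f"first-order Sobol index S_{i + 1} ~ {Si:.2f}")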

  19. Metamodeling and optimization of the THF process with pulsating pressure

    Science.gov (United States)

    Bucconi, Marco; Strano, Matteo

    2018-05-01

    Tube hydroforming is a process used in various applications to form a tube into a desired complex shape by combining the use of internal pressure, which provides the stress required to yield the material, and axial feeding, which helps the material flow towards the bulging zone. Many studies have demonstrated how wrinkling and bursting defects can be severely reduced by means of a pulsating pressure, and how the so-called hammering hydroforming enhances the formability of the material. The definition of the optimum pressure and axial feeding profiles represents a daunting challenge in the design phase of the hydroforming operation of a new part. The quality of the formed part is highly dependent on the amplitude and the peak value of the pulsating pressure, along with the axial stroke. In this paper, research is reported that was conducted by means of explicit finite element simulations of a hammering THF operation and metamodeling techniques aimed at optimizing the process parameters for the production of a complex part. The improved formability is explored for different factors, and an optimization strategy is used to determine the most convenient pressure and axial feed profile curves for the hammering THF process of the examined part. It is shown how the pulsating pressure allows the energy input to the process to be minimized while still respecting final quality requirements.

  20. Some Surprising Introductory Physics Facts and Numbers

    Science.gov (United States)

    Mallmann, A. James

    2016-01-01

    In the entertainment world, people usually like, and find memorable, novels, short stories, and movies with surprise endings. This suggests that classroom teachers might want to present to their students examples of surprising facts associated with principles of physics. Possible benefits of finding surprising facts about principles of physics are…

  1. Evaluation of convergence behavior of metamodeling techniques for bridging scales in multi-scale multimaterial simulation

    International Nuclear Information System (INIS)

    Sen, Oishik; Davis, Sean; Jacobs, Gustaaf; Udaykumar, H.S.

    2015-01-01

    The effectiveness of several metamodeling techniques, viz. the Polynomial Stochastic Collocation method, the Adaptive Stochastic Collocation method, a Radial Basis Function Neural Network, a Kriging method and a Dynamic Kriging (DKG) method, is evaluated. This is done with the express purpose of using metamodels to bridge scales between micro- and macro-scale models in a multi-scale multimaterial simulation. The rate of convergence of the error when the metamodels are used to reconstruct hypersurfaces of known functions is studied. For a sufficiently large number of training points, the Stochastic Collocation methods generally converge faster than the other metamodeling techniques, while the DKG method converges faster when the number of input points is less than 100 in a two-dimensional parameter space. Because the input points correspond to computationally expensive micro/meso-scale computations, the DKG is favored for bridging scales in a multi-scale solver
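
    The convergence study can be reproduced in miniature: train surrogates on a growing number of samples of a known function and track the reconstruction error on an independent test set. The sketch below uses a SciPy radial basis function interpolant and a scikit-learn Gaussian process as stand-ins for two of the techniques compared.

      # Miniature convergence study: reconstruct a known 2-D function with two
      # surrogates and watch the test error fall as the training set grows.
      import numpy as np
      from scipy.interpolate import RBFInterpolator
      from sklearn.gaussian_process import GaussianProcessRegressor

      def truth(X):
          return np.sin(3.0 * X[:, 0]) * np.cos(2.0 * X[:, 1])

      rng = np.random.default_rng(0)
      X_test = rng.uniform(0.0, 1.0, size=(500, 2))
      y_test = truth(X_test)

      for n in (10, 25, 50, 100, 200):
          X_train = rng.uniform(0.0, 1.0, size=(n, 2))
          y_train = truth(X_train)
          rbf = RBFInterpolator(X_train, y_train)
          gp = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)
          err_rbf = np.sqrt(np.mean((rbf(X_test) - y_test) ** 2))
          err_gp = np.sqrt(np.mean((gp.predict(X_test) - y_test) ** 2))
          print(f"n={n:4d}  RMSE(RBF)={err_rbf:.3f}  RMSE(GP)={err_gp:.3f}")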

  2. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms.

    Directory of Open Access Journals (Sweden)

    Gabriel Oltean

    Full Text Available The design and verification of complex electronic systems, especially the analog and mixed-signal ones, prove to be extremely time consuming tasks, if only circuit-level simulations are involved. A significant amount of time can be saved if a cost effective solution is used for the extensive analysis of the system, under all conceivable conditions. This paper proposes a data-driven method to build fast to evaluate, but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameters combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1x10-5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general purpose computer), and simplicity (less than 1 second for running the metamodel, the user only provides the parameters combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space. A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each

  3. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms.

    Science.gov (United States)

    Oltean, Gabriel; Ivanciu, Laura-Nicoleta

    2016-01-01

    The design and verification of complex electronic systems, especially the analog and mixed-signal ones, prove to be extremely time consuming tasks, if only circuit-level simulations are involved. A significant amount of time can be saved if a cost effective solution is used for the extensive analysis of the system, under all conceivable conditions. This paper proposes a data-driven method to build fast to evaluate, but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neuronal network to derive the relevant coefficients of the wavelet transform for any new parameters combination. The resulted metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1x10-5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general purpose computer), and simplicity (less than 1 second for running the metamodel, the user only provides the parameters combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space. A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each parameter on the

  4. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms

    Science.gov (United States)

    Oltean, Gabriel; Ivanciu, Laura-Nicoleta

    2016-01-01

    The design and verification of complex electronic systems, especially the analog and mixed-signal ones, prove to be extremely time consuming tasks, if only circuit-level simulations are involved. A significant amount of time can be saved if a cost effective solution is used for the extensive analysis of the system, under all conceivable conditions. This paper proposes a data-driven method to build fast to evaluate, but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neuronal network to derive the relevant coefficients of the wavelet transform for any new parameters combination. The resulted metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1x10-5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general purpose computer), and simplicity (less than 1 second for running the metamodel, the user only provides the parameters combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space. A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each parameter on the
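
    A minimal sketch of the three-element pipeline (wavelet characterization, coefficient selection, neural network mapping parameters to coefficients) follows. It assumes the PyWavelets and scikit-learn packages, uses a fixed db4 wavelet in place of the genetic-algorithm-selected one, and generates a synthetic damped-oscillation family instead of the automotive waveforms used in the records above.

      # Wavelet + neural-network waveform metamodel on a synthetic waveform family
      # (db4 replaces the GA-selected wavelet; only the largest coefficients are learned).
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor

      t = np.linspace(0.0, 1.0, 256)

      def waveform(damping, freq):
          return np.exp(-damping * t) * np.sin(2.0 * np.pi * freq * t)

      rng = np.random.default_rng(0)
      params = rng.uniform([1.0, 3.0], [6.0, 8.0], size=(200, 2))   # (damping, frequency)

      rows, slices = [], None
      for damping, freq in params:
          arr, slices = pywt.coeffs_to_array(pywt.wavedec(waveform(damping, freq), "db4", level=4))
          rows.append(arr)
      rows = np.array(rows)

      keep = np.argsort(np.abs(rows).mean(axis=0))[-40:]            # most relevant coefficients
      net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
      net.fit(params, rows[:, keep])

      # generate a not-yet simulated waveform for a new parameter combination
      arr = np.zeros(rows.shape[1])
      arr[keep] = net.predict(np.array([[2.5, 5.5]]))[0]
      new_wave = pywt.waverec(pywt.array_to_coeffs(arr, slices, output_format="wavedec"), "db4")
      print(new_wave.shape)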

  5. A surprising exception. Himachal's success in promoting female education.

    Science.gov (United States)

    Dreze, J

    1999-01-01

    Gender inequalities in India are derived partly from the economic dependence of women on men. Low levels of formal education among women reinforce the asymmetry of power between the sexes. A general pattern of sharp gender bias in education levels is noted in most Indian states; however, in the small state of Himachal Pradesh, school participation rates are almost as high for girls as for boys. The rate of school participation for girls at the primary level is close to universal in this state, and while gender bias persists at higher levels of education, it is much lower than elsewhere in India and rapidly declining. This was not the case 50 years ago; educational levels in Himachal Pradesh were no higher than in Bihar or Uttar Pradesh. Today, the spectacular transition towards universal elementary education in Himachal Pradesh has contributed to the impressive reduction of poverty, mortality, illness, undernutrition, and related deprivations.

  6. Kolb's Experiential Learning Theory: A Meta-Model for Career Exploration.

    Science.gov (United States)

    Atkinson, George, Jr.; Murrell, Patricia H.

    1988-01-01

    Kolb's experiential learning theory offers the career counselor a meta-model with which to structure career exploration exercises and ensure a thorough investigation of self and the world of work in a manner that provides the client with an optimal amount of learning and personal development. (Author)

  7. Aggregate meta-models for evolutionary multiobjective and many-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Pilát, Martin; Neruda, Roman

    Roč. 116, 20 September (2013), s. 392-402 ISSN 0925-2312 R&D Projects: GA ČR GAP202/11/1368 Institutional support: RVO:67985807 Keywords : evolutionary algorithms * multiobjective optimization * many-objective optimization * surrogate models * meta-models * memetic algorithm Subject RIV: IN - Informatics, Computer Science Impact factor: 2.005, year: 2013

  8. Meta-modelling, visualization and emulation of multi-dimensional data for virtual production intelligence

    Science.gov (United States)

    Schulz, Wolfgang; Hermanns, Torsten; Al Khawli, Toufik

    2017-07-01

    Decision making for competitive production in high-wage countries is a daily challenge where rational and irrational methods are used. The design of decision making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin's rule for decision making, formulated in London in 1772 and called by him "Prudential Algebra" in the sense of prudential reasons, one of the major ingredients of Meta-Modelling can be identified, finally leading to a single algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes the advances in Meta-Modelling techniques applied to multi-dimensional and multi-criterial optimization by identifying the persistence level of the corresponding Morse-Smale Complex. Implementations for laser cutting and laser drilling are presented, including the generation of fast and frugal Meta-Models with controlled error based on mathematical model reduction. Reduced Models are derived to avoid any unnecessary complexity. Both model reduction and analysis of the multi-dimensional parameter space are used to enable interactive communication between Discovery Finders and Invention Makers. Emulators and visualizations of a metamodel are introduced as components of Virtual Production Intelligence, making the methods of Scientific Design Thinking applicable and helping both the developer and the operator become more skilled.

  9. Customized sequential designs for random simulation experiments: Kriging metamodeling and bootstrapping

    NARCIS (Netherlands)

    Beers, van W.C.M.; Kleijnen, J.P.C.

    2005-01-01

    This paper proposes a novel method to select an experimental design for interpolation in random simulation, especially discrete event simulation. (Though the paper focuses on Kriging, this design approach may also apply to other types of metamodels such as linear regression models.) Assuming that

  10. Ufo-element presentation in metamodel structure of triune continuum paradigm

    OpenAIRE

    Ukrayinets, ?.

    2006-01-01

    This paper describes the results of a formal description of the UFO-element within the metamodel structure of the Triune Continuum Paradigm. This can help solve the problem of developing methods for the mutual transformation of system-object UFO models and UML models, in order to support more effective information systems design, in particular the integration of the visual modelling CASE tools Rational Rose and UFO-Toolkit.

  11. A kriging metamodel-assisted robust optimization method based on a reverse model

    Science.gov (United States)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimum and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures where a large amount of computational effort is required because the robustness of each candidate solution delivered from the outer level should be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced into a single-loop optimization structure to ease the computational burden. Ignoring the interpolation uncertainties from kriging, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed because of the interpolation uncertainties from the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.
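
    The single-loop idea (rank candidate designs by a robustness measure evaluated on the Kriging surrogate rather than on the expensive model) can be sketched as below; this is not the K-RMRO/IK-RMRO algorithm itself, only the underlying pattern, with a placeholder response and a simple worst-case score over an assumed uncertainty interval.

      # Surrogate-assisted robust ranking: score each candidate design by its
      # worst-case Kriging prediction over an assumed uncertainty interval.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def expensive_response(x):                   # placeholder for the true model
          return np.sin(5.0 * x) + 0.5 * x ** 2

      rng = np.random.default_rng(0)
      X_train = rng.uniform(0.0, 2.0, size=(25, 1))
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
      gp.fit(X_train, expensive_response(X_train[:, 0]))

      delta = 0.1                                  # assumed uncertainty on the design variable
      candidates = np.linspace(0.1, 1.9, 50)
      scores = [gp.predict(np.linspace(x - delta, x + delta, 21).reshape(-1, 1)).max()
                for x in candidates]               # worst case when minimizing

      print("robust optimum (surrogate estimate):", round(candidates[int(np.argmin(scores))], 3))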

  12. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML

    NARCIS (Netherlands)

    Koper, Rob

    2003-01-01

    This text is a short summary of the work on pedagogical analysis carried out when EML (Educational Modelling Language) was being developed. Because we address pedagogical meta-models, the consequence is that I must justify the underlying pedagogical models it describes. I have included a (far from

  13. Feature and Meta-Models in Clafer: Mixed, Specialized, and Coupled

    DEFF Research Database (Denmark)

    Bąk, Kacper; Czarnecki, Krzysztof; Wasowski, Andrzej

    2011-01-01

    constraints (such as mapping feature configurations to component configurations or model templates). Clafer also allows arranging models into multiple specialization and extension layers via constraints and inheritance. We identify four key mechanisms allowing a meta-modeling language to express feature...

  14. Meta-modeling soil organic carbon sequestration potential and its application at regional scale.

    Science.gov (United States)

    Luo, Zhongkui; Wang, Enli; Bryan, Brett A; King, Darran; Zhao, Gang; Pan, Xubin; Bende-Michl, Ulrike

    2013-03-01

    Upscaling the results from process-based soil-plant models to assess regional soil organic carbon (SOC) change and sequestration potential is a great challenge due to the lack of detailed spatial information, particularly soil properties. Meta-modeling can be used to simplify and summarize process-based models and significantly reduce the demand for input data and thus could be easily applied on regional scales. We used the pre-validated Agricultural Production Systems sIMulator (APSIM) to simulate the impact of climate, soil, and management on SOC at 613 reference sites across Australia's cereal-growing regions under a continuous wheat system. We then developed a simple meta-model to link the APSIM-modeled SOC change to primary drivers, i.e., the amount of recalcitrant SOC, plant available water capacity of soil, soil pH, and solar radiation, temperature, and rainfall in the growing season. Based on high-resolution soil texture data and 8165 climate data points across the study area, we used the meta-model to assess SOC sequestration potential and the uncertainty associated with the variability of soil characteristics. The meta-model explained 74% of the variation of final SOC content as simulated by APSIM. Applying the meta-model to Australia's cereal-growing regions reveals regional patterns in SOC, with higher SOC stock in cool, wet regions. Overall, the potential SOC stock ranged from 21.14 to 152.71 Mg/ha with a mean of 52.18 Mg/ha. Variation of soil properties induced uncertainty ranging from 12% to 117% with higher uncertainty in warm, wet regions. In general, soils in Australia's cereal-growing regions under continuous wheat production were simulated as a sink of atmospheric carbon dioxide with a mean sequestration potential of 8.17 Mg/ha.

  15. The role of surprise in satisfaction judgements

    NARCIS (Netherlands)

    Vanhamme, J.; Snelders, H.M.J.J.

    2001-01-01

    Empirical findings suggest that surprise plays an important role in consumer satisfaction, but there is a lack of theory to explain why this is so. The present paper provides explanations for the process through which positive (negative) surprise might enhance (reduce) consumer satisfaction. First,

  16. Climate Change as a Predictable Surprise

    International Nuclear Information System (INIS)

    Bazerman, M.H.

    2006-01-01

    In this article, I analyze climate change as a 'predictable surprise', an event that leads an organization or nation to react with surprise, despite the fact that the information necessary to anticipate the event and its consequences was available (Bazerman and Watkins, 2004). I then assess the cognitive, organizational, and political reasons why society fails to implement wise strategies to prevent predictable surprises generally and climate change specifically. Finally, I conclude with an outline of a set of response strategies to overcome barriers to change

  17. Mapping ground water vulnerability to pesticide leaching with a process-based metamodel of EuroPEARL.

    Science.gov (United States)

    Tiktak, A; Boesten, J J T I; van der Linden, A M A; Vanclooster, M

    2006-01-01

    To support EU policy, indicators of pesticide leaching at the European level are required. For this reason, a metamodel of the spatially distributed European pesticide leaching model EuroPEARL was developed. EuroPEARL considers transient flow and solute transport and assumes Freundlich adsorption, first-order degradation and passive plant uptake of pesticides. Physical parameters are depth dependent while (bio)-chemical parameters are depth, temperature, and moisture dependent. The metamodel is based on an analytical expression that describes the mass fraction of pesticide leached. The metamodel ignores vertical parameter variations and assumes steady flow. The calibration dataset was generated with EuroPEARL and consisted of approximately 60,000 simulations done for 56 pesticides with different half-lives and partitioning coefficients. The target variable was the 80th percentile of the annual average leaching concentration at 1-m depth from a time series of 20 yr. The metamodel explains over 90% of the variation of the original model with only four independent spatial attributes. These parameters are available in European soil and climate databases, so that the calibrated metamodel could be applied to generate maps of the predicted leaching concentration in the European Union. Maps generated with the metamodel showed a good similarity with the maps obtained with EuroPEARL, which was confirmed by means of quantitative performance indicators.

  18. X rays and radioactivity: a complete surprise

    International Nuclear Information System (INIS)

    Radvanyi, P.; Bordry, M.

    1995-01-01

    The discoveries of X rays and of radioactivity came as complete experimental surprises; the physicists, at that time, had no previous hint of a possible structure of atoms. It is difficult now, knowing what we know, to replace ourselves in the spirit, astonishment and questioning of these years, between 1895 and 1903. The nature of X rays was soon hypothesized, but the nature of the rays emitted by uranium, polonium and radium was much more difficult to disentangle, as they were a mixture of different types of radiations. The origin of the energy continuously released in radioactivity remained a complete mystery for a few years. The multiplicity of the radioactive substances became soon a difficult matter: what was real and what was induced ? Isotopy was still far ahead. It appeared that some radioactive substances had ''half-lifes'': were they genuine radioactive elements or was it just a transitory phenomenon ? Henri Becquerel (in 1900) and Pierre and Marie Curie (in 1902) hesitated on the correct answer. Only after Ernest Rutherford and Frederick Soddy established that radioactivity was the transmutation of one element into another, could one understand that a solid element transformed into a gaseous element, which in turn transformed itself into a succession of solid radioactive elements. It was only in 1913 - after the discovery of the atomic nucleus -, through precise measurements of X ray spectra, that Henry Moseley showed that the number of electrons of a given atom - and the charge of its nucleus - was equal to its atomic number in the periodic table. (authors)

  19. X rays and radioactivity: a complete surprise

    Energy Technology Data Exchange (ETDEWEB)

    Radvanyi, P. [Laboratoire National Saturne, Centre d`Etudes de Saclay, 91 - Gif-sur-Yvette (France); Bordry, M. [Institut du Radium, 75 - Paris (France)

    1995-12-31

    The discoveries of X rays and of radioactivity came as complete experimental surprises; the physicists, at that time, had no previous hint of a possible structure of atoms. It is difficult now, knowing what we know, to replace ourselves in the spirit, astonishment and questioning of these years, between 1895 and 1903. The nature of X rays was soon hypothesized, but the nature of the rays emitted by uranium, polonium and radium was much more difficult to disentangle, as they were a mixture of different types of radiations. The origin of the energy continuously released in radioactivity remained a complete mystery for a few years. The multiplicity of the radioactive substances became soon a difficult matter: what was real and what was induced ? Isotopy was still far ahead. It appeared that some radioactive substances had ``half-lifes``: were they genuine radioactive elements or was it just a transitory phenomenon ? Henri Becquerel (in 1900) and Pierre and Marie Curie (in 1902) hesitated on the correct answer. Only after Ernest Rutherford and Frederick Soddy established that radioactivity was the transmutation of one element into another, could one understand that a solid element transformed into a gaseous element, which in turn transformed itself into a succession of solid radioactive elements. It was only in 1913 - after the discovery of the atomic nucleus -, through precise measurements of X ray spectra, that Henry Moseley showed that the number of electrons of a given atom - and the charge of its nucleus - was equal to its atomic number in the periodic table. (authors).

  20. An improved version of Inverse Distance Weighting metamodel assisted Harmony Search algorithm for truss design optimization

    Directory of Open Access Journals (Sweden)

    Y. Gholipour

    Full Text Available This paper focuses on a metamodel-based design optimization algorithm. The intention is to improve its computational cost and convergence rate. The metamodel-based optimization method introduced here provides the necessary means to reduce the computational cost and improve the convergence rate of the optimization through a surrogate. This algorithm is a combination of a high quality approximation technique called Inverse Distance Weighting and a meta-heuristic algorithm called Harmony Search. The outcome is then polished by a semi-tabu search algorithm. This algorithm adopts a filtering system and determines solution vectors where exact simulation should be applied. The performance of the algorithm is evaluated on standard truss design problems, showing a significant decrease in computational effort and an improvement in convergence rate.
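
    The inverse distance weighting approximation at the core of the algorithm is straightforward to write down; the hedged sketch below shows only the surrogate predictor, with the Harmony Search and semi-tabu filtering stages omitted and toy data in place of exact truss analyses.

      # Minimal inverse distance weighting (IDW) surrogate: a prediction is the
      # distance-weighted average of already-simulated designs.
      import numpy as np

      def idw_predict(X_known, y_known, X_query, power=2.0, eps=1e-12):
          preds = []
          for q in np.atleast_2d(X_query):
              d = np.linalg.norm(X_known - q, axis=1)
              if d.min() < eps:                    # query coincides with a known sample
                  preds.append(y_known[int(np.argmin(d))])
                  continue
              w = 1.0 / d ** power
              preds.append(np.sum(w * y_known) / np.sum(w))
          return np.array(preds)

      # toy usage: approximate a truss-weight-like response from a few exact analyses
      X_known = np.array([[0.2, 0.1], [0.5, 0.4], [0.8, 0.9], [0.3, 0.7]])
      y_known = np.array([12.0, 9.5, 14.2, 10.8])
      print(idw_predict(X_known, y_known, np.array([[0.4, 0.5]])))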

  1. UML Profile for Mining Process: Supporting Modeling and Simulation Based on Metamodels of Activity Diagram

    Directory of Open Access Journals (Sweden)

    Andrea Giubergia

    2014-01-01

    Full Text Available A UML profile describes a lightweight extension mechanism to the UML by defining custom stereotypes, tagged values, and constraints. Profiles are used to adapt the UML metamodel to different platforms and domains. In this paper we present a UML profile for models supporting event-driven simulation. In particular, we use the Arena simulation tool and we focus on the mining process domain. Profiles provide an easy way to obtain well-defined specifications, regulated by the Object Management Group (OMG). They can be used as a presimulation technique to obtain solid models for the mining industry. In this work we present a new profile to extend the UML metamodel; in particular we focus on the activity diagram. This extended model is applied to an industry problem involving the loading and transportation of minerals in the mining process domain.

  2. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    Full Text Available The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  3. Low cost metamodel for robust design of periodic nonlinear coupled micro-systems

    Directory of Open Access Journals (Sweden)

    Chikhaoui K.

    2016-01-01

    Full Text Available To achieve robust design in the presence of uncertainty, nonlinearity and structural periodicity, a metamodel combining the Latin Hypercube Sampling (LHS) method for uncertainty propagation and an enriched Craig-Bampton Component Mode Synthesis approach (CB-CMS) for model reduction is proposed. Its application to predict the time responses of a stochastic periodic nonlinear micro-system proves its efficiency in terms of accuracy and reduction of computational cost.
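
    The Latin Hypercube Sampling half of the combination is easy to illustrate with SciPy's qmc module; in the sketch below the Craig-Bampton reduced model is replaced by a cheap placeholder function, so only the sampling-and-propagation pattern is shown.

      # Latin Hypercube Sampling of uncertain parameters propagated through a
      # placeholder reduced model (the CB-CMS reduction itself is not shown).
      import numpy as np
      from scipy.stats import qmc

      unit_samples = qmc.LatinHypercube(d=3, seed=42).random(n=200)
      lower, upper = np.array([0.9, 0.01, 0.8]), np.array([1.1, 0.05, 1.2])
      samples = qmc.scale(unit_samples, lower, upper)     # e.g. stiffness, damping, coupling

      def reduced_model_response(p):                      # placeholder reduced-order model
          k, c, g = p
          return 1.0 / np.sqrt((k - 1.0) ** 2 + (c * g) ** 2 + 1e-3)

      responses = np.array([reduced_model_response(p) for p in samples])
      print("mean response:", responses.mean(), "95th percentile:", np.percentile(responses, 95))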

  4. A meta-model based approach for rapid formability estimation of continuous fibre reinforced components

    Science.gov (United States)

    Zimmerling, Clemens; Dörr, Dominik; Henning, Frank; Kärger, Luise

    2018-05-01

    Due to their high mechanical performance, continuous fibre reinforced plastics (CoFRP) become increasingly important for load bearing structures. In many cases, manufacturing CoFRPs comprises a forming process of textiles. To predict and optimise the forming behaviour of a component, numerical simulations are applied. However, for maximum part quality, both the geometry and the process parameters must match in mutual regard, which in turn requires numerous numerically expensive optimisation iterations. In both textile and metal forming, a lot of research has focused on determining optimum process parameters, whilst regarding the geometry as invariable. In this work, a meta-model based approach on component level is proposed, that provides a rapid estimation of the formability for variable geometries based on pre-sampled, physics-based draping data. Initially, a geometry recognition algorithm scans the geometry and extracts a set of doubly-curved regions with relevant geometry parameters. If the relevant parameter space is not part of an underlying data base, additional samples via Finite-Element draping simulations are drawn according to a suitable design-table for computer experiments. Time saving parallel runs of the physical simulations accelerate the data acquisition. Ultimately, a Gaussian Regression meta-model is built from the data base. The method is demonstrated on a box-shaped generic structure. The predicted results are in good agreement with physics-based draping simulations. Since evaluations of the established meta-model are numerically inexpensive, any further design exploration (e.g. robustness analysis or design optimisation) can be performed in short time. It is expected that the proposed method also offers great potential for future applications along virtual process chains: For each process step along the chain, a meta-model can be set-up to predict the impact of design variations on manufacturability and part performance. Thus, the method is

  5. Accelerated optimizations of an electromagnetic acoustic transducer with artificial neural networks as metamodels

    Directory of Open Access Journals (Sweden)

    S. Wang

    2017-08-01

    Full Text Available Electromagnetic acoustic transducers (EMATs) are noncontact transducers generating ultrasonic waves directly in the conductive sample. Despite the advantages, their transduction efficiencies are relatively low, so it is imperative to build accurate multiphysics models of EMATs and optimize the structural parameters accordingly, using a suitable optimization algorithm. The optimizing process often involves a large number of runs of the computationally expensive numerical models, so metamodels as substitutes for the real numerical models are helpful for the optimizations. In this work the focus is on the artificial neural networks as the metamodels of an omnidirectional EMAT, including the multilayer feedforward networks trained with the basic and improved back propagation algorithms and the radial basis function networks with exact and nonexact interpolations. The developed neural-network programs are tested on an example problem. Then the model of an omnidirectional EMAT generating Lamb waves in a linearized steel plate is introduced, and various approaches to calculate the amplitudes of the displacement component waveforms are discussed. The neural-network metamodels are then built for the EMAT model and compared to the displacement component amplitude (or ratio of amplitudes) surface data on a discrete grid of the design variables as the reference, applying a multifrequency model with FFT (fast Fourier transform)/IFFT (inverse FFT) processing. Finally the two-objective optimization problem is formulated with one objective function minimizing the ratio of the amplitude of the S0-mode Lamb wave to that of the A0 mode, and the other minimizing the negative of the amplitude of the A0 mode. Pareto fronts in the criterion space are solved with the neural-network models and the total time consumption is greatly decreased. From the study it could be observed that the radial basis function network with exact interpolation has the best

  6. Linear regression metamodeling as a tool to summarize and present simulation model results.

    Science.gov (United States)

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
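
    The approach can be sketched directly: draw parameter values as in a probabilistic sensitivity analysis, compute a (here toy) model outcome for each draw, and regress the outcome on the standardized inputs so that the intercept approximates the base-case result and the coefficients act as sensitivity measures. Parameter names and distributions below are illustrative only.

      # Linear regression metamodel of probabilistic sensitivity analysis output:
      # regress the outcome on standardized inputs; the intercept approximates the
      # base case and the coefficients act as sensitivity measures (toy model).
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      n = 10_000
      p_cure = rng.beta(20, 30, n)                    # uncertain model parameters
      cost_treat = rng.gamma(50, 200, n)
      utility = rng.normal(0.8, 0.05, n)

      net_benefit = 50_000 * p_cure * utility - cost_treat   # toy decision-model outcome

      X = np.column_stack([p_cure, cost_treat, utility])
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # standardized inputs
      reg = LinearRegression().fit(Xs, net_benefit)

      print("estimated base-case outcome (intercept):", round(reg.intercept_, 1))
      for name, coef in zip(["p_cure", "cost_treat", "utility"], reg.coef_):
          print(f"sensitivity of outcome to {name}: {coef:.1f}")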

  7. Metamodel for Efficient Estimation of Capacity-Fade Uncertainty in Li-Ion Batteries for Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Jaewook Lee

    2015-06-01

    Full Text Available This paper presents an efficient method for estimating capacity-fade uncertainty in lithium-ion batteries (LIBs) in order to integrate them into the battery-management system (BMS) of electric vehicles, which requires simple and inexpensive computation for successful application. The study uses the pseudo-two-dimensional (P2D) electrochemical model, which simulates the battery state by solving a system of coupled nonlinear partial differential equations (PDEs). The model parameters that are responsible for electrode degradation are identified and estimated, based on battery data obtained from the charge cycles. The Bayesian approach, with parameters estimated by probability distributions, is employed to account for uncertainties arising in the model and battery data. The Markov Chain Monte Carlo (MCMC) technique is used to draw samples from the distributions. The complex computations that solve a PDE system for each sample are avoided by employing a polynomial-based metamodel. As a result, the computational cost is reduced from 5.5 h to a few seconds, enabling the integration of the method into the vehicle BMS. Using this approach, the conservative bound of capacity fade can be determined for the vehicle in service, which represents the safety margin reflecting the uncertainty.
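
    The computational trick (a cheap polynomial response surface standing in for the electrochemical model inside the sampler) can be sketched with a simple Metropolis loop; the capacity-fade function, observed value, and priors below are placeholders, not the study's P2D model or data.

      # Polynomial metamodel standing in for an expensive electrochemical model
      # inside a Metropolis MCMC loop (placeholder physics, priors and data).
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import PolynomialFeatures

      def expensive_capacity_fade(theta):             # placeholder for the P2D model
          return 0.05 + 0.3 * theta[0] + 0.2 * theta[1] ** 2

      rng = np.random.default_rng(0)
      thetas = rng.uniform(0.0, 1.0, size=(50, 2))    # training runs of the "expensive" model
      fades = np.array([expensive_capacity_fade(t) for t in thetas])
      poly = PolynomialFeatures(degree=3)
      meta = LinearRegression().fit(poly.fit_transform(thetas), fades)

      def predict_fade(theta):                        # cheap metamodel evaluation
          return meta.predict(poly.transform(np.atleast_2d(theta)))[0]

      observed, sigma = 0.21, 0.02                    # placeholder measurement and noise
      def log_post(theta):
          if np.any(theta < 0.0) or np.any(theta > 1.0):
              return -np.inf
          return -0.5 * ((predict_fade(theta) - observed) / sigma) ** 2

      chain, theta = [], np.array([0.5, 0.5])
      for _ in range(5000):
          proposal = theta + rng.normal(scale=0.05, size=2)
          if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
              theta = proposal
          chain.append(theta.copy())
      print("posterior mean of degradation parameters:", np.array(chain)[2000:].mean(axis=0))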

  8. Surprise: a belief or an emotion?

    Science.gov (United States)

    Mellers, Barbara; Fincher, Katrina; Drummond, Caitlin; Bigony, Michelle

    2013-01-01

    Surprise is a fundamental link between cognition and emotion. It is shaped by cognitive assessments of likelihood, intuition, and superstition, and it in turn shapes hedonic experiences. We examine this connection between cognition and emotion and offer an explanation called decision affect theory. Our theory predicts the affective consequences of mistaken beliefs, such as overconfidence and hindsight. It provides insight about why the pleasure of a gain can loom larger than the pain of a comparable loss. Finally, it explains cross-cultural differences in emotional reactions to surprising events. By changing the nature of the unexpected (from chance to good luck), one can alter the emotional reaction to surprising events. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Viral marketing: the use of surprise

    NARCIS (Netherlands)

    Lindgreen, A.; Vanhamme, J.; Clarke, I.; Flaherty, T.B.

    2005-01-01

    Viral marketing involves consumers passing along a company's marketing message to their friends, family, and colleagues. This chapter reviews viral marketing campaigns and argues that the emotion of surprise often is at work and that this mechanism resembles that of word-of-mouth marketing.

  10. Exploration, Novelty, Surprise and Free Energy Minimisation

    Directory of Open Access Journals (Sweden)

    Philipp eSchwartenbeck

    2013-10-01

    Full Text Available This paper reviews recent developments under the free energy principle that introduce a normative perspective on classical economic (utilitarian) decision-making based on (active) Bayesian inference. It has been suggested that the free energy principle precludes novelty and complexity, because it assumes that biological systems – like ourselves – try to minimise the long-term average of surprise to maintain their homeostasis. However, recent formulations show that minimising surprise leads naturally to concepts such as exploration and novelty bonuses. In this approach, agents infer a policy that minimises surprise by minimising the difference (or relative entropy) between likely and desired outcomes, which involves both pursuing the goal-state that has the highest expected utility (often termed ‘exploitation’) and visiting a number of different goal-states (‘exploration’). Crucially, the opportunity to visit new states increases the value of the current state. Casting decision-making problems within a variational framework, therefore, predicts that our behaviour is governed by both the entropy and expected utility of future states. This dissolves any dialectic between minimising surprise and exploration or novelty seeking.

  11. Glial heterotopia of maxilla: A clinical surprise

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Mahalik

    2011-01-01

    Full Text Available Glial heterotopia is a rare congenital mass lesion which often presents as a clinical surprise. We report a case of extranasal glial heterotopia in a neonate with unusual features. The presentation, management strategy, etiopathogenesis and histopathology of the mass lesion has been reviewed.

  12. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

    Directory of Open Access Journals (Sweden)

    Михаил Юрьевич Чернышов

    2013-12-01

    Full Text Available A software complex (SC) elaborated by the authors on the basis of the language LMPL and representing a software tool intended for synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP) principles is described. LMPL provides for an explicit form of declarative representation of MP-models, presumes automatic constructing and transformation of models and the capability of adding external software packages. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling) and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

  13. Radar Design to Protect Against Surprise

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Technological and doctrinal surprise is about rendering preparations for conflict as irrelevant or ineffective. For a sensor, this means essentially rendering the sensor as irrelevant or ineffective in its ability to help determine truth. Recovery from this sort of surprise is facilitated by flexibility in our own technology and doctrine. For a sensor, this means flexibility in its architecture, design, tactics, and the designing organizations' processes. Acknowledgements: This report is the result of an unfunded research and development activity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  14. Metamodeling-based approach for risk assessment and cost estimation: Application to geological carbon sequestration planning

    Science.gov (United States)

    Sun, Alexander Y.; Jeong, Hoonyoung; González-Nicolás, Ana; Templeton, Thomas C.

    2018-04-01

    Carbon capture and storage (CCS) is being evaluated globally as a geoengineering measure for significantly reducing greenhouse gas emissions. However, long-term liability associated with potential leakage from these geologic repositories is perceived as a main barrier of entry to site operators. Risk quantification and impact assessment help CCS operators to screen candidate sites for suitability of CO2 storage. Leakage risks are highly site dependent, and a quantitative understanding and categorization of these risks can only be made possible through broad participation and deliberation of stakeholders, with the use of site-specific, process-based models as the decision basis. Online decision making, however, requires that scenarios be run in real time. In this work, a Python-based Leakage Assessment and Cost Estimation (PyLACE) web application was developed for quantifying financial risks associated with potential leakage from geologic carbon sequestration sites. PyLACE aims to assist a collaborative, analytic-deliberative decision making process by automating metamodel creation, knowledge sharing, and online collaboration. In PyLACE, metamodeling, which is a process of developing faster-to-run surrogates of process-level models, is enabled using a special stochastic response surface method and Gaussian process regression. Both methods allow consideration of model parameter uncertainties and the use of that information to generate confidence intervals on model outputs. Training of the metamodels is delegated to a high performance computing cluster and is orchestrated by a set of asynchronous job scheduling tools for job submission and result retrieval. As a case study, the workflow and main features of PyLACE are demonstrated using a multilayer carbon storage model.
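
    The metamodeling step described above can be sketched with a Gaussian process regression surrogate whose predictive standard deviation supplies the confidence intervals. The leakage function and its two inputs are hypothetical stand-ins, and scikit-learn's GaussianProcessRegressor is used here rather than the specific tools behind PyLACE.

      # Fit a Gaussian process surrogate to a few runs of a (stand-in) leakage model
      # and report predictions with 95% confidence intervals.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(1)

      def leakage_model(x):
          # Hypothetical stand-in for an expensive reservoir simulation
          # (inputs: normalised permeability and injection rate).
          return 10.0 * x[:, 0] ** 2 + 2.0 * x[:, 1]

      X_train = rng.uniform(0.0, 1.0, size=(40, 2))
      y_train = leakage_model(X_train)

      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.5, 0.5]),
                                    normalize_y=True)
      gp.fit(X_train, y_train)

      X_new = rng.uniform(0.0, 1.0, size=(5, 2))
      mean, std = gp.predict(X_new, return_std=True)
      for m, s in zip(mean, std):
          print(f"predicted leakage cost: {m:.2f}  (95% CI: {m - 1.96*s:.2f} .. {m + 1.96*s:.2f})")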

  15. Pupil size tracks perceptual content and surprise.

    Science.gov (United States)

    Kloosterman, Niels A; Meindertsma, Thomas; van Loon, Anouk M; Lamme, Victor A F; Bonneh, Yoram S; Donner, Tobias H

    2015-04-01

    Changes in pupil size at constant light levels reflect the activity of neuromodulatory brainstem centers that control global brain state. These endogenously driven pupil dynamics can be synchronized with cognitive acts. For example, the pupil dilates during the spontaneous switches of perception of a constant sensory input in bistable perceptual illusions. It is unknown whether this pupil dilation only indicates the occurrence of perceptual switches, or also their content. Here, we measured pupil diameter in human subjects reporting the subjective disappearance and re-appearance of a physically constant visual target surrounded by a moving pattern ('motion-induced blindness' illusion). We show that the pupil dilates during the perceptual switches in the illusion and a stimulus-evoked 'replay' of that illusion. Critically, the switch-related pupil dilation encodes perceptual content, with larger amplitude for disappearance than re-appearance. This difference in pupil response amplitude enables prediction of the type of report (disappearance vs. re-appearance) on individual switches (receiver-operating characteristic: 61%). The amplitude difference is independent of the relative durations of target-visible and target-invisible intervals and subjects' overt behavioral report of the perceptual switches. Further, we show that pupil dilation during the replay also scales with the level of surprise about the timing of switches, but there is no evidence for an interaction between the effects of surprise and perceptual content on the pupil response. Taken together, our results suggest that pupil-linked brain systems track both the content of, and surprise about, perceptual events. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  16. A surprising palmar nevus: A case report

    Directory of Open Access Journals (Sweden)

    Rana Rafiei

    2018-02-01

    Full Text Available A raised palmar or plantar nevus, especially in white people, is an unusual feature. We present an uncommon palmar compound nevus in a 26-year-old woman with a large diameter (6 mm) which had a collaret-shaped margin. In histopathologic evaluation, intralymphatic protrusions of nevic nests were noted. This case was surprising to us for these reasons: the size, shape, location and histopathology of the lesion. Palmar nevi are usually junctional (flat and below 3 mm in diameter), and intralymphatic protrusion or invasion in nevi is an extremely rare phenomenon.

  17. Equation of state for dense nucleonic matter from metamodeling. II. Predictions for neutron star properties

    Science.gov (United States)

    Margueron, Jérôme; Hoffmann Casali, Rudiney; Gulminelli, Francesca

    2018-02-01

    Employing the recently proposed metamodeling for the nucleonic matter equation of state, we analyze neutron star global properties such as masses, radii, moment of inertia, and others. The impact of the uncertainty in empirical parameters on these global properties is analyzed in a Bayesian statistical approach. Physical constraints, such as causality and stability, are imposed on the equation of state and different hypotheses for the direct Urca (dUrca) process are investigated. In addition, only metamodels with maximum masses above 2 M⊙ are selected. Our main results are the following: the equation of state exhibits a universal behavior against the dUrca hypothesis under the condition of charge neutrality and β equilibrium; neutron stars, if composed exclusively of nucleons and leptons, have a radius of 12.7 ± 0.4 km for masses ranging from 1 up to 2 M⊙; a small radius lower than 11 km is very marginally compatible with our present knowledge of the nuclear empirical parameters; and finally, the most important empirical parameters which are still affected by large uncertainties and play an important role in determining the radius of neutron stars are the slope and curvature of the symmetry energy (Lsym and Ksym) and, to a lesser extent, the skewness parameters (Qsat/sym).

  18. Improved metamodel-based importance sampling for the performance assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Cadini, F.; Gioletta, A.; Zio, E.

    2015-01-01

    In the context of a probabilistic performance assessment of a radioactive waste repository, the estimation of the probability of exceeding the dose threshold set by a regulatory body is a fundamental task. This may become difficult when the probabilities involved are very small, since the classically used sampling-based Monte Carlo methods may become computationally impractical. This issue is further complicated by the fact that the computer codes typically adopted in this context require large computational efforts, both in terms of time and memory. This work proposes an original use of a Monte Carlo-based algorithm for (small) failure probability estimation in the context of the performance assessment of a near surface radioactive waste repository. The algorithm, developed within the context of structural reliability, makes use of an estimated optimal importance density and a surrogate, kriging-based metamodel approximating the system response. On the basis of an accurate analytic analysis of the algorithm, a modification is proposed which allows further reducing the computational efforts by a more effective training of the metamodel. - Highlights: • We tackle uncertainty propagation in a radwaste repository performance assessment. • We improve a kriging-based importance sampling for estimating failure probabilities. • We justify the modification by an analytic, comparative analysis of the algorithms. • The probability of exceeding dose thresholds in radwaste repositories is estimated. • The algorithm is further improved reducing the number of its free parameters
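
    A deliberately simplified sketch of the two ingredients combined in the paper, namely a kriging metamodel of the system response and an importance density centred on the likely failure region. It is not the authors' algorithm; the dose model, the threshold, and the crude choice of the importance-density centre are all invented for illustration.

      # Surrogate-assisted importance sampling for a small exceedance probability P[g(X) > t].
      import numpy as np
      from scipy.stats import multivariate_normal
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(2)

      def dose_model(x):                      # hypothetical stand-in for the repository code
          return 2.0 + 1.5 * x[:, 0] + 1.0 * x[:, 1] ** 3

      threshold = 8.0                          # notional dose threshold (arbitrary units)

      # 1) Kriging metamodel trained on a small number of expensive runs.
      X_train = rng.standard_normal((60, 2))
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
      gp.fit(X_train, dose_model(X_train))

      # 2) Centre a Gaussian importance density on the training point whose surrogate
      #    prediction is closest to the threshold (a crude proxy for the design point).
      centre = X_train[np.argmin(np.abs(gp.predict(X_train) - threshold))]
      f = multivariate_normal(mean=np.zeros(2), cov=np.eye(2))      # nominal input density
      h = multivariate_normal(mean=centre, cov=np.eye(2))           # importance density

      # 3) Importance-sampling estimate using only cheap surrogate evaluations.
      X_is = h.rvs(size=20000, random_state=3)
      weights = f.pdf(X_is) / h.pdf(X_is)
      p_fail = np.mean((gp.predict(X_is) > threshold) * weights)
      print("estimated probability of exceeding the dose threshold:", p_fail)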

  19. Increasing the efficiency of designing hemming processes by using an element-based metamodel approach

    Science.gov (United States)

    Kaiser, C.; Roll, K.; Volk, W.

    2017-09-01

    In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panels is initially split into individual segments. By parametrizing each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-like formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of designing hemming processes will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.

  20. Mapping ground water vulnerability to pesticide leaching with a process-based metamodel of EuroPEARL

    NARCIS (Netherlands)

    Tiktak, A.; Boesten, J.J.T.I.; Linden, van der A.M.A.; Vanclooster, M.

    2006-01-01

    To support EU policy, indicators of pesticide leaching at the European level are required. For this reason, a metamodel of the spatially distributed European pesticide leaching model EuroPEARL was developed. EuroPEARL considers transient flow and solute transport and assumes Freundlich adsorption,

  1. idSpace D2.2 – Semantic meta-model, integration and transformations v1

    DEFF Research Database (Denmark)

    Dolog, Peter; Lin, Yujian; Dols, Roger

    2009-01-01

    This report introduces a topic maps based meta-model for creativity techniques, creativity process, and idea maps as results from creativity process. It proposes a graph based and hierarchical graph based transformation of idea maps for combination and integration of results of different creativi...

  2. Primary Care Practice: Uncertainty and Surprise

    Science.gov (United States)

    Crabtree, Benjamin F.

    I will focus my comments on uncertainty and surprise in primary care practices. I am a medical anthropologist by training, and have been a full-time researcher in family medicine for close to twenty years. In this talk I want to look at primary care practices as complex systems, particularly taking the perspective of translating evidence into practice. I am going to discuss briefly the challenges we have in primary care, and in medicine in general, of translating new evidence into the everyday care of patients. To do this, I will look at two studies that we have conducted on family practices, then think about how practices can be best characterized as complex adaptive systems. Finally, I will focus on the implications of this portrayal for disseminating new knowledge into practice.

  3. Surprises and counterexamples in real function theory

    CERN Document Server

    Rajwade, A R

    2007-01-01

    This book presents a variety of intriguing, surprising and appealing topics and nonroutine theorems in real function theory. It is a reference book to which one can turn for the surprises and counterexamples that arise while studying or teaching analysis. Chapter 1 is an introduction to algebraic, irrational and transcendental numbers and contains the Cantor ternary set. Chapter 2 contains functions with extraordinary properties, such as functions that are continuous at each point but differentiable at no point. Chapters 4 and 5 cover the intermediate value property, periodic functions, Rolle's theorem, Taylor's theorem, and points of tangency. Chapter 6 discusses sequences and series, including the restricted harmonic series, the alternating harmonic series and some number-theoretic aspects. In Chapter 7, the peculiar range of convergence of an infinite exponential is studied. Appendix I deals with some specialized topics. Exercises at the end of chapters and their solutions are provided in Appendix II. This book will be useful for students and teachers alike.

  4. The conceptualization model problem—surprise

    Science.gov (United States)

    Bredehoeft, John

    2005-03-01

    The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.

  5. AMFIBIA: A Meta-Model for the Integration of Business Process Modelling Aspects

    DEFF Research Database (Denmark)

    Axenath, Björn; Kindler, Ekkart; Rubin, Vladimir

    2007-01-01

    AMFIBIA is a meta-model that formalises the essential aspects and concepts of business processes. Though AMFIBIA is not the first approach to formalising the aspects and concepts of business processes, it is more ambitious in the following respects: Firstly, it is independent of particular... modelling formalisms of business processes and it is designed in such a way that any formalism for modelling some aspect of a business process can be plugged into AMFIBIA. Therefore, AMFIBIA is formalism-independent. Secondly, it is not biased toward any aspect of business processes; the different aspects... can be considered and modelled independently of each other. Moreover, AMFIBIA is not restricted to a fixed set of aspects; new aspects of business processes can be easily integrated. Thirdly, AMFIBIA does not only name and relate the concepts of business process modelling, as it is typically done...

  6. A Meta-Model of Inter-Organisational Cooperation for the Transition to a Circular Economy

    Directory of Open Access Journals (Sweden)

    Alessandro Ruggieri

    2016-11-01

    Full Text Available The transition to a circular economy bodes well for a future of environmentally sustainable growth and economic development. The implications and advantages of a shift to a circular economy have been extensively demonstrated in the literature. What has not been sufficiently investigated is how this paradigm can be enabled through inter-organisational cooperation among different business enterprises. To illustrate this point, this paper aims to contribute to the circular economy debate by describing and discussing such a meta-model of inter-organisational cooperation. The study is based on the analysis of three cases from three different industries, from which we identified factors with potential impact on stimulating cooperation from a circular economy perspective. Finally, we discuss the relations between the case studies and formulate possible implications for both managers and researchers.

  7. Implementing a collaborative virtual environment — specification for a usability metamodel

    Directory of Open Access Journals (Sweden)

    Maria L Villegas R

    2009-01-01

    Full Text Available This research presents the results of the first phase of a macro-project for constructing a collaborative virtual environment. It was aimed at selecting a graphical interface from five proposed for such an environment, considering each one's level of usability. Several standards of usability and user-centered design patterns were studied to specify interface measurement criteria for a usability metamodel; this defined the variables and rules to be taken into account when measuring the graphic user interface (GUI) usability level for collaborative virtual environments. The use of metaphors when specifying graphic user interfaces is also briefly examined as a source of new usability and satisfaction related to such interface use.

  8. Uncertainty in Bus Arrival Time Predictions: Treating Heteroscedasticity With a Metamodel Approach

    DEFF Research Database (Denmark)

    O'Sullivan, Aidan; Pereira, Francisco Camara; Zhao, Jinhua

    2016-01-01

    Arrival time predictions for the next available bus or train are a key component of modern traveler information systems (TISs). A great deal of research has been conducted within the intelligent transportation system community in developing an assortment of different algorithms that seek...... sources. In this paper, we tackle the issue of uncertainty in bus arrival time predictions using an alternative approach. Rather than endeavor to develop a superior method for prediction, we take existing predictions from a TIS and treat the algorithm generating them as a black box. The presence...... of heteroscedasticity in the predictions is demonstrated and then a metamodel approach is deployed, which augments existing predictive systems using quantile regression to place bounds on the associated error. As a case study, this approach is applied to data from a real-world TIS in Boston. This method allows bounds...
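
    The metamodel idea can be sketched as follows: the TIS prediction is treated as a black box, and quantile regressions are fitted to its error so that heteroscedastic bounds can be attached to each prediction. The features and the synthetic error model below are placeholders, not the Boston case-study data.

      # Quantile-regression error bounds around a black-box arrival-time prediction.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(4)

      n = 5000
      predicted_wait = rng.uniform(1, 30, n)          # minutes, from the black-box TIS
      hour_of_day = rng.integers(6, 23, n)
      X = np.column_stack([predicted_wait, hour_of_day])

      # Synthetic "actual minus predicted" error whose spread grows with the horizon
      # (heteroscedasticity).
      error = rng.normal(0.0, 0.3 + 0.1 * predicted_wait)

      lower = GradientBoostingRegressor(loss="quantile", alpha=0.1).fit(X, error)
      upper = GradientBoostingRegressor(loss="quantile", alpha=0.9).fit(X, error)

      x_new = np.array([[25.0, 17]])                   # 25-minute prediction at 5 pm
      lo, hi = lower.predict(x_new)[0], upper.predict(x_new)[0]
      print(f"80% error band around the prediction: [{lo:+.1f}, {hi:+.1f}] minutes")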

  9. Developing and applying metamodels of high resolution process-based simulations for high throughput exposure assessment of organic chemicals in riverine ecosystems.

    Science.gov (United States)

    Craig Barber, M; Isaacs, Kristin K; Tebes-Stevens, Caroline

    2017-12-15

    As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), "(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels." The goals of metamodeling include, but are not limited to (1) developing functional or statistical relationships between a model's input and output variables for model analysis, interpretation, or information consumption by users' clients; (2) quantifying a model's sensitivity to alternative or uncertain forcing functions, initial conditions, or parameters; and (3) characterizing the model's response or state space. Using five models developed by the US Environmental Protection Agency, we generate a metamodeling database of the expected environmental and biological concentrations of 644 organic chemicals released into nine US rivers from wastewater treatment works (WTWs) assuming multiple loading rates and sizes of populations serviced. The chemicals of interest have log n-octanol/water partition coefficients (logKOW) ranging from 3 to 14, and the rivers of concern have mean annual discharges ranging from 1.09 to 3240 m³/s. Log-linear regression models are derived to predict mean annual dissolved and total water concentrations and total sediment concentrations of chemicals of concern based on their logKOW, Henry's Law Constant, and WTW loading rate and on the mean annual discharges of the receiving rivers. Metamodels are also derived to predict mean annual chemical concentrations in fish, invertebrates, and periphyton. We corroborate a subset of these metamodels using field studies focused on brominated flame retardants and discuss their application for high throughput screening of exposures to human and ecological populations and for analysis and interpretation of field data. Published by Elsevier B.V.
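
    A minimal sketch of a log-linear regression metamodel of the kind described, with mean annual concentration regressed on chemical properties and loading/discharge terms. The synthetic training table and the coefficients of the stand-in response are illustrative only.

      # Fit a log-linear regression metamodel relating concentration to chemical and
      # river properties; the "true" response below stands in for the process model.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(5)

      n = 400
      log_kow = rng.uniform(3, 14, n)
      log_henry = rng.uniform(-12, -2, n)
      log_load = rng.uniform(-2, 2, n)          # log10 WTW loading rate
      log_q = rng.uniform(0, 3.5, n)            # log10 mean annual discharge (m3/s)
      X = np.column_stack([log_kow, log_henry, log_load, log_q])

      log_conc = (-3.0 + 0.2 * log_kow + 0.05 * log_henry + 1.0 * log_load
                  - 1.0 * log_q + rng.normal(0, 0.1, n))

      meta = LinearRegression().fit(X, log_conc)
      print("R^2 on the training design:", round(meta.score(X, log_conc), 3))
      print("predicted log10 concentration:", meta.predict([[6.0, -6.0, 0.5, 2.0]])[0])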

  10. Dynamic sensitivity analysis of long running landslide models through basis set expansion and meta-modelling

    Science.gov (United States)

    Rohmer, Jeremy

    2016-04-01

    Predicting the temporal evolution of landslides is typically supported by numerical modelling. Dynamic sensitivity analysis aims at assessing the influence of the landslide properties on the time-dependent predictions (e.g., time series of landslide displacements). Yet two major difficulties arise: 1. Global sensitivity analysis requires running the landslide model a high number of times (> 1000), which may become impracticable when the landslide model has a high computational cost (> several hours); 2. Landslide model outputs are not scalar, but functions of time, i.e. they are n-dimensional vectors with n usually ranging from 100 to 1000. In this article, I explore the use of a basis set expansion, such as principal component analysis, to reduce the output dimensionality to a few components, each of them being interpreted as a dominant mode of variation in the overall structure of the temporal evolution. The computationally intensive calculation of the Sobol' indices for each of these components is then achieved through meta-modelling, i.e. by replacing the landslide model by a "costless-to-evaluate" approximation (e.g., a projection pursuit regression model). The methodology combining "basis set expansion - meta-model - Sobol' indices" is then applied to the La Frasse landslide to investigate the dynamic sensitivity of the surface horizontal displacements to the slip surface properties during pore pressure changes. I show how to extract information on the sensitivity of each of the main modes of temporal behaviour using a limited number (a few tens) of long running simulations. In particular, I identify the parameters which trigger the occurrence of a turning point marking a shift between a regime of low landslide displacements and one of high values.
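
    A sketch of the basis-set-expansion step: time series of displacements are compressed with principal component analysis and a cheap surrogate is fitted to each retained component score. A gradient-boosting regressor stands in for the projection pursuit regression mentioned in the abstract, and the two-parameter toy simulator is invented.

      # Compress time-series outputs with PCA, then fit a surrogate per component score.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(6)
      t = np.linspace(0, 1, 200)                       # normalised time axis

      def landslide_model(params):
          # Toy displacement history: amplitude and onset controlled by two parameters.
          a, b = params
          return a * np.maximum(0.0, t - b) ** 1.5

      X = rng.uniform(0.0, 1.0, size=(60, 2))          # a few tens of long-running simulations
      Y = np.array([landslide_model(p) for p in X])    # shape (60, 200)

      pca = PCA(n_components=3).fit(Y)
      scores = pca.transform(Y)                        # dominant modes of temporal variation
      print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

      surrogates = [GradientBoostingRegressor().fit(X, scores[:, k]) for k in range(3)]

      # The cheap surrogates can now be sampled thousands of times (e.g. for Sobol'
      # indices) and mapped back to approximate displacement histories:
      X_new = rng.uniform(0.0, 1.0, size=(5, 2))
      scores_new = np.column_stack([s.predict(X_new) for s in surrogates])
      Y_new = pca.inverse_transform(scores_new)        # approximate time series, shape (5, 200)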

  11. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge. PMID:25918167
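
    The ensemble idea can be sketched as a second-level linear meta-model stacked on dense (ridge) and sparse (lasso) whole-genome predictors. The simulated genotypes and phenotype are placeholders; in the paper the meta-model also incorporates GWAMA risk scores, which are omitted here for brevity.

      # Stack ridge and lasso genomic predictors under a linear second-level model.
      import numpy as np
      from sklearn.ensemble import StackingRegressor
      from sklearn.linear_model import Ridge, Lasso, LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(7)

      n, p = 800, 2000
      G = rng.binomial(2, 0.3, size=(n, p)).astype(float)       # SNP genotypes (0/1/2)
      beta = np.zeros(p)
      beta[:50] = rng.normal(0, 0.1, 50)                        # a few causal variants
      pheno = G @ beta + rng.normal(0, 1.0, n)

      meta_model = StackingRegressor(
          estimators=[("ridge", Ridge(alpha=100.0)), ("lasso", Lasso(alpha=0.05))],
          final_estimator=LinearRegression(),
      )
      score = cross_val_score(meta_model, G, pheno, cv=5, scoring="r2")
      print("cross-validated R^2 of the meta-model:", np.round(score.mean(), 3))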

  12. A Shocking Surprise in Stephan's Quintet

    Science.gov (United States)

    2006-01-01

    This false-color composite image of the Stephan's Quintet galaxy cluster clearly shows one of the largest shock waves ever seen (green arc). The wave was produced by one galaxy falling toward another at speeds of more than one million miles per hour. The image is made up of data from NASA's Spitzer Space Telescope and a ground-based telescope in Spain. Four of the five galaxies in this picture are involved in a violent collision, which has already stripped most of the hydrogen gas from the interiors of the galaxies. The centers of the galaxies appear as bright yellow-pink knots inside a blue haze of stars, and the galaxy producing all the turmoil, NGC7318b, is the left of two small bright regions in the middle right of the image. One galaxy, the large spiral at the bottom left of the image, is a foreground object and is not associated with the cluster. The titanic shock wave, larger than our own Milky Way galaxy, was detected by the ground-based telescope using visible-light wavelengths. It consists of hot hydrogen gas. As NGC7318b collides with gas spread throughout the cluster, atoms of hydrogen are heated in the shock wave, producing the green glow. Spitzer pointed its infrared spectrograph at the peak of this shock wave (middle of green glow) to learn more about its inner workings. This instrument breaks light apart into its basic components. Data from the instrument are referred to as spectra and are displayed as curving lines that indicate the amount of light coming at each specific wavelength. The Spitzer spectrum showed a strong infrared signature for incredibly turbulent gas made up of hydrogen molecules. This gas is caused when atoms of hydrogen rapidly pair-up to form molecules in the wake of the shock wave. Molecular hydrogen, unlike atomic hydrogen, gives off most of its energy through vibrations that emit in the infrared. This highly disturbed gas is the most turbulent molecular hydrogen ever seen. Astronomers were surprised not only by the turbulence

  13. A Developed Meta-model for Selection of Cotton Fabrics Using Design of Experiments and TOPSIS Method

    Science.gov (United States)

    Chakraborty, Shankar; Chatterjee, Prasenjit

    2017-12-01

    Selection of cotton fabrics for providing optimal clothing comfort is often considered as a multi-criteria decision making problem consisting of an array of candidate alternatives to be evaluated based on several conflicting properties. In this paper, design of experiments and the technique for order preference by similarity to ideal solution (TOPSIS) are integrated so as to develop regression meta-models for identifying the most suitable cotton fabrics with respect to the computed TOPSIS scores. The applicability of the adopted method is demonstrated using two real-life examples. These developed models can also identify the statistically significant fabric properties and their interactions affecting the measured TOPSIS scores and final selection decisions. There exists a good degree of congruence between the ranking patterns derived using these meta-models and the existing methods for cotton fabric ranking and subsequent selection.
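
    A minimal TOPSIS scoring routine of the kind integrated with the designed experiments in the paper. The decision matrix, weights and benefit/cost directions are illustrative, not the authors' fabric data.

      # TOPSIS: normalise, weight, measure distances to ideal and anti-ideal solutions.
      import numpy as np

      # rows: candidate fabrics, columns: measured comfort-related properties
      decision = np.array([
          [25.0, 0.45, 120.0],
          [30.0, 0.40, 150.0],
          [22.0, 0.50, 110.0],
      ])
      weights = np.array([0.5, 0.3, 0.2])
      benefit = np.array([True, False, True])     # True = larger is better

      norm = decision / np.linalg.norm(decision, axis=0)      # vector normalisation
      v = norm * weights
      ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
      anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
      d_plus = np.linalg.norm(v - ideal, axis=1)
      d_minus = np.linalg.norm(v - anti, axis=1)
      closeness = d_minus / (d_plus + d_minus)                # TOPSIS score in [0, 1]
      print("TOPSIS scores:", np.round(closeness, 3),
            "best fabric:", int(np.argmax(closeness)))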

  14. Surprise: Dwarf Galaxy Harbors Supermassive Black Hole

    Science.gov (United States)

    2011-01-01

    The surprising discovery of a supermassive black hole in a small nearby galaxy has given astronomers a tantalizing look at how black holes and galaxies may have grown in the early history of the Universe. Finding a black hole a million times more massive than the Sun in a star-forming dwarf galaxy is a strong indication that supermassive black holes formed before the buildup of galaxies, the astronomers said. The galaxy, called Henize 2-10, 30 million light-years from Earth, has been studied for years, and is forming stars very rapidly. Irregularly shaped and about 3,000 light-years across (compared to 100,000 for our own Milky Way), it resembles what scientists think were some of the first galaxies to form in the early Universe. "This galaxy gives us important clues about a very early phase of galaxy evolution that has not been observed before," said Amy Reines, a Ph.D. candidate at the University of Virginia. Supermassive black holes lie at the cores of all "full-sized" galaxies. In the nearby Universe, there is a direct relationship -- a constant ratio -- between the masses of the black holes and that of the central "bulges" of the galaxies, leading them to conclude that the black holes and bulges affected each others' growth. Two years ago, an international team of astronomers found that black holes in young galaxies in the early Universe were more massive than this ratio would indicate. This, they said, was strong evidence that black holes developed before their surrounding galaxies. "Now, we have found a dwarf galaxy with no bulge at all, yet it has a supermassive black hole. This greatly strengthens the case for the black holes developing first, before the galaxy's bulge is formed," Reines said. Reines, along with Gregory Sivakoff and Kelsey Johnson of the University of Virginia and the National Radio Astronomy Observatory (NRAO), and Crystal Brogan of the NRAO, observed Henize 2-10 with the National Science Foundation's Very Large Array radio telescope and

  15. Old Star's "Rebirth" Gives Astronomers Surprises

    Science.gov (United States)

    2005-04-01

    Astronomers using the National Science Foundation's Very Large Array (VLA) radio telescope are taking advantage of a once-in-a-lifetime opportunity to watch an old star suddenly stir back into new activity after coming to the end of its normal life. Their surprising results have forced them to change their ideas of how such an old, white dwarf star can re-ignite its nuclear furnace for one final blast of energy. [Image caption: Radio/optical images of Sakurai's Object. The color image shows the nebula ejected thousands of years ago; contours indicate radio emission. The inset is a Hubble Space Telescope image showing just the central part of the region, with contours indicating radio emission. Credit: Hajduk et al., NRAO/AUI/NSF, ESO, StSci, NASA.] Computer simulations had predicted a series of events that would follow such a re-ignition of fusion reactions, but the star didn't follow the script -- events moved 100 times more quickly than the simulations predicted. "We've now produced a new theoretical model of how this process works, and the VLA observations have provided the first evidence supporting our new model," said Albert Zijlstra, of the University of Manchester in the United Kingdom. Zijlstra and his colleagues presented their findings in the April 8 issue of the journal Science. The astronomers studied a star known as V4334 Sgr, in the constellation Sagittarius. It is better known as "Sakurai's Object," after Japanese amateur astronomer Yukio Sakurai, who discovered it on February 20, 1996, when it suddenly burst into new brightness. At first, astronomers thought the outburst was a common nova explosion, but further study showed that Sakurai's Object was anything but common. The star is an old white dwarf that had run out of hydrogen fuel for nuclear fusion reactions in its core. Astronomers believe that some such stars can undergo a final burst of fusion in a shell of helium that surrounds a core of heavier nuclei such as carbon and oxygen. However, the

  16. Meta-modeling of the pesticide fate model MACRO for groundwater exposure assessments using artificial neural networks

    Science.gov (United States)

    Stenemo, Fredrik; Lindahl, Anna M. L.; Gärdenäs, Annemieke; Jarvis, Nicholas

    2007-08-01

    Several simple index methods that use easily accessible data have been developed and included in decision-support systems to estimate pesticide leaching across larger areas. However, these methods often lack important process descriptions (e.g. macropore flow), which brings into question their reliability. Descriptions of macropore flow have been included in simulation models, but these are too complex and demanding for spatial applications. To resolve this dilemma, a neural network simulation meta-model of the dual-permeability macropore flow model MACRO was created for pesticide groundwater exposure assessment. The model was parameterized using pedotransfer functions that require as input the clay and sand content of the topsoil and subsoil, and the topsoil organic carbon content. The meta-model also requires the topsoil pesticide half-life and the soil organic carbon sorption coefficient as input. A fully connected feed-forward multilayer perceptron classification network with two hidden layers, linked to fully connected feed-forward multilayer perceptron neural networks with one hidden layer, trained on sub-sets of the target variable, was shown to be a suitable meta-model for the intended purpose. A Fourier amplitude sensitivity test showed that the model output (the 80th percentile average yearly pesticide concentration at 1 m depth for a 20 year simulation period) was sensitive to all input parameters. The two input parameters related to pesticide characteristics (i.e. soil organic carbon sorption coefficient and topsoil pesticide half-life) were the most influential, but texture in the topsoil was also quite important since it was assumed to control the mass exchange coefficient that regulates the strength of macropore flow. This is in contrast to models based on the advection-dispersion equation where soil texture is relatively unimportant. The use of the meta-model is exemplified with a case-study where the spatial variability of pesticide leaching is
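
    A sketch of a neural-network metamodel in the spirit of the paper: a small feed-forward network trained to reproduce a leaching-model output from five inputs. The input ranges and the stand-in response are synthetic, and scikit-learn's MLPRegressor replaces whatever network library the authors used.

      # Train a small multilayer perceptron as a metamodel of a leaching simulation.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(8)

      n = 2000
      X = np.column_stack([
          rng.uniform(5, 60, n),      # topsoil clay content (%)
          rng.uniform(5, 60, n),      # subsoil clay content (%)
          rng.uniform(0.5, 5, n),     # topsoil organic carbon (%)
          rng.uniform(5, 200, n),     # pesticide half-life (days)
          rng.uniform(10, 500, n),    # sorption coefficient Koc (L/kg)
      ])
      # Hypothetical stand-in for the simulated 80th-percentile leachate concentration.
      y = 0.02 * X[:, 3] / (X[:, 4] * (1 + 0.05 * X[:, 2])) + 0.001 * X[:, 0]

      meta = make_pipeline(StandardScaler(),
                           MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000,
                                        random_state=0))
      meta.fit(X, y)
      print("metamodel prediction:", meta.predict([[30, 40, 2.0, 60, 100]])[0])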

  17. The Influence of Negative Surprise on Hedonic Adaptation

    Directory of Open Access Journals (Sweden)

    Ana Paula Kieling

    2016-01-01

    Full Text Available After some time using a product or service, the consumer tends to feel less pleasure with consumption. This reduction of pleasure is known as hedonic adaptation. One of the emotions that interferes in this process is surprise. Based on two experiments, we suggest that negative surprise – unlike positive surprise – influences the level of pleasure foreseen and experienced by the consumer. Study 1 analyzes the influence of negative (vs. positive) surprise on the consumer’s post-purchase hedonic adaptation expectation. Results showed that negative surprise influences the intensity of adaptation, augmenting its strength. Study 2 verifies the influence of negative (vs. positive) surprise on hedonic adaptation. The findings suggested that negative surprise makes adaptation happen more intensely and faster as time goes by, which has consequences for companies and consumers in the post-purchase process, such as satisfaction and loyalty.

  18. A Dichotomic Analysis of the Surprise Examination Paradox

    OpenAIRE

    Franceschi, Paul

    2002-01-01

    This paper presents a dichotomic analysis of the surprise examination paradox. In section 1, I analyse the notion of surprise in detail. In section 2, I introduce the distinction between a monist and a dichotomic analysis of the paradox. I also present there a dichotomy leading to a distinction between two basically and structurally different versions of the paradox, based respectively on a conjoint and a disjoint definition of surprise. In section 3, I describe the solution to SEP corresponding to...

  19. Constraints on the nuclear equation of state from nuclear masses and radii in a Thomas-Fermi meta-modeling approach

    Science.gov (United States)

    Chatterjee, D.; Gulminelli, F.; Raduta, Ad. R.; Margueron, J.

    2017-12-01

    The question of correlations among empirical equation of state (EoS) parameters constrained by nuclear observables is addressed in a Thomas-Fermi meta-modeling approach. A recently proposed meta-modeling for the nuclear EoS in nuclear matter is augmented with a single finite size term to produce a minimal unified EoS functional able to describe the smooth part of the nuclear ground state properties. This meta-model can reproduce the predictions of a large variety of models, and interpolate continuously between them. An analytical approximation to the full Thomas-Fermi integrals is further proposed giving a fully analytical meta-model for nuclear masses. The parameter space is sampled and filtered through the constraint of nuclear mass reproduction with Bayesian statistical tools. We show that this simple analytical meta-modeling has a predictive power on masses, radii, and skins comparable to full Hartree-Fock or extended Thomas-Fermi calculations with realistic energy functionals. The covariance analysis on the posterior distribution shows that no physical correlation is present between the different EoS parameters. Concerning nuclear observables, a strong correlation between the slope of the symmetry energy and the neutron skin is observed, in agreement with previous studies.

  20. Verifiable metamodels for nitrate losses to drains and groundwater in the Corn Belt, USA

    Science.gov (United States)

    Nolan, Bernard T.; Malone, Robert W.; Gronberg, Jo Ann M.; Thorp, K.R.; Ma, Liwang

    2012-01-01

    Nitrate leaching in the unsaturated zone poses a risk to groundwater, whereas nitrate in tile drainage is conveyed directly to streams. We developed metamodels (MMs) consisting of artificial neural networks to simplify and upscale mechanistic fate and transport models for prediction of nitrate losses by drains and leaching in the Corn Belt, USA. The two final MMs predicted nitrate concentration and flux, respectively, in the shallow subsurface. Because each MM considered both tile drainage and leaching, they represent an integrated approach to vulnerability assessment. The MMs used readily available data comprising farm fertilizer nitrogen (N), weather data, and soil properties as inputs; therefore, they were well suited for regional extrapolation. The MMs effectively related the outputs of the underlying mechanistic model (Root Zone Water Quality Model) to the inputs (R2 = 0.986 for the nitrate concentration MM). Predicted nitrate concentration was compared with measured nitrate in 38 samples of recently recharged groundwater, yielding a Pearson’s r of 0.466 (p = 0.003). Predicted nitrate generally was higher than that measured in groundwater, possibly as a result of the time-lag for modern recharge to reach well screens, denitrification in groundwater, or interception of recharge by tile drains. In a qualitative comparison, predicted nitrate concentration also compared favorably with results from a previous regression model that predicted total N in streams.

  1. Parameter identification and global sensitivity analysis of Xin'anjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng Song

    2013-01-01

    Full Text Available Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification, which can identify the dominant parameters, reduce the model calibration uncertainty, and enhance the model optimization efficiency. There are, however, some shortcomings in classical approaches, including the long run times and high computational cost required to quantitatively assess the sensitivity of a multiple-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and then ten parameters were selected to quantify the sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity, but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
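
    The quantitative step of the two-step framework can be sketched by fitting a quadratic response-surface metamodel to a modest number of model runs and then estimating first-order Sobol' indices by Monte Carlo on the cheap surface (pick-freeze estimator). The three-parameter toy function stands in for the Xin'anjiang model.

      # Response-surface metamodel plus Monte Carlo first-order Sobol' indices.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(9)
      d = 3

      def hydro_model(x):                       # hypothetical stand-in for the hydrological model
          return 4.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.2 * x[:, 2]

      # 1) Response-surface metamodel fitted on ~100 model runs.
      X_train = rng.uniform(0, 1, size=(100, d))
      rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      rsm.fit(X_train, hydro_model(X_train))

      # 2) First-order Sobol' indices on the surrogate (Saltelli pick-freeze estimator).
      N = 20000
      A, B = rng.uniform(0, 1, (N, d)), rng.uniform(0, 1, (N, d))
      fA, fB = rsm.predict(A), rsm.predict(B)
      var = np.var(np.concatenate([fA, fB]))
      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]
          Si = np.mean(fB * (rsm.predict(ABi) - fA)) / var
          print(f"first-order Sobol index S{i+1} ≈ {Si:.3f}")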

  2. The Value of Surprising Findings for Research on Marketing

    OpenAIRE

    JS Armstrong

    2004-01-01

    In the work of Armstrong (Journal of Business Research, 2002), I examined empirical research on the scientific process and related these to marketing science. The findings of some studies were surprising. In this reply, I address surprising findings and other issues raised by commentators.

  3. Corrugator Activity Confirms Immediate Negative Affect in Surprise

    Directory of Open Access Journals (Sweden)

    Sascha Topolinski

    2015-02-01

    Full Text Available The emotion of surprise entails a complex of immediate responses, such as cognitive interruption, attention allocation to, and more systematic processing of the surprising stimulus. All these processes serve the ultimate function of increasing processing depth and thus cognitively mastering the surprising stimulus. The present account introduces phasic negative affect as the underlying mechanism responsible for these consequences. Surprising stimuli are schema-discrepant and thus entail cognitive disfluency, which elicits immediate negative affect. This affect in turn works like a phasic cognitive tuning, switching the current processing mode from more automatic and heuristic to more systematic and reflective processing. Directly testing the initial elicitation of negative affect by surprising events, the present experiment presented high and low surprising neutral trivia statements to N = 28 participants while assessing their spontaneous facial expressions via facial electromyography. High compared to low surprising trivia elicited higher corrugator activity, indicative of negative affect and mental effort, while leaving zygomaticus (positive affect) and frontalis (cultural surprise expression) activity unaffected. Future research shall investigate the mediating role of negative affect in eliciting surprise-related outcomes.

  4. Polynomial meta-models with canonical low-rank approximations: Numerical insights and comparison to sparse polynomial chaos expansions

    Energy Technology Data Exchange (ETDEWEB)

    Konakli, Katerina, E-mail: konakli@ibk.baug.ethz.ch; Sudret, Bruno

    2016-09-15

    The growing need for uncertainty analysis of complex computational models has led to an expanding use of meta-models across engineering and sciences. The efficiency of meta-modeling techniques relies on their ability to provide statistically-equivalent analytical representations based on relatively few evaluations of the original model. Polynomial chaos expansions (PCE) have proven a powerful tool for developing meta-models in a wide range of applications; the key idea thereof is to expand the model response onto a basis made of multivariate polynomials obtained as tensor products of appropriate univariate polynomials. The classical PCE approach nevertheless faces the “curse of dimensionality”, namely the exponential increase of the basis size with increasing input dimension. To address this limitation, the sparse PCE technique has been proposed, in which the expansion is carried out on only a few relevant basis terms that are automatically selected by a suitable algorithm. An alternative for developing meta-models with polynomial functions in high-dimensional problems is offered by the newly emerged low-rank approximations (LRA) approach. By exploiting the tensor–product structure of the multivariate basis, LRA can provide polynomial representations in highly compressed formats. Through extensive numerical investigations, we herein first shed light on issues relating to the construction of canonical LRA with a particular greedy algorithm involving a sequential updating of the polynomial coefficients along separate dimensions. Specifically, we examine the selection of optimal rank, stopping criteria in the updating of the polynomial coefficients and error estimation. In the sequel, we confront canonical LRA to sparse PCE in structural-mechanics and heat-conduction applications based on finite-element solutions. Canonical LRA exhibit smaller errors than sparse PCE in cases when the number of available model evaluations is small with respect to the input

  5. Polynomial meta-models with canonical low-rank approximations: Numerical insights and comparison to sparse polynomial chaos expansions

    International Nuclear Information System (INIS)

    Konakli, Katerina; Sudret, Bruno

    2016-01-01

    The growing need for uncertainty analysis of complex computational models has led to an expanding use of meta-models across engineering and sciences. The efficiency of meta-modeling techniques relies on their ability to provide statistically-equivalent analytical representations based on relatively few evaluations of the original model. Polynomial chaos expansions (PCE) have proven a powerful tool for developing meta-models in a wide range of applications; the key idea thereof is to expand the model response onto a basis made of multivariate polynomials obtained as tensor products of appropriate univariate polynomials. The classical PCE approach nevertheless faces the “curse of dimensionality”, namely the exponential increase of the basis size with increasing input dimension. To address this limitation, the sparse PCE technique has been proposed, in which the expansion is carried out on only a few relevant basis terms that are automatically selected by a suitable algorithm. An alternative for developing meta-models with polynomial functions in high-dimensional problems is offered by the newly emerged low-rank approximations (LRA) approach. By exploiting the tensor–product structure of the multivariate basis, LRA can provide polynomial representations in highly compressed formats. Through extensive numerical investigations, we herein first shed light on issues relating to the construction of canonical LRA with a particular greedy algorithm involving a sequential updating of the polynomial coefficients along separate dimensions. Specifically, we examine the selection of optimal rank, stopping criteria in the updating of the polynomial coefficients and error estimation. In the sequel, we confront canonical LRA to sparse PCE in structural-mechanics and heat-conduction applications based on finite-element solutions. Canonical LRA exhibit smaller errors than sparse PCE in cases when the number of available model evaluations is small with respect to the input
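
    A bare-bones illustration of a (non-sparse) polynomial chaos expansion: for two independent standard-normal inputs, the response is expanded on probabilists' Hermite polynomials up to total degree 3 and the coefficients are fitted by least squares. The model function is an invented placeholder, and no adaptive basis selection or low-rank format is attempted.

      # Least-squares polynomial chaos expansion on a Hermite basis (Gaussian inputs).
      import itertools
      import numpy as np
      from numpy.polynomial.hermite_e import hermeval

      rng = np.random.default_rng(10)

      def model(x):                                    # stand-in for a finite-element solver
          return np.exp(0.3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

      def He(n, x):                                    # probabilists' Hermite polynomial He_n
          return hermeval(x, np.eye(n + 1)[n])

      dim, degree = 2, 3
      multi_indices = [a for a in itertools.product(range(degree + 1), repeat=dim)
                       if sum(a) <= degree]

      X = rng.standard_normal((200, dim))              # experimental design
      Psi = np.column_stack([np.prod([He(a[k], X[:, k]) for k in range(dim)], axis=0)
                             for a in multi_indices])
      coeffs, *_ = np.linalg.lstsq(Psi, model(X), rcond=None)

      # By orthonormality, the PCE mean is the coefficient of the constant term (0, 0).
      print("PCE mean estimate:", coeffs[multi_indices.index((0, 0))])
      print("number of basis terms:", len(multi_indices))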

  6. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During software development, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which gives rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.

  7. FLCNDEMF: An Event Metamodel for Flood Process Information Management under the Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Nengcheng Chen

    2015-06-01

    Full Text Available Significant economic losses, large affected populations, and serious environmental damage caused by recurrent natural disaster events (NDE) worldwide indicate insufficiency in emergency preparedness and response. The barrier of full life cycle data preparation and information support is one of the main reasons. This paper adopts the method of integrated environmental modeling, incorporates information from existing event protocols, languages, and models, analyzes observation demands from different event stages, and forms the abstract full life cycle natural disaster event metamodel (FLCNDEM) based on the meta-object facility. Then a task library and knowledge base for floods are built to instantiate FLCNDEM, forming the FLCNDEM for floods (FLCNDEMF). FLCNDEMF is formalized according to the Event Pattern Markup Language, and a prototype system, Natural Disaster Event Manager, is developed to assist in template-based modeling and management. The flood in Liangzi (LZ) Lake of Hubei, China, on 16 July 2010 is adopted to illustrate how to apply FLCNDEM in real scenarios. FLCNDEM-based modeling is realized, and candidate remote sensing (RS) datasets for different observing missions are provided for the LZ Lake flood. Taking the mission of flood area extraction as an example, the appropriate RS data are selected via the simplified general perturbations version 4 model, and the flood areas in different phases are calculated and displayed on the map. The phase-based modeling and visualization intuitively display the spatial-temporal distribution and the evolution process of the LZ Lake flood, and this is of great significance for flood response. In addition, through the extension mechanism, FLCNDEM can also be applied in other environmental applications, providing important support for full life cycle information sharing and rapid response.
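
    An illustrative (and entirely invented) data structure in the spirit of the full-life-cycle event metamodel: an event carries phases, each phase lists observing missions, and each mission records its candidate remote-sensing datasets. The class and field names are not taken from FLCNDEM; the dataset strings are generic placeholders.

      # Hypothetical instance structure for a full-life-cycle disaster event model.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class ObservingMission:
          name: str                               # e.g. "flood area extraction"
          candidate_datasets: List[str] = field(default_factory=list)

      @dataclass
      class EventPhase:
          name: str                               # e.g. "preparedness", "response", "recovery"
          missions: List[ObservingMission] = field(default_factory=list)

      @dataclass
      class DisasterEvent:
          event_type: str
          location: str
          phases: List[EventPhase] = field(default_factory=list)

      lz_flood = DisasterEvent(
          event_type="flood",
          location="Liangzi Lake, Hubei, China",
          phases=[EventPhase("response",
                             [ObservingMission("flood area extraction",
                                               ["optical-scene-1", "radar-scene-2"])])],
      )
      print(lz_flood.phases[0].missions[0].candidate_datasets)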

  8. Optimization model of a system of crude oil distillation units with heat integration and metamodeling

    International Nuclear Information System (INIS)

    Lopez, Diana C; Mahecha, Cesar A; Hoyos, Luis J; Acevedo, Leonardo; Villamizar Jaime F

    2010-01-01

    The process of crude distillation has a considerable impact on the economy of any refinery. Therefore, it is necessary to improve it by taking advantage of the available infrastructure, generating products that conform to the specifications without violating the equipment operating constraints or plant restrictions at the industrial units. The objective of this paper is to present the development of an optimization model for a Crude Distillation Unit (CDU) system at an ECOPETROL S.A. refinery in Barrancabermeja, involving the typical restrictions (flows according to pipeline capacity, pumps, distillation columns, etc.) and a restriction that has not been included in previously reported models of this type: the heat integration of streams from Atmospheric Distillation Towers (ADTs) and Vacuum Distillation Towers (VDTs) with the heat exchanger networks for crude pre-heating. On the other hand, the ADTs were modeled with metamodels as functions of column temperatures and pressures, pumparound flows and return temperatures, stripping steam flows, Jet EBP ASTM D-86 and Diesel EBP ASTM D-86. Pre-heating trains were modeled with mass and energy balances and the design equation of each heat exchanger. The optimization model is an NLP that maximizes the system profit. This model was implemented in GAMSide 22.2 using the CONOPT solver, and it found new operating points with better economic results than those obtained with normal operation of the real plants. It predicted optimum operating conditions of three ADTs for constant-composition crude and calculated the yields and properties of atmospheric products, in addition to temperatures and duties of 27 crude oil exchangers.
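
    A highly simplified sketch of the optimisation structure: product yields come from a metamodel (here a made-up linear surrogate of an ADT), and a small NLP maximises profit subject to bounds and a yield constraint. All prices, coefficients and limits are invented, and scipy is used in place of the GAMS/CONOPT implementation.

      # Surrogate-based profit maximisation with simple operating bounds and constraints.
      import numpy as np
      from scipy.optimize import minimize

      prices = np.array([60.0, 95.0, 90.0, 55.0])      # crude, naphtha, jet, diesel ($/bbl)

      def yields(x):
          # Metamodel stand-in: product fractions as a function of furnace outlet
          # temperature (x[0], degC) and stripping-steam rate (x[1]).
          t, s = x
          return np.array([0.15 + 0.0004 * (t - 340),
                           0.20 + 0.0002 * (t - 340) + 0.01 * s,
                           0.30 - 0.0001 * (t - 340)])

      def neg_profit(x, feed=100.0):
          revenue = feed * yields(x) @ prices[1:]
          return -(revenue - feed * prices[0])

      res = minimize(neg_profit, x0=[345.0, 1.0], method="SLSQP",
                     bounds=[(335.0, 360.0), (0.5, 2.0)],
                     constraints=[{"type": "ineq",
                                   "fun": lambda x: 0.70 - yields(x).sum()}])  # total yield <= 70%
      print("optimal temperature/steam:", np.round(res.x, 2), "profit:", round(-res.fun, 1))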

  9. Managing Uncertainty: Soviet Views on Deception, Surprise, and Control

    National Research Council Canada - National Science Library

    Hull, Andrew

    1989-01-01

    .... In the first two cases (deception and surprise), the emphasis is on how the Soviets seek to sow uncertainty in the minds of the enemy and how the Soviets then plan to use that uncertainty to gain military advantage...

  10. Dividend announcements reconsidered: Dividend changes versus dividend surprises

    OpenAIRE

    Andres, Christian; Betzer, André; van den Bongard, Inga; Haesner, Christian; Theissen, Erik

    2012-01-01

    This paper reconsiders the issue of share price reactions to dividend announcements. Previous papers rely almost exclusively on a naive dividend model in which the dividend change is used as a proxy for the dividend surprise. We use the difference between the actual dividend and the analyst consensus forecast as obtained from I/B/E/S as a proxy for the dividend surprise. Using data from Germany, we find significant share price reactions after dividend announcements. Once we control for analys...

  11. The Surprise Examination Paradox and the Second Incompleteness Theorem

    OpenAIRE

    Kritchman, Shira; Raz, Ran

    2010-01-01

    We give a new proof for Gödel's second incompleteness theorem, based on Kolmogorov complexity, Chaitin's incompleteness theorem, and an argument that resembles the surprise examination paradox. We then go the other way around and suggest that the second incompleteness theorem gives a possible resolution of the surprise examination paradox. Roughly speaking, we argue that the flaw in the derivation of the paradox is that it contains a hidden assumption that one can prove the consistency of the...

  12. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments

    Science.gov (United States)

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations allows the error between the simulated and the real system to be minimized. PMID:26926691

  13. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.

    Science.gov (United States)

    Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations allows the error between the simulated and the real system to be minimized.

  14. A META-MODELLING SERVICE PARADIGM FOR CLOUD COMPUTING AND ITS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    F. Cheng

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Service integrators seek opportunities to align the way they manage resources in the service supply chain. Many business organisations can operate new, more flexible business processes that harness the value of a service approach from the customer’s perspective. As a relatively new concept, cloud computing and related technologies have rapidly gained momentum in the IT world. This article seeks to shed light on service supply chain issues associated with cloud computing by examining several interrelated questions: service supply chain architecture from a service perspective; the basic clouds of the service supply chain; managerial insights into these clouds; and the commercial value of implementing cloud computing. In particular, to show how those services can be used and involved in their utilisation processes, a hypothetical meta-modelling service for cloud computing is proposed. Moreover, the paper defines the managed cloud architecture for a service vendor or service integrator in the cloud computing infrastructure in the service supply chain: IT services, business services, and business processes, which create atomic and composite software services that are used to perform business processes with business service choreographies.


  15. An efficient community detection algorithm using greedy surprise maximization

    International Nuclear Information System (INIS)

    Jiang, Yawen; Jia, Caiyan; Yu, Jian

    2014-01-01

    Community detection is an important and crucial problem in complex network analysis. Although classical modularity function optimization approaches are widely used for identifying communities, the modularity function (Q) suffers from its resolution limit. Recently, the surprise function (S) was experimentally proved to be better than the Q function. However, up until now, there has been no algorithm available to perform searches to directly determine the maximal surprise values. In this paper, considering the superiority of the S function over the Q function, we propose an efficient community detection algorithm called AGSO (algorithm based on greedy surprise optimization) and its improved version FAGSO (fast-AGSO), which are based on greedy surprise optimization and do not suffer from the resolution limit. In addition, (F)AGSO does not need the number of communities K to be specified in advance. Tests on experimental networks show that (F)AGSO is able to detect optimal partitions in both simple and even more complex networks. Moreover, algorithms based on surprise maximization perform better than those algorithms based on modularity maximization, including Blondel–Guillaume–Lambiotte–Lefebvre (BGLL), Clauset–Newman–Moore (CNM) and the other state-of-the-art algorithms such as Infomap, order statistics local optimization method (OSLOM) and label propagation algorithm (LPA). (paper)
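
    For readers who want to see what is being maximized, the sketch below computes the surprise S of a given partition as the negative log of a cumulative hypergeometric tail, the standard definition from the community-detection literature. It assumes networkx and scipy are available and is not the (F)AGSO implementation; a greedy optimizer would repeatedly merge or move nodes so as to increase this value.

```python
import networkx as nx
from scipy.stats import hypergeom

def surprise(G, communities):
    """Surprise of a partition: minus the log-probability of drawing at least
    the observed number of intra-community links when links are placed at
    random among all node pairs (cumulative hypergeometric tail)."""
    n = G.number_of_nodes()
    L = G.number_of_edges()
    F = n * (n - 1) // 2                                       # possible links
    M = sum(len(c) * (len(c) - 1) // 2 for c in communities)   # possible intra links
    node2comm = {v: i for i, c in enumerate(communities) for v in c}
    l = sum(1 for u, v in G.edges() if node2comm[u] == node2comm[v])  # intra links
    return -hypergeom(F, M, L).logsf(l - 1)                    # natural-log surprise

# Toy usage on two cliques joined by a single edge (illustrative only).
G = nx.barbell_graph(5, 0)
partition = [set(range(5)), set(range(5, 10))]
print("surprise of the two-clique partition:", surprise(G, partition))
```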

  16. Providing the meta-model of development of competency using the meta-ethnography approach: Part 2. Synthesis of the available competency development models

    Directory of Open Access Journals (Sweden)

    Shahram Yazdani

    2016-12-01

    Full Text Available Background and Purpose: Considering the importance and necessity of competency-based education at a global level, and with respect to globalization and the requirement of minimum competencies in medical fields, medical education communities and organizations worldwide have tried to determine competencies and to present frameworks and education models that ensure the ability of all graduates. In the literature, we observed numerous competency development models that refer to the same issues with different terminologies. It seems that evaluation and synthesis of all these models can finally result in designing a comprehensive meta-model for competency development. Methods: Meta-ethnography is a useful method for the synthesis of qualitative research that is used to develop models that interpret the results of several studies. Considering that the aim of this study is to ultimately provide a competency development meta-model, in the previous section of the study a literature review was conducted to identify competency development models. Models obtained through the search were studied in detail, and the key concepts of the models and overarching concepts were extracted; in this section, the models' concepts were reciprocally translated and the available competency development models were synthesized. Results: The competency development meta-model is presented, together with a redefinition of the Dreyfus brothers' model. Conclusions: Given the importance of competency-based education at a global level and the need to review curricula and competency-based curriculum design, a competency development meta-model is required as a basis for curriculum development. As a variety of competency development models are available, this study attempted to use them in developing the curriculum. Keywords: Meta-ethnography, Competency development, Meta-model, Qualitative synthesis

  17. Surprise and Memory as Indices of Concrete Operational Development

    Science.gov (United States)

    Achenbach, Thomas M.

    1973-01-01

    Normal and retarded children's use of color, number, length and continuous quantity as attributes of identification was assessed by presenting them with contrived changes in three properties. Surprise and correct memory responses for color preceded those to number, which preceded logical verbal responses to a conventional number-conservation task.…

  18. Effects of surprisal and locality on Danish sentence processing

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Kizach, Johannes

    2017-01-01

    An eye-tracking experiment in Danish investigates two dominant accounts of sentence processing: locality-based theories that predict a processing advantage for sentences where the distance between the major syntactic heads is minimized, and the surprisal theory which predicts that processing time...

  19. Surprisal analysis and probability matrices for rotational energy transfer

    International Nuclear Information System (INIS)

    Levine, R.D.; Bernstein, R.B.; Kahana, P.; Procaccia, I.; Upchurch, E.T.

    1976-01-01

    The information-theoretic approach is applied to the analysis of state-to-state rotational energy transfer cross sections. The rotational surprisal is evaluated in the usual way, in terms of the deviance of the cross sections from their reference ("prior") values. The surprisal is found to be an essentially linear function of the energy transferred. This behavior accounts for the experimentally observed exponential gap law for the hydrogen halide systems. The data base here analyzed (taken from the literature) is largely computational in origin: quantal calculations for the hydrogenic systems H₂ + H, He, Li⁺; HD + He; D₂ + H and for the N₂ + Ar system; and classical trajectory results for H₂ + Li⁺; D₂ + Li⁺ and N₂ + Ar. The surprisal analysis not only serves to compact a large body of data but also aids in the interpretation of the results. A single surprisal parameter θ_R suffices to account for the (relative) magnitude of all state-to-state inelastic cross sections at a given energy
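
    In the standard information-theoretic notation (a background sketch of the relations the abstract refers to, with conventions assumed rather than taken from the paper), the rotational surprisal and the linear-surprisal form behind the exponential gap law read:

```latex
% Rotational surprisal relative to the prior cross section (background sketch;
% conventions assumed, not copied from the paper):
\[
I(j \to j') = -\ln\!\left[\frac{\sigma(j \to j')}{\sigma^{0}(j \to j')}\right],
\qquad
I \simeq \lambda_{0} + \theta_{R}\,\lvert\Delta E_{R}\rvert
\quad\Longrightarrow\quad
\sigma(j \to j') \propto \sigma^{0}(j \to j')\,
e^{-\theta_{R}\lvert\Delta E_{R}\rvert}.
\]
```

    Here σ⁰ is the prior cross section, |ΔE_R| is the transferred rotational energy (usually expressed in reduced units), and θ_R is the single surprisal parameter mentioned in the abstract.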

  20. Things may not be as expected: Surprising findings when updating ...

    African Journals Online (AJOL)

    2015-05-14

    May 14, 2015 ... Things may not be as expected: Surprising findings when updating .... (done at the end of three months after the first review month) ..... Allen G. Getting beyond form filling: The role of institutional governance in human research ...

  1. Automation surprise: results of a field survey of Dutch pilots

    NARCIS (Netherlands)

    de Boer, R.J.; Hurts, Karel

    2017-01-01

    Automation surprise (AS) has often been associated with aviation safety incidents. Although numerous laboratory studies have been conducted, few data are available from routine flight operations. A survey among a representative sample of 200 Dutch airline pilots was used to determine the prevalence

  2. DNA Barcoding the Geometrid Fauna of Bavaria (Lepidoptera): Successes, Surprises, and Questions

    Science.gov (United States)

    Hausmann, Axel; Haszprunar, Gerhard; Hebert, Paul D. N.

    2011-01-01

    Background The State of Bavaria is involved in a research program that will lead to the construction of a DNA barcode library for all animal species within its territorial boundaries. The present study provides a comprehensive DNA barcode library for the Geometridae, one of the most diverse of insect families. Methodology/Principal Findings This study reports DNA barcodes for 400 Bavarian geometrid species, 98 per cent of the known fauna, and approximately one per cent of all Bavarian animal species. Although 98.5% of these species possess diagnostic barcode sequences in Bavaria, records from neighbouring countries suggest that species-level resolution may be compromised in up to 3.5% of cases. All taxa which apparently share barcodes are discussed in detail. One case of modest divergence (1.4%) revealed a species overlooked by the current taxonomic system: Eupithecia goossensiata Mabille, 1869 stat.n. is raised from synonymy with Eupithecia absinthiata (Clerck, 1759) to species rank. Deep intraspecific sequence divergences (>2%) were detected in 20 traditionally recognized species. Conclusions/Significance The study emphasizes the effectiveness of DNA barcoding as a tool for monitoring biodiversity. Open access is provided to a data set that includes records for 1,395 geometrid specimens (331 species) from Bavaria, with 69 additional species from neighbouring regions. Taxa with deep intraspecific sequence divergences are undergoing more detailed analysis to ascertain if they represent cases of cryptic diversity. PMID:21423340

  3. Sleeping beauties in theoretical physics: 26 surprising insights

    CERN Document Server

    Padmanabhan, Thanu

    2015-01-01

    This book addresses a fascinating set of questions in theoretical physics which will both entertain and enlighten all students, teachers and researchers and other physics aficionados. These range from Newtonian mechanics to quantum field theory and cover several puzzling issues that do not appear in standard textbooks. Some topics cover conceptual conundrums, the solutions to which lead to surprising insights; some correct popular misconceptions in the textbook discussion of certain topics; others illustrate deep connections between apparently unconnected domains of theoretical physics; and a few provide remarkably simple derivations of results which are not often appreciated. The connoisseur of theoretical physics will enjoy a feast of pleasant surprises skilfully prepared by an internationally acclaimed theoretical physicist. Each topic is introduced with proper background discussion and special effort is taken to make the discussion self-contained, clear and comprehensible to anyone with an undergraduate e...

  4. The June surprises: balls, strikes, and the fog of war.

    Science.gov (United States)

    Fried, Charles

    2013-04-01

    At first, few constitutional experts took seriously the argument that the Patient Protection and Affordable Care Act exceeded Congress's power under the commerce clause. The highly political opinions of two federal district judges of no particular distinction - carefully chosen by the challenging plaintiffs - did not shake the confidence that the act was constitutional. This disdain for the challengers' arguments was only confirmed when the act was upheld by two highly respected conservative court of appeals judges in two separate circuits. But after the hostile, even mocking questioning of the government's advocate in the Supreme Court by the five Republican-appointed justices, the expectation was that the act would indeed be struck down on that ground. So it came as no surprise when the five opined that the act did indeed exceed Congress's commerce clause power. But it came as a great surprise when Chief Justice John Roberts, joined by the four Democrat-appointed justices, ruled that the act could be sustained as an exercise of Congress's taxing power - a ground urged by the government almost as an afterthought. It was further surprising, even shocking, that Justices Antonin Scalia, Anthony Kennedy, Clarence Thomas, and Samuel Alito not only wrote a joint opinion on the commerce clause virtually identical to that of their chief, but that in writing it they did not refer to or even acknowledge his opinion. Finally surprising was the fact that Justices Ruth Bader Ginsburg and Stephen Breyer joined the chief in holding that aspects of the act's Medicaid expansion were unconstitutional. This essay ponders and tries to unravel some of these puzzles.

  5. WORMS IN SURPRISING PLACES: CLINICAL AND MORPHOLOGICAL FEATURES

    Directory of Open Access Journals (Sweden)

    Myroshnychenko MS

    2013-06-01

    Full Text Available Helminth infections are the most common human diseases and are characterized by involvement of all organs and systems in the pathological process. In this article, the authors discuss a few cases of typical and atypical localizations of parasitic worms such as filariae and pinworms, which were recovered from surprising places in the bodies of patients in the Kharkiv region. This article will help doctors in practical health care pay special attention to the timely prevention and diagnosis of this pathology.

  6. Influence of partially known parameter on flaw characterization in Eddy Current Testing by using a random walk MCMC method based on metamodeling

    International Nuclear Information System (INIS)

    Cai, Caifang; Lambert, Marc; Rodet, Thomas

    2014-01-01

    First, we present the implementation of a random walk Metropolis-within-Gibbs (MWG) sampling method for flaw characterization based on a metamodeling method. The role of metamodeling is to reduce the computational cost of the Eddy Current Testing (ECT) forward model calculation. In this way, the use of Markov Chain Monte Carlo (MCMC) methods becomes possible. Second, we analyze the influence of partially known parameters in Bayesian estimation. The objective is to evaluate the importance of providing more specific prior information. Simulation results show that even partially known information is of great interest for providing more accurate flaw parameter estimates. The improvement ratio depends on the parameter dependence, and the benefit appears only when the provided information is specific enough.
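
    A minimal sketch of the random walk Metropolis-within-Gibbs idea is given below, with a cheap closed-form surrogate standing in for the metamodel of the ECT forward model. The two-parameter toy flaw model, the noise level, and the flat prior are invented for illustration and are not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Cheap surrogate standing in for an expensive forward model (toy example;
# in the paper a metamodel trained on forward-model runs plays this role).
def surrogate(theta):
    depth, length = theta
    return np.array([depth + 0.5 * length, depth * length])

# Synthetic data from a "true" flaw, with Gaussian noise.
theta_true = np.array([1.0, 2.0])
y_obs = surrogate(theta_true) + rng.normal(0, 0.05, size=2)

def log_post(theta):
    if np.any(theta <= 0) or np.any(theta > 5):       # flat prior on (0, 5]^2
        return -np.inf
    r = y_obs - surrogate(theta)
    return -0.5 * np.sum(r**2) / 0.05**2

# Random-walk Metropolis-within-Gibbs: update one component at a time.
theta = np.array([0.5, 0.5])
lp = log_post(theta)
samples = []
for it in range(5000):
    for j in range(theta.size):
        prop = theta.copy()
        prop[j] += rng.normal(0, 0.1)                 # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
            theta, lp = prop, lp_prop
    samples.append(theta.copy())

print("posterior mean:", np.mean(samples[1000:], axis=0))
```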

  7. On the surprising rigidity of the Pauli exclusion principle

    International Nuclear Information System (INIS)

    Greenberg, O.W.

    1989-01-01

    I review recent attempts to construct a local quantum field theory of small violations of the Pauli exclusion principle and suggest a qualitative reason for the surprising rigidity of the Pauli principle. I suggest that small violations can occur in our four-dimensional world as a consequence of the compactification of a higher-dimensional theory in which the exclusion principle is exactly valid. I briefly mention a recent experiment which places a severe limit on possible violations of the exclusion principle. (orig.)

  8. Teacher Supply and Demand: Surprises from Primary Research

    Directory of Open Access Journals (Sweden)

    Andrew J. Wayne

    2000-09-01

    Full Text Available An investigation of primary research studies on public school teacher supply and demand revealed four surprises. Projections show that enrollments are leveling off. Relatedly, annual hiring increases should be only about two or three percent over the next few years. Results from studies of teacher attrition also yield unexpected results. Excluding retirements, only about one in 20 teachers leaves each year, and the novice teachers who quit mainly cite personal and family reasons, not job dissatisfaction. Each of these findings broadens policy makers' options for teacher supply.

  9. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has been previously made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then, by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this
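
    The two initial designs compared in the abstract can be generated in a few lines; the sketch below builds random-point and midpoint Latin hypercube samples and scores them with a simple maximin (minimum pairwise distance) criterion. The subsequent OLHS optimization step, the Monte Carlo propagation, and the NIPCE meta-models are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def lhs(n_samples, n_dims, midpoint=False):
    """Latin hypercube sample in [0, 1]^d.
    midpoint=False: a random point inside each selected interval (random LHS);
    midpoint=True : the centre of each selected interval (midpoint LHS)."""
    u = (0.5 * np.ones((n_samples, n_dims)) if midpoint
         else rng.uniform(size=(n_samples, n_dims)))
    # one permutation of the strata per dimension guarantees the LHS property
    strata = np.stack([rng.permutation(n_samples) for _ in range(n_dims)], axis=1)
    return (strata + u) / n_samples

def min_distance(x):
    """Maximin space-filling criterion: smallest pairwise distance in the design."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return d[np.triu_indices_from(d, k=1)].min()

for label, mid in [("random LHS", False), ("midpoint LHS", True)]:
    design = lhs(20, 2, midpoint=mid)
    print(label, "min pairwise distance:", round(min_distance(design), 4))
```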

  10. Metamodeling as a tool to size vegetative filter strips for surface runoff pollution control in European watersheds.

    Science.gov (United States)

    Lauvernet, Claire; Muñoz-Carpena, Rafael; Carluer, Nadia

    2015-04-01

    influence and interactions, and set priorities for data collection and management. Based on the GSA results, we compared several mathematical methods to compute the metamodel, and then validated it on an agricultural watershed with real data in the North-West of France. The analysis procedure yields a robust and validated metamodel, before extending it to other climatic conditions in order to make application to a large range of European watersheds possible. The tool will allow comparison of field scenarios, and validation and improvement of existing placements and VFS sizing.

  11. Estimations of expectedness and potential surprise in possibility theory

    Science.gov (United States)

    Prade, Henri; Yager, Ronald R.

    1992-01-01

    This note investigates how various ideas of 'expectedness' can be captured in the framework of possibility theory. Particularly, we are interested in trying to introduce estimates of the kind of lack of surprise expressed by people when saying 'I would not be surprised that...' before an event takes place, or by saying 'I knew it' after its realization. In possibility theory, a possibility distribution is supposed to model the relative levels of mutually exclusive alternatives in a set, or equivalently, the alternatives are assumed to be rank-ordered according to their level of possibility to take place. Four basic set-functions associated with a possibility distribution, including standard possibility and necessity measures, are discussed from the point of view of what they estimate when applied to potential events. Extensions of these estimates based on the notions of Q-projection or OWA operators are proposed when only significant parts of the possibility distribution are retained in the evaluation. The case of partially-known possibility distributions is also considered. Some potential applications are outlined.
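
    For reference, the two standard set functions mentioned in the abstract can be written as follows (background notation only; the paper's Q-projection and OWA-based extensions are not reproduced):

```latex
% Possibility and necessity of an event A for a possibility distribution \pi
% (standard definitions; the paper's specific estimates are not reproduced):
\[
\Pi(A) = \sup_{x \in A} \pi(x),
\qquad
N(A) = 1 - \Pi(\bar{A}) = \inf_{x \notin A} \bigl(1 - \pi(x)\bigr).
\]
```

    On this reading, "I would not be surprised that A" corresponds to a high possibility Π(A), "I knew it" to a high necessity N(A), and the potential surprise of A may be taken as 1 − Π(A).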

  12. Physics Nobel prize 2004: Surprising theory wins physics Nobel

    CERN Multimedia

    2004-01-01

    From left to right: David Politzer, David Gross and Frank Wilczek. For their understanding of counter-intuitive aspects of the strong force, which governs quarks inside protons and neutrons, on 5 October three American physicists were awarded the 2004 Nobel Prize in Physics. David J. Gross (Kavli Institute of Theoretical Physics, University of California, Santa Barbara), H. David Politzer (California Institute of Technology), and Frank Wilczek (Massachusetts Institute of Technology) made a key theoretical discovery with a surprising result: the closer quarks are together, the weaker the force - opposite to what is seen with electromagnetism and gravity. Rather, the strong force is analogous to a rubber band stretching, where the force increases as the quarks get farther apart. These physicists discovered this property of quarks, known as asymptotic freedom, in 1973. It later became a key part of the theory of quantum chromodynamics (QCD) and the Standard Model, the current best theory to describe the interac...

  13. Surprises in the suddenly-expanded infinite well

    International Nuclear Information System (INIS)

    Aslangul, Claude

    2008-01-01

    I study the time evolution of a particle prepared in the ground state of an infinite well after the latter is suddenly expanded. It turns out that the probability density |Ψ(x, t)|² shows quite a surprising behaviour: at definite times, plateaux appear on which |Ψ(x, t)|² is constant over finite intervals of x. Elements of a theoretical explanation are given by analysing the singular component of the second derivative ∂²Ψ(x, t)/∂x². Analytical closed expressions are obtained for some specific times, which easily allow us to show that, at these times, the density organizes itself into regular patterns provided the size of the box is large enough; moreover, above some critical size depending on the specific time, the density patterns are independent of the expansion parameter. It is seen how the density at these times simply results from a construction game with definite rules acting on the pieces of the initial density.
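
    The standard textbook setup behind this problem is a sudden expansion of the well from width a to width b > a at t = 0, with the ground state of the narrow well expanded on the eigenbasis of the wide well (notation assumed, not copied from the paper):

```latex
% Sudden expansion of the well from width a to width b > a at t = 0:
% the initial ground state of the narrow well is expanded on the eigenbasis
% of the wide well (standard textbook treatment; notation assumed).
\[
\Psi(x,t) = \sum_{n \ge 1} c_{n}\,\sqrt{\tfrac{2}{b}}\,\sin\!\frac{n\pi x}{b}\,
e^{-iE_{n}t/\hbar},
\qquad
E_{n} = \frac{n^{2}\pi^{2}\hbar^{2}}{2mb^{2}},
\qquad
c_{n} = \int_{0}^{a}\!\sqrt{\tfrac{2}{a}}\,\sin\!\frac{\pi x}{a}\;
\sqrt{\tfrac{2}{b}}\,\sin\!\frac{n\pi x}{b}\,dx .
\]
```

    The plateaux discussed above arise at particular times commensurate with the revival time 4mb²/(πħ) (fractional revivals), where the phase factors partially realign.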

  14. Hepatobiliary fascioliasis in non-endemic zones: a surprise diagnosis.

    Science.gov (United States)

    Jha, Ashish Kumar; Goenka, Mahesh Kumar; Goenka, Usha; Chakrabarti, Amrita

    2013-03-01

    Fascioliasis is a zoonotic infection caused by Fasciola hepatica. Because of population migration and international food trade, human fascioliasis is being an increasingly recognised entity in nonendemic zones. In most parts of Asia, hepatobiliary fascioliasis is sporadic. Human hepatobiliary infection by this trematode has two distinct phases: an acute hepatic phase and a chronic biliary phase. Hepatobiliary infection is mostly associated with intense peripheral eosinophilia. In addition to classically defined hepatic phase and biliary phase fascioliasis, some cases may have an overlap of these two phases. Chronic liver abscess formation is a rare presentation. We describe a surprise case of hepatobiliary fascioliasis who presented to us with liver abscess without intense peripheral eosinophilia, a rare presentation of human fascioliasis especially in non-endemic zones. Copyright © 2013 Arab Journal of Gastroenterology. Published by Elsevier Ltd. All rights reserved.

  15. The Value of Change: Surprises and Insights in Stellar Evolution

    Science.gov (United States)

    Bildsten, Lars

    2018-01-01

    Astronomers with large-format cameras regularly scan the sky many times per night to detect what's changing, and telescopes in space such as Kepler and, soon, TESS obtain very accurate brightness measurements of nearly a million stars over time periods of years. These capabilities, in conjunction with theoretical and computational efforts, have yielded surprises and remarkable new insights into the internal properties of stars and how they end their lives. I will show how asteroseismology reveals the properties of the deep interiors of red giants, and highlight how astrophysical transients may be revealing unusual thermonuclear outcomes from exploding white dwarfs and the births of highly magnetic neutron stars. All the while, stellar science has been accelerated by the availability of open source tools, such as Modules for Experiments in Stellar Astrophysics (MESA), and the nearly immediate availability of observational results.

  16. Exploring the concept of climate surprises. A review of the literature on the concept of surprise and how it is related to climate change

    International Nuclear Information System (INIS)

    Glantz, M.H.; Moore, C.M.; Streets, D.G.; Bhatti, N.; Rosa, C.H.

    1998-01-01

    This report examines the concept of climate surprise and its implications for environmental policymaking. Although most integrated assessment models of climate change deal with average values of change, it is usually the extreme events or surprises that cause the most damage to human health and property. Current models do not help the policymaker decide how to deal with climate surprises. This report examines the literature of surprise in many aspects of human society: psychology, military, health care, humor, agriculture, etc. It draws together various ways to consider the concept of surprise and examines different taxonomies of surprise that have been proposed. In many ways, surprise is revealed to be a subjective concept, triggered by such factors as prior experience, belief system, and level of education. How policymakers have reacted to specific instances of climate change or climate surprise in the past is considered, particularly with regard to the choices they made between proactive and reactive measures. Finally, the report discusses techniques used in the current generation of assessment models and makes suggestions as to how climate surprises might be included in future models. The report concludes that some kinds of surprises are simply unpredictable, but there are several types that could in some way be anticipated and assessed, and their negative effects forestalled

  17. Exploring the concept of climate surprises. A review of the literature on the concept of surprise and how it is related to climate change

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, M.H.; Moore, C.M. [National Center for Atmospheric Research, Boulder, CO (United States); Streets, D.G.; Bhatti, N.; Rosa, C.H. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.; Stewart, T.R. [State Univ. of New York, Albany, NY (United States)

    1998-01-01

    This report examines the concept of climate surprise and its implications for environmental policymaking. Although most integrated assessment models of climate change deal with average values of change, it is usually the extreme events or surprises that cause the most damage to human health and property. Current models do not help the policymaker decide how to deal with climate surprises. This report examines the literature of surprise in many aspects of human society: psychology, military, health care, humor, agriculture, etc. It draws together various ways to consider the concept of surprise and examines different taxonomies of surprise that have been proposed. In many ways, surprise is revealed to be a subjective concept, triggered by such factors as prior experience, belief system, and level of education. How policymakers have reacted to specific instances of climate change or climate surprise in the past is considered, particularly with regard to the choices they made between proactive and reactive measures. Finally, the report discusses techniques used in the current generation of assessment models and makes suggestions as to how climate surprises might be included in future models. The report concludes that some kinds of surprises are simply unpredictable, but there are several types that could in some way be anticipated and assessed, and their negative effects forestalled.

  18. Mono and multi-objective optimization techniques applied to a large range of industrial test cases using Metamodel assisted Evolutionary Algorithms

    Science.gov (United States)

    Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre

    2010-06-01

    The use of material processing numerical simulation allows a strategy of trial and error to improve virtual processes without incurring material costs or interrupting production, and therefore saves a lot of money, but it requires user time to analyze the results, adjust the operating conditions and restart the simulation. Automatic optimization is the perfect complement to simulation. An Evolutionary Algorithm coupled with metamodelling makes it possible to obtain industrially relevant results on a very large range of applications within a few tens of simulations and without any specific knowledge of automatic optimization techniques. Ten industrial partners have been selected to cover the different areas of the mechanical forging industry and provide different examples of the forming simulation tools. The large computational time is handled by a metamodel approach, which interpolates the objective function over the entire parameter space while only knowing the exact function values at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel and a genetic algorithm combined with a Meshless Finite Difference Method. The latter approach is extended to multi-objective optimization. The set of solutions, which corresponds to the best possible compromises between the different objectives, is then computed in the same way. The population-based approach allows using the parallel capabilities of the utilized computer with high efficiency. An optimization module, fully embedded within the Forge2009 IHM, makes it possible to cover all the defined examples, and the use of new multi-core hardware to compute several simulations at the same time reduces the needed time dramatically. The presented examples
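
    The core loop of a metamodel-assisted evolutionary algorithm can be sketched in a few lines: fit a Kriging model to the exactly evaluated "master points", pre-screen offspring on the cheap model, and spend exact evaluations only on the most promising candidate. The toy objective, mutation scheme, and budgets below are invented, and the paper's multi-objective extension and Meshless Finite Difference variant are not reproduced.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)

# Stand-in for an expensive forging simulation (toy objective, minimisation).
def expensive(x):
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum()

dim, n_init = 3, 10
X = rng.uniform(0, 1, size=(n_init, dim))           # initial "master points"
y = np.array([expensive(x) for x in X])

for gen in range(15):
    # Kriging metamodel fitted to all exactly evaluated points so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    # Evolution-strategy step: mutate the current best point into offspring.
    parent = X[np.argmin(y)]
    offspring = np.clip(parent + rng.normal(0, 0.1, size=(50, dim)), 0, 1)
    # Pre-screen offspring on the cheap metamodel; evaluate only the best exactly.
    best = offspring[np.argmin(gp.predict(offspring))]
    X = np.vstack([X, best])
    y = np.append(y, expensive(best))

print("best objective after", len(y), "exact evaluations:", y.min())
```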

  19. Atom Surprise: Using Theatre in Primary Science Education

    Science.gov (United States)

    Peleg, Ran; Baram-Tsabari, Ayelet

    2011-10-01

    Early exposure to science may have a lifelong effect on children's attitudes towards science and their motivation to learn science in later life. Out-of-class environments can play a significant role in creating favourable attitudes, while contributing to conceptual learning. Educational science theatre is one form of an out-of-class environment, which has received little research attention. This study aims to describe affective and cognitive learning outcomes of watching such a play and to point to connections between theatrical elements and specific outcomes. "Atom Surprise" is a play portraying several concepts on the topic of matter. A mixed methods approach was adopted to investigate the knowledge and attitudes of children (grades 1-6) from two different school settings who watched the play. Data were gathered using questionnaires and in-depth interviews. Analysis suggested that in both schools children's knowledge on the topic of matter increased after the play with younger children gaining more conceptual knowledge than their older peers. In the public school girls showed greater gains in conceptual knowledge than boys. No significant changes in students' general attitudes towards science were found, however, students demonstrated positive changes towards science learning. Theatrical elements that seemed to be important in children's recollection of the play were the narrative, props and stage effects, and characters. In the children's memory, science was intertwined with the theatrical elements. Nonetheless, children could distinguish well between scientific facts and the fictive narrative.

  20. X-rays from comets - a surprising discovery

    CERN Document Server

    CERN. Geneva

    2000-01-01

    Comets are kilometre-size aggregates of ice and dust, which remained from the formation of the solar system. It was not obvious to expect X-ray emission from such objects. Nevertheless, when comet Hyakutake (C/1996 B2) was observed with the ROSAT X-ray satellite during its close approach to Earth in March 1996, bright X-ray emission from this comet was discovered. This finding triggered a search in archival ROSAT data for comets, which might have accidentally crossed the field of view during observations of unrelated targets. To increase the surprise even more, X-ray emission was detected from four additional comets, which were optically 300 to 30 000 times fainter than Hyakutake. For one of them, comet Arai (C/1991 A2), X-ray emission was even found in data which were taken six weeks before the comet was optically discovered. These findings showed that comets represent a new class of celestial X-ray sources. The subsequent detection of X-ray emission from several other comets in dedicated observations confir...

  1. Hierarchical cluster-based partial least squares regression (HC-PLSR) is an efficient tool for metamodelling of nonlinear dynamic models.

    Science.gov (United States)

    Tøndel, Kristin; Indahl, Ulf G; Gjuvsland, Arne B; Vik, Jon Olav; Hunter, Peter; Omholt, Stig W; Martens, Harald

    2011-06-01

    Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of the mouse ventricular myocyte function. Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback loops. HC-PLSR is a promising approach for
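
    The hierarchical idea can be illustrated with off-the-shelf tools, as in the sketch below: a global PLS model supplies a score space, observations are clustered there, and one local PLS model is fitted per cluster. Plain K-means is used as a crisp stand-in for the fuzzy C-means step of HC-PLSR, and the toy data set is invented; this is not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Toy nonlinear input-output map standing in for a dynamic-model emulation task.
X = rng.uniform(-1, 1, size=(600, 5))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=600)
X_tr, X_te, y_tr, y_te = X[:500], X[500:], y[:500], y[500:]

# 1) Global PLS model; its score space is used to split the data.
global_pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
scores_tr = global_pls.transform(X_tr)

# 2) Cluster in score space (crisp K-means instead of fuzzy C-means, to keep
#    the sketch dependency-free).
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scores_tr)

# 3) One local PLS model per cluster.
local = {c: PLSRegression(n_components=3).fit(X_tr[km.labels_ == c],
                                              y_tr[km.labels_ == c])
         for c in range(4)}

# 4) Predict: route each test point to its cluster's local model.
labels_te = km.predict(global_pls.transform(X_te))
y_hat = np.array([local[c].predict(x.reshape(1, -1)).ravel()[0]
                  for c, x in zip(labels_te, X_te)])
print("RMSE (hierarchical):", np.sqrt(np.mean((y_hat - y_te) ** 2)))
```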

  2. Hierarchical Cluster-based Partial Least Squares Regression (HC-PLSR) is an efficient tool for metamodelling of nonlinear dynamic models

    Directory of Open Access Journals (Sweden)

    Omholt Stig W

    2011-06-01

    Full Text Available Abstract Background Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of the mouse ventricular myocyte function. Results Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback

  3. Sensitivity analysis and metamodeling of a toolchain of models to help sizing vegetative filter strips in a watershed.

    Science.gov (United States)

    Lauvernet, Claire; Noll, Dorothea; Muñoz-Carpena, Rafael; Carluer, Nadia

    2014-05-01

    agricultural field and the VFS characteristics. These scenarios are based on: 2 types of climates (North and South-west of France), different rainfall intensities and durations, different lengths and slopes of hillslope, different humidity conditions, 4 soil types (silt loam, sandy loam, clay loam, sandy clay loam), 2 crops (wheat and corn) for the contributive area, 2 water table depths (1m and 2.5m) and 4 soil types for the VFS. The sizing method was applied to all these scenarios, and a sensitivity analysis of the optimal VFS length was performed for all the input parameters in order to understand their influence and to identify those for which special care has to be taken. Based on that sensitivity analysis, a metamodel has been developed. The idea is to simplify the whole toolchain and to make it possible to perform the buffer sizing by using a single tool and a smaller set of parameters, given the information available from the end users. We first compared several mathematical methods to compute the metamodel, and then validated them on an agricultural watershed with real data in the North-West of France.

  4. Beyond surprise: A longitudinal study on the experience of visual-tactual incongruities in products

    NARCIS (Netherlands)

    Ludden, G.D.S.; Schifferstein, H.N.J.; Hekkert, P.

    2012-01-01

    When people encounter products with visual-tactual incongruities, they are likely to be surprised because the product feels different than expected. In this paper, we investigate (1) the relationship between surprise and the overall liking of the products, (2) the emotions associated with surprise,

  5. Surprising Incentive: An Instrument for Promoting Safety Performance of Construction Employees

    Directory of Open Access Journals (Sweden)

    Fakhradin Ghasemi

    2015-09-01

    Conclusion: The results of this study showed that the surprising incentive improved the employees' safety performance only in the short term, because the surprising value of the incentives dwindles over time. For this reason, and to maintain the surprising value of the incentive system, the amount and types of incentives need to be evaluated and modified annually or biannually.

  6. The Role of Surprise in Game-Based Learning for Mathematics

    NARCIS (Netherlands)

    Wouters, Pieter; van Oostendorp, Herre; ter Vrugte, Judith; Vandercruysse, Sylke; de Jong, Anthonius J.M.; Elen, Jan; De Gloria, Alessandro; Veltkamp, Remco

    2016-01-01

    In this paper we investigate the potential of surprise on learning with prevocational students in the domain of proportional reasoning. Surprise involves an emotional reaction, but it also serves a cognitive goal as it directs attention to explain why the surprising event occurred and to learn for

  7. Human amygdala response to dynamic facial expressions of positive and negative surprise.

    Science.gov (United States)

    Vrticka, Pascal; Lordier, Lara; Bediou, Benoît; Sander, David

    2014-02-01

    Although brain imaging evidence accumulates to suggest that the amygdala plays a key role in the processing of novel stimuli, little is known about its role in processing expressed novelty conveyed by surprised faces, and even less about possible interactive encoding of novelty and valence. Those investigations that have already probed human amygdala involvement in the processing of surprised facial expressions either used static pictures displaying negative surprise (as contained in fear) or "neutral" surprise, and manipulated valence by contextually priming or subjectively associating static surprise with either negative or positive information. Therefore, it still remains unresolved how the human amygdala differentially processes dynamic surprised facial expressions displaying either positive or negative surprise. Here, we created new artificial dynamic 3-dimensional facial expressions conveying surprise with an intrinsic positive (wonderment) or negative (fear) connotation, but also intrinsic positive (joy) or negative (anxiety) emotions not containing any surprise, in addition to neutral facial displays either containing ("typical surprise" expression) or not containing ("neutral") surprise. Results showed heightened amygdala activity to faces containing positive (vs. negative) surprise, which may either correspond to a specific wonderment effect as such, or to the computation of a negative expected value prediction error. Findings are discussed in the light of data obtained from a closely matched nonsocial lottery task, which revealed overlapping activity within the left amygdala to unexpected positive outcomes. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  8. Metamodel-based design optimization of injection molding process variables and gates of an automotive glove box for enhancing its quality

    International Nuclear Information System (INIS)

    Kang, Gyung Ju; Park, Chang Hyun; Choi, Dong Hoon

    2016-01-01

    Injection molding process variables and gates of an automotive glove box were optimally determined to enhance its injection molding quality. We minimized warpage while satisfying constraints on clamp force, weldline, and the profiles of filling and packing. Design variables concerning the injection molding process are the temperatures of the mold and the resin, ram speeds, and packing pressures and durations; design variables concerning the gates are the shape of the center gate and the locations of the two side gates. To optimally determine the design variables in an efficient way, we adopted metamodel-based design optimization, sequentially using an optimal Latin hypercube design as the design of experiments, Kriging models as metamodels that replace time-consuming injection molding simulations, and a micro genetic algorithm as the optimization algorithm. In the optimization process, the commercial injection molding analysis software Moldflow™ was employed to evaluate the injection molding quality at the specified design points. Using the proposed design approach, the warpage was found to be reduced by 20.5% compared to the initial warpage, while all the design constraints were satisfied, which clearly shows the validity of the proposed design approach.
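
    The workflow in the abstract (design of experiments, Kriging metamodel, genetic algorithm) can be sketched generically as below. The toy "warpage" function stands in for a Moldflow run, the Latin hypercube is not optimality-improved, and the micro genetic algorithm is reduced to selection plus mutation; all numbers are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(3)

# Toy stand-in for a warpage simulation: two process variables
# (mold temperature, packing pressure), both scaled to [0, 1]. Invented function.
def warpage(x):
    return (x[0] - 0.6) ** 2 + 0.5 * (x[1] - 0.4) ** 2 + 0.05 * np.cos(8 * x[0])

# 1) Design of experiments: a (non-optimized) Latin hypercube of 30 runs.
n, d = 30, 2
doe = (np.stack([rng.permutation(n) for _ in range(d)], axis=1)
       + rng.uniform(size=(n, d))) / n
responses = np.array([warpage(x) for x in doe])

# 2) Kriging metamodel replacing the expensive simulation.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(doe, responses)

# 3) Tiny genetic algorithm searching the metamodel (selection + mutation only).
pop = rng.uniform(size=(40, d))
for gen in range(60):
    fitness = gp.predict(pop)
    parents = pop[np.argsort(fitness)[:10]]              # keep the 10 best
    pop = np.clip(parents[rng.integers(0, 10, 40)]
                  + rng.normal(0, 0.05, size=(40, d)), 0, 1)

best = pop[np.argmin(gp.predict(pop))]
print("predicted optimum design:", best, "metamodel warpage:", gp.predict(best[None])[0])
```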

  9. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined ... We contribute a proven approach to construct flexible SPrLs and show its practical application in the German V-Modell XT; the approach is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], which was published as an original research article in the Journal of Systems and Software.

  10. Metamodel-based design optimization of injection molding process variables and gates of an automotive glove box for enhancing its quality

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Gyung Ju [Pusan National University, Busan (Korea, Republic of); Park, Chang Hyun; Choi, Dong Hoon [Hanyang University, Seoul (Korea, Republic of)

    2016-04-15

    Injection molding process variables and gates of an automotive glove box were optimally determined to enhance its injection molding quality. We minimized warpage while satisfying constraints on clamp force, weldline, and the profiles of filling and packing. Design variables concerning the injection molding process are the temperatures of the mold and the resin, ram speeds, and packing pressures and durations; design variables concerning the gates are the shape of the center gate and the locations of the two side gates. To optimally determine the design variables in an efficient way, we adopted metamodel-based design optimization, sequentially using an optimal Latin hypercube design as the design of experiments, Kriging models as metamodels that replace time-consuming injection molding simulations, and a micro genetic algorithm as the optimization algorithm. In the optimization process, the commercial injection molding analysis software Moldflow™ was employed to evaluate the injection molding quality at the specified design points. Using the proposed design approach, the warpage was found to be reduced by 20.5% compared to the initial warpage, while all the design constraints were satisfied, which clearly shows the validity of the proposed design approach.

  11. Stars Form Surprisingly Close to Milky Way's Black Hole

    Science.gov (United States)

    2005-10-01

    The supermassive black hole at the center of the Milky Way has surprisingly helped spawn a new generation of stars, according to observations from NASA's Chandra X-ray Observatory. This novel mode of star formation may solve several mysteries about the supermassive black holes that reside at the centers of nearly all galaxies. "Massive black holes are usually known for violence and destruction," said Sergei Nayakshin of the University of Leicester, United Kingdom, and coauthor of a paper on this research in an upcoming issue of the Monthly Notices of the Royal Astronomical Society. "So it's remarkable that this black hole helped create new stars, not just destroy them." Black holes have earned their fearsome reputation because any material -- including stars -- that falls within the so-called event horizon is never seen again. However, these new results indicate that the immense disks of gas known to orbit many black holes at a "safe" distance from the event horizon can help nurture the formation of new stars. This conclusion came from new clues that could only be revealed in X-rays. Until the latest Chandra results, astronomers have disagreed about the origin of a mysterious group of massive stars discovered by infrared astronomers to be orbiting less than a light year from the Milky Way's central black hole, a.k.a. Sagittarius A*, or Sgr A*. At such close distances to Sgr A*, the standard model for star formation predicts that gas clouds from which stars form should have been ripped apart by tidal forces from the black hole. Two models to explain this puzzle have been proposed. In the disk model, the gravity of a dense disk of gas around Sgr A* offsets the tidal forces and allows stars to form; in the migration model, the stars formed in a star cluster far away from the black hole and migrated in to form the ring of massive stars. The migration scenario predicts about a

  12. Emotional Intelligence and Successful Leadership.

    Science.gov (United States)

    Maulding, Wanda S.

    Cognitive intelligence is often equated with eventual success in many areas. However, there are many instances where people of high IQ flounder whereas those of modest IQ do surprisingly well. Author and renowned psychologist Daniel Goleman believes that the explanation for this fact lies in abilities called "emotional intelligence,"…

  13. Reflection, A Meta-Model for Learning, and a Proposal To Improve the Quality of University Teaching = Reflexion, el meta-modelo del aprendizaje, y la propuesta del mejoramiento de la calidad de la docencia.

    Science.gov (United States)

    Montgomery, Joel R.

    This paper, in both English and Spanish, offers a meta-model of the learning process which focuses on the importance of the reflective learning process in enhancing the quality of learning in higher education. This form of innovative learning is offered as a means of helping learners to realize the relevance of what they are learning to their life…

  14. Carbon Dioxide: Surprising Effects on Decision Making and Neurocognitive Performance

    Science.gov (United States)

    James, John T.

    2013-01-01

    The occupants of modern submarines and the International Space Station (ISS) have much in common as far as their air quality is concerned. Air is polluted by materials offgassing, use of utility compounds, leaks of systems chemicals, and anthropogenic sources. The primary anthropogenic compound of concern to submariners and astronauts has been carbon dioxide (CO2). NASA and the US Navy rely on the National Research Council Committee on Toxicology (NRC-COT) to help formulate exposure levels to CO2 that are thought to be safe for exposures of 3-6 months. NASA calls its limits Spacecraft Maximum Allowable Concentrations (SMACs). Years of experience aboard the ISS and a recent publication on deficits in decision making in ground-based subjects exposed briefly to 0.25% CO2 suggest that exposure levels that have been presumed acceptable to preserve health and performance need to be reevaluated. The current CO2 exposure limits for 3-6 months set by NASA and the UK Navy are 0.7%, and the limit for US submariners is 0.5%, although the NRC-COT recommended a 90-day level of 0.8% as safe a few years ago. NASA has set a 1000-day SMAC at 0.5% for exploration-class missions. Anecdotal experience with ISS operations approaching the current 180-day SMAC of 0.7% suggests that this limit is too high. Temporarily, NASA has limited exposures to 0.5% until further peer-reviewed data become available. In the meantime, a study published last year in the journal Environmental Health Perspectives (Satish U, et al. 2012) demonstrated that complex decision-making performance is somewhat affected at 0.1% CO2 and becomes "dysfunctional" for at least half of the 9 indices of performance at concentrations approaching 0.25% CO2. The investigators used the Strategic Management Simulation (SMS) method of testing for decision-making ability, and the results were so surprising to the investigators that they declared that their findings need to be independently confirmed. NASA has responded to the

  15. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable Seismic History is still only a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes any kind of reliable probabilistic statement about narrowly localized seismic hazard premature. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proven erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used in analytically tractable models and computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can lead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention. The self-evident shortcomings and failures of GSHAP call on all earthquake scientists and engineers to urgently revise the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a

  16. Surprisingly high substrate specificities observed in complex biofilms

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Kindaichi, Tomonori; Kragelund, Caroline

    The behavior of microorganisms in natural ecosystems (e.g. biofilms) differs significantly from laboratory studies. In nature, microorganisms experience alternating periods of surplus nutrients, nutrient limitation, and starvation. Literature data suggest that to survive and compete successfully..., microorganisms can regulate their metabolism, expressing a wide range of uptake and catabolic systems. However, ecophysiological studies of natural biofilms indicate that bacteria are very specialized in their choice of substrate, so even minor changes in substrate composition can affect the community composition... by selection for different specialized species. We hypothesized that bacteria growing in a natural environment express strongly conserved substrate specificity that is independent of short-term (few hours) variations in growth conditions. In this study, biofilm from Aalborg wastewater treatment plant was used...

  17. The Surprising Impact of Seat Location on Student Performance

    Science.gov (United States)

    Perkins, Katherine K.; Wieman, Carl E.

    2005-01-01

    Every physics instructor knows that the most engaged and successful students tend to sit at the front of the class and the weakest students tend to sit at the back. However, it is normally assumed that this is merely an indication of the respective seat location preferences of weaker and stronger students. Here we present evidence suggesting that in fact this may be mixing up the cause and effect. It may be that the seat selection itself contributes to whether the student does well or poorly, rather than the other way around. While a number of studies have looked at the effect of seat location on students, the results are often inconclusive, and few, if any, have studied the effects in college classrooms with randomly assigned seats. In this paper, we report on our observations of a large introductory physics course in which we randomly assigned students to particular seat locations at the beginning of the semester. Seat location during the first half of the semester had a noticeable impact on student success in the course, particularly in the top and bottom parts of the grade distribution. Students sitting in the back of the room for the first half of the term were nearly six times as likely to receive an F as students who started in the front of the room. A corresponding but less dramatic reversal was evident in the fractions of students receiving As. These effects were in spite of many unusual efforts to engage students at the back of the class and a front-to-back reversal of seat location halfway through the term. These results suggest there may be inherent detrimental effects of large physics lecture halls that need to be further explored.

  18. Chandra Finds Surprising Black Hole Activity In Galaxy Cluster

    Science.gov (United States)

    2002-09-01

    Scientists at the Carnegie Observatories in Pasadena, California, have uncovered six times the expected number of active, supermassive black holes in a single viewing of a cluster of galaxies, a finding that has profound implications for theories as to how old galaxies fuel the growth of their central black holes. The finding suggests that voracious, central black holes might be as common in old, red galaxies as they are in younger, blue galaxies, a surprise to many astronomers. The team made this discovery with NASA's Chandra X-ray Observatory. They also used Carnegie's 6.5-meter Walter Baade Telescope at the Las Campanas Observatory in Chile for follow-up optical observations. "This changes our view of galaxy clusters as the retirement homes for old and quiet black holes," said Dr. Paul Martini, lead author on a paper describing the results that appears in the September 10 issue of The Astrophysical Journal Letters. "The question now is, how do these black holes produce bright X-ray sources, similar to what we see from much younger galaxies?" Typical of the black hole phenomenon, the cores of these active galaxies are luminous in X-ray radiation. Yet, they are obscured, and thus essentially undetectable in the radio, infrared and optical wavebands. "X rays can penetrate obscuring gas and dust as easily as they penetrate the soft tissue of the human body to look for broken bones," said co-author Dr. Dan Kelson. "So, with Chandra, we can peer through the dust and we have found that even ancient galaxies with 10-billion-year-old stars can have central black holes still actively pulling in copious amounts of interstellar gas. This activity has simply been hidden from us all this time. This means these galaxies aren't over the hill after all and our theories need to be revised." Scientists say that supermassive black holes -- having the mass of millions to billions of suns squeezed into a region about the size of our Solar System -- are the engines in the cores of

  19. A Neural Mechanism for Surprise-related Interruptions of Visuospatial Working Memory.

    Science.gov (United States)

    Wessel, Jan R

    2018-01-01

    Surprising perceptual events recruit a fronto-basal ganglia mechanism for inhibition, which suppresses motor activity following surprise. A recent study found that this inhibitory mechanism also disrupts the maintenance of verbal working memory (WM) after surprising tones. However, it is unclear whether this same mechanism also relates to surprise-related interruptions of non-verbal WM. We tested this hypothesis using a change-detection task, in which surprising tones impaired visuospatial WM. Participants also performed a stop-signal task (SST). We used independent component analysis and single-trial scalp-electroencephalogram to test whether the same inhibitory mechanism that reflects motor inhibition in the SST relates to surprise-related visuospatial WM decrements, as was the case for verbal WM. As expected, surprising tones elicited activity of the inhibitory mechanism, and this activity correlated strongly with the trial-by-trial level of surprise. However, unlike for verbal WM, the activity of this mechanism was unrelated to visuospatial WM accuracy. Instead, inhibition-independent activity that immediately succeeded the inhibitory mechanism was increased when visuospatial WM was disrupted. This shows that surprise-related interruptions of visuospatial WM are not effected by the same inhibitory mechanism that interrupts verbal WM, and instead provides evidence for a 2-stage model of distraction. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Data for developing metamodels to assess the fate, transport, and bioaccumulation of organic chemicals in rivers. Chemicals have log Kow ranging from 3 to 14, and rivers have mean annual discharges ranging from 1.09 to 3240 m3/s.

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset was developed to demonstrate how metamodels of high resolution, process-based models that simulate the fate, transport, and bioaccumulation of organic...

  1. Distinct medial temporal networks encode surprise during motivation by reward versus punishment

    Science.gov (United States)

    Murty, Vishnu P.; LaBar, Kevin S.; Adcock, R. Alison

    2016-01-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also impacts neural and mnemonic encoding of surprising events. During functional magnetic resonance imaging (fMRI), participants encountered perceptually unexpected events either during the pursuit of rewards or avoidance of punishments. Despite similar levels of motivation across groups, reward and punishment facilitated the processing of surprising events in different medial temporal lobe regions. Whereas during reward motivation, perceptual surprises enhanced activation in the hippocampus, during punishment motivation surprises instead enhanced activation in parahippocampal cortex. Further, we found that reward motivation facilitated hippocampal coupling with ventromedial PFC, whereas punishment motivation facilitated parahippocampal cortical coupling with orbitofrontal cortex. Behaviorally, post-scan testing revealed that reward, but not punishment, motivation resulted in greater memory selectivity for surprising events encountered during goal pursuit. Together these findings demonstrate that neuromodulatory systems engaged by anticipation of reward and punishment target separate components of the medial temporal lobe, modulating medial temporal lobe sensitivity and connectivity. Thus, reward and punishment motivation yield distinct neural contexts for learning, with distinct consequences for how surprises are incorporated into predictive mnemonic models of the environment. PMID:26854903

  2. Distinct medial temporal networks encode surprise during motivation by reward versus punishment.

    Science.gov (United States)

    Murty, Vishnu P; LaBar, Kevin S; Adcock, R Alison

    2016-10-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also impacts neural and mnemonic encoding of surprising events. During functional magnetic resonance imaging (fMRI), participants encountered perceptually unexpected events either during the pursuit of rewards or avoidance of punishments. Despite similar levels of motivation across groups, reward and punishment facilitated the processing of surprising events in different medial temporal lobe regions. Whereas during reward motivation, perceptual surprises enhanced activation in the hippocampus, during punishment motivation surprises instead enhanced activation in parahippocampal cortex. Further, we found that reward motivation facilitated hippocampal coupling with ventromedial PFC, whereas punishment motivation facilitated parahippocampal cortical coupling with orbitofrontal cortex. Behaviorally, post-scan testing revealed that reward, but not punishment, motivation resulted in greater memory selectivity for surprising events encountered during goal pursuit. Together these findings demonstrate that neuromodulatory systems engaged by anticipation of reward and punishment target separate components of the medial temporal lobe, modulating medial temporal lobe sensitivity and connectivity. Thus, reward and punishment motivation yield distinct neural contexts for learning, with distinct consequences for how surprises are incorporated into predictive mnemonic models of the environment. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Beyond interests and institutions: US health policy reform and the surprising silence of big business.

    Science.gov (United States)

    Smyrl, Marc E

    2014-02-01

    Interest-based arguments do not provide satisfying explanations for the surprising reticence of major US employers to take a more active role in the debate surrounding the 2010 Patient Protection and Affordable Care Act (ACA). Through focused comparison with the Bismarckian systems of France and Germany, on the one hand, and with the 1950s and 1960s in the United States, on the other, this article concludes that while institutional elements do account for some of the observed behavior of big business, a necessary complement to this is a fuller understanding of the historically determined legitimating ideology of US firms. From the era of the "corporate commonwealth," US business inherited the principles of private welfare provision and of resistance to any expansion of government control. Once complementary, these principles are now mutually exclusive: employer-provided health insurance increasingly is possible only at the cost of ever-increasing government subsidy and regulation. Paralyzed by the uncertainty that followed from this clash of legitimate ideas, major employers found themselves unable to take a coherent and unified stand for or against the law. As a consequence, they failed either to oppose it successfully or to secure modifications to it that would have been useful to them.

  4. A Contrast-Based Computational Model of Surprise and Its Applications.

    Science.gov (United States)

    Macedo, Luis; Cardoso, Amílcar

    2017-11-19

    We review our work on a contrast-based computational model of surprise and its applications. The review is contextualized within related research from psychology, philosophy, and particularly artificial intelligence. Influenced by psychological theories of surprise, the model assumes that surprise-eliciting events initiate a series of cognitive processes that begin with the appraisal of the event as unexpected, continue with the interruption of ongoing activity and the focusing of attention on the unexpected event, and culminate in the analysis and evaluation of the event and the revision of beliefs. It is assumed that the intensity of surprise elicited by an event is a nonlinear function of the difference or contrast between the subjective probability of the event and that of the most probable alternative event (which is usually the expected event); and that the agent's behavior is partly controlled by actual and anticipated surprise. We describe applications of artificial agents that incorporate the proposed surprise model in three domains: the exploration of unknown environments, creativity, and intelligent transportation systems. These applications demonstrate the importance of surprise for decision making, active learning, creative reasoning, and selective attention. Copyright © 2017 Cognitive Science Society, Inc.
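
    A minimal sketch of the contrast idea described in this record: surprise as a nonlinear function of the difference between the probability of the observed event and that of the most probable alternative. The log-based mapping and the toy distribution below are illustrative assumptions, not the exact function or data of the published model.

```python
import math

def surprise(p_event, probabilities):
    """Contrast-based surprise: compares the observed event's probability
    with the highest probability among the possible events."""
    p_expected = max(probabilities)            # most probable (i.e. expected) event
    contrast = max(0.0, p_expected - p_event)  # zero when the expected event occurs
    return math.log2(1.0 + contrast)           # nonlinear mapping onto [0, 1]

# Example: an agent expects outcome "A" (p = 0.7) but observes "C" (p = 0.1)
dist = {"A": 0.7, "B": 0.2, "C": 0.1}
print(surprise(dist["C"], dist.values()))  # high surprise, about 0.68
print(surprise(dist["A"], dist.values()))  # zero surprise
```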

  5. A Statistical Analysis of the Relationship between Harmonic Surprise and Preference in Popular Music.

    Science.gov (United States)

    Miles, Scott A; Rosen, David S; Grzywacz, Norberto M

    2017-01-01

    Studies have shown that some musical pieces may preferentially activate reward centers in the brain. Less is known, however, about the structural aspects of music that are associated with this activation. Based on the music cognition literature, we propose two hypotheses for why some musical pieces are preferred over others. The first, the Absolute-Surprise Hypothesis, states that unexpected events in music directly lead to pleasure. The second, the Contrastive-Surprise Hypothesis, proposes that the juxtaposition of unexpected events and subsequent expected events leads to an overall rewarding response. We tested these hypotheses within the framework of information theory, using the measure of "surprise." This information-theoretic variable mathematically describes how improbable an event is given a known distribution. We performed a statistical investigation of surprise in the harmonic structure of songs within a representative corpus of Western popular music, namely, the McGill Billboard Project corpus. We found that chords of songs in the top quartile of the Billboard chart showed greater average surprise than those in the bottom quartile. We also found that the different sections within top-quartile songs varied more in their average surprise than the sections within bottom-quartile songs. The results of this study are consistent with both the Absolute- and Contrastive-Surprise Hypotheses. Although these hypotheses seem contradictory to one another, we cannot yet discard the possibility that both absolute and contrastive types of surprise play roles in the enjoyment of popular music. We call this possibility the Hybrid-Surprise Hypothesis. The results of this statistical investigation have implications for both music cognition and the human neural mechanisms of esthetic judgments.
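
    A minimal sketch of the information-theoretic measure used in this record: the surprise of a chord is its self-information, -log2 of its probability under a known distribution, which can then be averaged over a song and compared across sections. The chord distribution and the two sections below are toy placeholders, not the McGill Billboard corpus.

```python
import math
from statistics import mean, pvariance

# Toy chord distribution, standing in for one estimated from a reference corpus
chord_probs = {"I": 0.30, "IV": 0.22, "V": 0.20, "vi": 0.15, "ii": 0.08, "bVII": 0.05}

def chord_surprise(chord):
    """Self-information of a chord in bits: improbable chords are more surprising."""
    return -math.log2(chord_probs[chord])

def average_surprise(chords):
    return mean(chord_surprise(c) for c in chords)

verse = ["I", "V", "vi", "IV", "I", "V", "vi", "IV"]
chorus = ["I", "bVII", "IV", "I", "ii", "V", "bVII", "I"]

song_average = average_surprise(verse + chorus)
section_variance = pvariance([average_surprise(verse), average_surprise(chorus)])
print(f"average surprise: {song_average:.2f} bits; between-section variance: {section_variance:.3f}")
```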

  6. Summit surprises.

    Science.gov (United States)

    Myers, N

    1994-01-01

    A New Delhi Population Summit, organized by the Royal Society, the US National Academy of Sciences, the Royal Swedish Academy of Sciences, and the Indian National Science Academy, was convened with representation of 120 (only 10% women) scientists from 50 countries and about 12 disciplines and 43 national scientific academies. Despite the common assumption that scientists never agree, a 3000 word statement was signed by 50 prominent national figures and supported by 25 professional papers on diverse subjects. The statement proclaimed that stable world population and "prodigious planning efforts" are required for dealing with global social, economic, and environmental problems. The target should be zero population growth by the next generation. The statement, although containing many uncompromising assertions, was not as strong as a statement by the Royal Society and the US National Academy of Sciences released last year: that, in the future, science and technology may not be able to prevent "irreversible degradation of the environment and continued poverty," and that the capacity to sustain life on the planet may be permanently jeopardized. The Delhi statement was backed by professional papers highlighting several important issues. Dr Mahmoud Fathalla of the Rockefeller Foundation claimed that the 500,000 annual maternal deaths worldwide, of which perhaps 33% are due to "coathanger" abortions, are given far less attention than a one-day political event of 500 deaths would receive. Although biologically women have been given a greater survival advantage, which is associated with their reproductive capacity, socially disadvantaged females are relegated to low status. There is poorer nutrition and overall health care for females, female infanticide, and female fetuses are increasingly aborted in China, India, and other countries. The sex ratio in developed countries is 95-97 males to every 100 females, but in developing Asian countries the ratio is 105 males to 100 females. There are reports of 60-100 million missing females. The human species 12,000 years ago had a population of 6 million, a life expectancy of 20 years, and a doubling time of 8000 years; high birth rates were important for preservation of the species. Profertility attitudes are still prevalent today. Insufficient funds go to contraceptive research.

  7. Distinct medial temporal networks encode surprise during motivation by reward versus punishment

    OpenAIRE

    Murty, Vishnu P.; LaBar, Kevin S.; Adcock, R. Alison

    2016-01-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also...

  8. Ignorance, Vulnerability and the Occurrence of "Radical Surprises": Theoretical Reflections and Empirical Findings

    Science.gov (United States)

    Kuhlicke, C.

    2009-04-01

    By definition, natural disasters always contain a moment of surprise. Their occurrence is mostly unforeseen and unexpected. They hit people unprepared, overwhelm them and expose their helplessness. Yet surprisingly little is known about the reasons for this surprise. Aren't natural disasters expectable and foreseeable after all? Aren't the return rates of most hazards well known, and shouldn't people therefore be better prepared? The central question of this presentation is hence: why do natural disasters so often radically surprise people, and how can this surprise be explained? In the first part of the presentation, it is argued that most approaches to vulnerability are unable to grasp this moment of surprise. On the contrary, their strength lies in unravelling the expectable: a person who is marginalized or even oppressed in everyday life is also vulnerable during times of crisis and stress; at least, this is the central assumption of most vulnerability studies. In the second part, an understanding of vulnerability is developed that allows such radical surprises to be taken into account. First, two forms of the unknown are differentiated: an area of the unknown that an actor is more or less aware of (ignorance), and an area that is not even known to be unknown (nescience). The discovery of the latter is mostly associated with a "radical surprise", since it is by definition impossible to prepare for. Second, a definition of vulnerability is proposed that captures the dynamics of surprise: people are vulnerable when they discover nescience that by definition exceeds their previously established routines, stocks of knowledge and resources (in a general sense, their capacities) for dealing with their physical and/or social environment. This definition explicitly takes the views of different actors seriously and starts from their being surprised. In the third part, findings of a case study are presented: the 2002 flood in Germany. It is shown

  9. NR sulphur vulcanization: Interaction study between TBBS and DPG by means of a combined experimental rheometer and meta-model best fitting strategy

    Energy Technology Data Exchange (ETDEWEB)

    Milani, G., E-mail: gabriele.milani@polimi.it [Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy); Hanel, T.; Donetti, R. [Pirelli Tyre, Via Alberto e Piero Pirelli 25, 20126 Milan (Italy); Milani, F. [Chem. Co, Via J.F. Kennedy 2, 45030 Occhiobello (Italy)

    2016-06-08

    The paper is aimed at studying the possible interaction between two different accelerators (DPG and TBBS) in the chemical kinetics of natural rubber (NR) vulcanized with sulphur. The same blend with several DPG and TBBS concentrations is analyzed in depth experimentally, varying the curing temperature in the range 150-180°C and obtaining rheometer curves in steps of 10°C. In order to study any possible interaction between the two accelerators, and to evaluate its engineering relevance, rheometer data are normalized by means of the well-known Sun and Isayev normalization approach, and two output parameters are taken as indicators of possible interaction, namely the time at maximum torque and the reversion percentage. Two different numerical meta-models, which belong to the family of so-called response surfaces (RS), are compared. The first is linear in TBBS and DPG and therefore reproduces the case of no interaction between the accelerators, whereas the second is a non-linear RS with a bilinear term. Both RSs are obtained by standard best fitting of the available experimental data. It is found that, generally, there is some interaction between TBBS and DPG, but that the error introduced by using the linear model (no interaction) is generally lower than 10%, i.e. fully acceptable from an engineering standpoint.
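
    A minimal sketch of the response-surface comparison described in this record: a surface that is linear in the two accelerator concentrations versus one with an added bilinear (interaction) term, both fitted by least squares. The concentrations and output values below are illustrative placeholders, not the paper's rheometer measurements.

```python
import numpy as np

# Placeholder data: accelerator loadings and a normalized rheometer output
# (e.g. time at maximum torque); values are illustrative only.
tbbs = np.array([0.5, 0.5, 1.0, 1.0, 1.5, 1.5, 2.0, 2.0])
dpg = np.array([0.2, 0.8, 0.2, 0.8, 0.2, 0.8, 0.2, 0.8])
y = np.array([0.91, 0.84, 0.78, 0.70, 0.69, 0.60, 0.62, 0.52])

def fit(design, y):
    """Least-squares fit; returns coefficients and a relative RMS error."""
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    residual = y - design @ coef
    return coef, np.sqrt(np.mean(residual ** 2)) / np.mean(np.abs(y))

ones = np.ones_like(tbbs)
# Linear RS (no interaction): y ~ a + b*TBBS + c*DPG
coef_lin, err_lin = fit(np.column_stack([ones, tbbs, dpg]), y)
# Non-linear RS with bilinear term: y ~ a + b*TBBS + c*DPG + d*TBBS*DPG
coef_bil, err_bil = fit(np.column_stack([ones, tbbs, dpg, tbbs * dpg]), y)

print(f"linear RS relative error:   {err_lin:.2%}")
print(f"bilinear RS relative error: {err_bil:.2%}")
print(f"interaction coefficient d:  {coef_bil[3]:+.3f}")
```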

  10. What is a surprise earthquake? The example of the 2002 San Giuliano (Italy) event

    Directory of Open Access Journals (Sweden)

    M. Mucciarelli

    2005-06-01

    Both in the scientific literature and in the mass media, some earthquakes are defined as «surprise earthquakes». Based on their own judgment, probably any geologist, seismologist or engineer has a personal list of past «surprise earthquakes». This paper tries to quantify the underlying individual perception that may lead a scientist to apply such a definition to a seismic event. The meaning is different, depending on the disciplinary approach. For geologists, the Italian database of seismogenic sources is still too incomplete to allow for a quantitative estimate of the subjective degree of belief. For seismologists, quantification is possible by defining the distance between an earthquake and its closest previous neighbor. Finally, for engineers, the San Giuliano quake could not be considered a surprise, since probabilistic site hazard estimates reveal that the change before and after the earthquake is just 4%.
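
    A minimal sketch of the seismological quantification mentioned in this record: the distance from a new earthquake to its closest previously recorded neighbor, computed with the haversine formula over a catalogue of past epicenters. The coordinates below are illustrative placeholders, not actual event locations.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    earth_radius_km = 6371.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * earth_radius_km * asin(sqrt(a))

def distance_to_closest_neighbor(event, catalogue):
    """Smallest epicentral distance from a new event to any past event in the catalogue."""
    return min(haversine_km(event[0], event[1], lat, lon) for lat, lon in catalogue)

# Toy catalogue of past epicenters (lat, lon) and a hypothetical new event
past_events = [(42.10, 13.40), (41.70, 15.10), (43.00, 12.90)]
new_event = (41.72, 14.89)
print(f"closest previous neighbor: {distance_to_closest_neighbor(new_event, past_events):.1f} km")
```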

  11. Conference of “Uncertainty and Surprise: Questions on Working with the Unexpected and Unknowable”

    CERN Document Server

    McDaniel, Reuben R; Uncertainty and Surprise in Complex Systems : Questions on Working with the Unexpected

    2005-01-01

    Complexity science has been a source of new insight in physical and social systems and has demonstrated that unpredictability and surprise are fundamental aspects of the world around us. This book is the outcome of a discussion meeting of leading scholars and critical thinkers with expertise in complex systems sciences and leaders from a variety of organizations, sponsored by the Prigogine Center at The University of Texas at Austin and the Plexus Institute, to explore strategies for understanding uncertainty and surprise. Besides contributions to the conference, it includes a key digest by the editors as well as a commentary by the late Nobel laureate Ilya Prigogine, "Surprises in half of a century". The book is intended for researchers and scientists in complexity science as well as for a broad interdisciplinary audience of both practitioners and scholars. It will serve well those interested in the research issues and in the application of complexity science to physical and social systems.

  12. Salience and attention in surprisal-based accounts of language processing

    Directory of Open Access Journals (Sweden)

    Alessandra Zarcone

    2016-06-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g. visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g. prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalise upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus.

  13. Salience and Attention in Surprisal-Based Accounts of Language Processing.

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus.

  14. Salience and Attention in Surprisal-Based Accounts of Language Processing

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  15. Risk, surprises and black swans fundamental ideas and concepts in risk assessment and risk management

    CERN Document Server

    Aven, Terje

    2014-01-01

    Risk, Surprises and Black Swans provides an in-depth analysis of the risk concept, with a focus on the critical link to the knowledge, and the lack of knowledge, that risk and probability judgements are based on. Based on technical scientific research, this book presents a new perspective to help you understand how to assess and manage surprising, extreme events, known as 'Black Swans'. This approach looks beyond traditional probability-based principles to offer a broader insight into the important aspects of uncertain events and, in doing so, explores ways to manage them.

  16. “Surprise Gift” Purchases of Small Electric Appliances: A Pilot Study

    NARCIS (Netherlands)

    J. Vanhamme (Joëlle); C.J.P.M. de Bont (Cees)

    2005-01-01

    Understanding decision-making processes for gifts is of strategic importance for companies selling small electrical appliances, as gifts account for a large part of their sales. Among all gifts, the ones that are surprising are the most valued by recipients. However, research about

  17. Dealing with unexpected events on the flight deck : A conceptual model of startle and surprise

    NARCIS (Netherlands)

    Landman, H.M.; Groen, E.L.; Paassen, M.M. van; Bronkhorst, A.W.; Mulder, M.

    2017-01-01

    Objective: A conceptual model is proposed in order to explain pilot performance in surprising and startling situations. Background: Today’s debate around loss of control following in-flight events and the implementation of upset prevention and recovery training has highlighted the importance of

  18. Bagpipes and Artichokes: Surprise as a Stimulus to Learning in the Elementary Music Classroom

    Science.gov (United States)

    Jacobi, Bonnie Schaffhauser

    2016-01-01

    Incorporating surprise into music instruction can stimulate student attention, curiosity, and interest. Novelty focuses attention in the reticular activating system, increasing the potential for brain memory storage. Elementary ages are ideal for introducing novel instruments, pieces, composers, or styles of music. Young children have fewer…

  19. The Educational Philosophies of Mordecai Kaplan and Michael Rosenak: Surprising Similarities and Illuminating Differences

    Science.gov (United States)

    Schein, Jeffrey; Caplan, Eric

    2014-01-01

    The thoughts of Mordecai Kaplan and Michael Rosenak present surprising commonalities as well as illuminating differences. Similarities include the perception that Judaism and Jewish education are in crisis, the belief that Jewish peoplehood must include commitment to meaningful content, the need for teachers to teach from a position of…

  20. Models of Automation surprise : results of a field survey in aviation

    NARCIS (Netherlands)

    De Boer, Robert; Dekker, Sidney

    2017-01-01

    Automation surprises in aviation continue to be a significant safety concern and the community’s search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration

  1. Decision-making under surprise and uncertainty: Arsenic contamination of water supplies

    Science.gov (United States)

    Randhir, Timothy O.; Mozumder, Pallab; Halim, Nafisa

    2018-05-01

    With ignorance and potential surprise dominating decision making in water resources, a framework for dealing with such uncertainty is a critical need in hydrology. We operationalize the 'potential surprise' criterion proposed by Shackle, Vickers, and Katzner (SVK) to derive decision rules to manage water resources under uncertainty and ignorance. We apply this framework to managing water supply systems in Bangladesh that face severe, naturally occurring arsenic contamination. The uncertainty involved with arsenic in water supplies makes the application of conventional analysis of decision-making ineffective. Given the uncertainty and surprise involved in such cases, we find that optimal decisions tend to favor actions that avoid irreversible outcomes instead of conventional cost-effective actions. We observe that a diversification of the water supply system also emerges as a robust strategy to avert unintended outcomes of water contamination. Shallow wells had a slightly higher optimal level (36%) compared to deep wells and surface treatment, which had allocation levels of roughly 32% each. The approach can be applied in a variety of other cases that involve decision making under uncertainty and surprise, a frequent situation in natural resources management.

  2. Surprising results: HIV testing and changes in contraceptive practices among young women in Malawi

    Science.gov (United States)

    Sennott, Christie; Yeatman, Sara

    2015-01-01

    This study uses eight waves of data from the population-based Tsogolo la Thanzi study (2009–2011) in rural Malawi to examine changes in young women’s contraceptive practices, including the use of condoms, non-barrier contraceptive methods, and abstinence, following positive and negative HIV tests. The analysis factors in women’s prior perceptions of their HIV status that may already be shaping their behaviour and separates surprise HIV test results from those that merely confirm what was already believed. Fixed effects logistic regression models show that HIV testing frequently affects the contraceptive practices of young Malawian women, particularly when the test yields an unexpected result. Specifically, women who are surprised to test HIV positive increase their condom use and are more likely to use condoms consistently. Following an HIV negative test (whether a surprise or expected), women increase their use of condoms and decrease their use of non-barrier contraceptives; the latter may be due to an increase in abstinence following a surprise negative result. Changes in condom use following HIV testing are robust to the inclusion of potential explanatory mechanisms including fertility preferences, relationship status, and the perception that a partner is HIV positive. The results demonstrate that both positive and negative tests can influence women’s sexual and reproductive behaviours, and emphasise the importance of conceptualizing HIV testing as offering new information only insofar as results deviate from prior perceptions of HIV status. PMID:26160156

  3. Surprise, Memory, and Retrospective Judgment Making: Testing Cognitive Reconstruction Theories of the Hindsight Bias Effect

    Science.gov (United States)

    Ash, Ivan K.

    2009-01-01

    Hindsight bias has been shown to be a pervasive and potentially harmful decision-making bias. A review of 4 competing cognitive reconstruction theories of hindsight bias revealed conflicting predictions about the role and effect of expectation or surprise in retrospective judgment formation. Two experiments tested these predictions examining the…

  4. Successful ageing

    DEFF Research Database (Denmark)

    Bülow, Morten Hillgaard; Söderqvist, Thomas

    2014-01-01

    Since the late 1980s, the concept of ‘successful ageing’ has set the frame for discourse about contemporary ageing research. Through an analysis of the reception to John W. Rowe and Robert L. Kahn's launch of the concept of ‘successful ageing’ in 1987, this article maps out the important themes... and discussions that have emerged from the interdisciplinary field of ageing research. These include an emphasis on interdisciplinarity; the interaction between biology, psycho-social contexts and lifestyle choices; the experiences of elderly people; life-course perspectives; optimisation and prevention... strategies; and the importance of individual, societal and scientific conceptualisations and understandings of ageing. By presenting an account of the recent historical uses, interpretations and critiques of the concept, the article unfolds the practical and normative complexities of ‘successful ageing’...

  5. Citation Success

    DEFF Research Database (Denmark)

    Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis

    2012-01-01

    This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history...... find similar patterns when assessing the same authors' citation success in economics journals. As a novel feature, we demonstrate that the diffusion of research — publication of working papers, as well as conference and workshop presentations — has a first-order positive impact on the citation rate........ Consistent with our expectations, we find that full professors, authors appointed at economics and history departments, and authors working in Anglo-Saxon and German countries are more likely to receive citations than other scholars. Long and co-authored articles are also a factor for citation success. We...

  6. Citation Success

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis

    affects citations. In regard to author-specific characteristics, male authors, full professors and authors working in economics or history departments, and authors employed in Anglo-Saxon countries, are more likely to get cited than others. As a ‘shortcut’ to citation success, we find that research diffusion...

  7. Successful modeling?

    Science.gov (United States)

    Lomnitz, Cinna

    Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.

  8. Successful ageing

    DEFF Research Database (Denmark)

    Kusumastuti, Sasmita; Derks, Marloes G. M.; Tellier, Siri

    2016-01-01

    BACKGROUND: Ageing is accompanied by an increased risk of disease and a loss of functioning on several bodily and mental domains and some argue that maintaining health and functioning is essential for a successful old age. Paradoxically, studies have shown that overall wellbeing follows a curvili...

  9. Models of Automation Surprise: Results of a Field Survey in Aviation

    Directory of Open Access Journals (Sweden)

    Robert De Boer

    2017-09-01

    Automation surprises in aviation continue to be a significant safety concern and the community’s search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration with automation. In this paper, we report the results of a field study that empirically compared and contrasted two models of automation surprises: a normative individual-cognition model and a sensemaking model based on distributed cognition. Our data prove a good fit for the sensemaking model. This finding is relevant for aviation safety, since our understanding of the cognitive processes that govern human interaction with automation drives what we need to do to reduce the frequency of automation-induced events.

  10. Human Amygdala Tracks a Feature-Based Valence Signal Embedded within the Facial Expression of Surprise.

    Science.gov (United States)

    Kim, M Justin; Mattek, Alison M; Bennett, Randi H; Solomon, Kimberly M; Shin, Jin; Whalen, Paul J

    2017-09-27

    Human amygdala function has been traditionally associated with processing the affective valence (negative vs positive) of an emotionally charged event, especially those that signal fear or threat. However, this account of human amygdala function can be explained by alternative views, which posit that the amygdala might be tuned to either (1) general emotional arousal (activation vs deactivation) or (2) specific emotion categories (fear vs happy). Delineating the pure effects of valence independent of arousal or emotion category is a challenging task, given that these variables naturally covary under many circumstances. To circumvent this issue and test the sensitivity of the human amygdala to valence values specifically, we measured the dimension of valence within the single facial expression category of surprise. Given the inherent valence ambiguity of this category, we show that surprised expression exemplars are attributed valence and arousal values that are uniquely and naturally uncorrelated. We then present fMRI data from both sexes, showing that the amygdala tracks these consensus valence values. Finally, we provide evidence that these valence values are linked to specific visual features of the mouth region, isolating the signal by which the amygdala detects this valence information. SIGNIFICANCE STATEMENT There is an open question as to whether human amygdala function tracks the valence value of cues in the environment, as opposed to either a more general emotional arousal value or a more specific emotion category distinction. Here, we demonstrate the utility of surprised facial expressions because exemplars within this emotion category take on valence values spanning the dimension of bipolar valence (positive to negative) at a consistent level of emotional arousal. Functional neuroimaging data showed that amygdala responses tracked the valence of surprised facial expressions, unconfounded by arousal. Furthermore, a machine learning classifier identified

  11. Prediction, Expectation, and Surprise: Methods, Designs, and Study of a Deployed Traffic Forecasting Service

    OpenAIRE

    Horvitz, Eric J.; Apacible, Johnson; Sarin, Raman; Liao, Lin

    2012-01-01

    We present research on developing models that forecast traffic flow and congestion in the Greater Seattle area. The research has led to the deployment of a service named JamBayes, that is being actively used by over 2,500 users via smartphones and desktop versions of the system. We review the modeling effort and describe experiments probing the predictive accuracy of the models. Finally, we present research on building models that can identify current and future surprises, via efforts on mode...

  12. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    Science.gov (United States)

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs with presentation of happy eyes elicited a larger P1 than those for neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited dramatically larger C1 than those in the neutral context, which reflected the modulation by predictions during the earliest stages of face processing. There were larger N170 with neutral and fearful eye contexts compared to the happy context, suggesting faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as predictions and attention can modulate face processing from the earliest stages in a top-down manner. © 2017 Society for Psychophysiological Research.

  13. Analysis of physiological signals for recognition of boredom, pain, and surprise emotions.

    Science.gov (United States)

    Jang, Eun-Hye; Park, Byoung-Jun; Park, Mi-Sook; Kim, Sang-Hyeob; Sohn, Jin-Hun

    2015-06-18

    The aim of the study was to examine the differences among boredom, pain, and surprise and to propose approaches for emotion recognition based on physiological signals. The three emotions (boredom, pain, and surprise) are induced through the presentation of emotional stimuli, and electrocardiography (ECG), electrodermal activity (EDA), skin temperature (SKT), and photoplethysmography (PPG) are measured as physiological signals to collect a dataset from 217 participants experiencing the emotions. Twenty-seven physiological features are extracted from the signals to classify the three emotions. Discriminant function analysis (DFA), as a statistical method, and five machine learning algorithms (linear discriminant analysis (LDA), classification and regression trees (CART), self-organizing map (SOM), the Naïve Bayes algorithm, and support vector machine (SVM)) are used for classifying the emotions. The results show that the differences in physiological responses among the emotions are significant for heart rate (HR), skin conductance level (SCL), skin conductance response (SCR), mean skin temperature (meanSKT), blood volume pulse (BVP), and pulse transit time (PTT), and the highest recognition accuracy of 84.7% is obtained by using DFA.
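
    A minimal sketch of the classification step described in this record: cross-validated accuracy of two of the named classifiers (LDA and SVM) on a matrix of physiological features. The feature matrix and labels below are synthetic placeholders, not the study's dataset of 217 participants.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in: 217 participants x 27 physiological features,
# with labels 0 = boredom, 1 = pain, 2 = surprise.
X = rng.normal(size=(217, 27))
y = rng.integers(0, 3, size=217)

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.3f}")
```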

  14. Spatiotemporal neural characterization of prediction error valence and surprise during reward learning in humans.

    Science.gov (United States)

    Fouragnan, Elsa; Queirazza, Filippo; Retzler, Chris; Mullinger, Karen J; Philiastides, Marios G

    2017-07-06

    Reward learning depends on accurate reward associations with potential choices. These associations can be attained with reinforcement learning mechanisms using a reward prediction error (RPE) signal (the difference between actual and expected rewards) for updating future reward expectations. Despite an extensive body of literature on the influence of RPE on learning, little has been done to investigate the potentially separate contributions of RPE valence (positive or negative) and surprise (absolute degree of deviation from expectations). Here, we coupled single-trial electroencephalography with simultaneously acquired fMRI, during a probabilistic reversal-learning task, to offer evidence of temporally overlapping but largely distinct spatial representations of RPE valence and surprise. Electrophysiological variability in RPE valence correlated with activity in regions of the human reward network promoting approach or avoidance learning. Electrophysiological variability in RPE surprise correlated primarily with activity in regions of the human attentional network controlling the speed of learning. Crucially, despite the largely separate spatial extent of these representations, our EEG-informed fMRI approach uniquely revealed a linear superposition of the two RPE components in a smaller network encompassing visuo-mnemonic and reward areas. Activity in this network was further predictive of stimulus value updating, indicating a comparable contribution of both signals to reward learning.
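
    A minimal sketch of the decomposition used in this record: a reward prediction error (actual minus expected reward) split into its valence (sign) and surprise (absolute deviation), here inside a simple Rescorla-Wagner value update. The learning rate and reward sequence are illustrative assumptions, not parameters from the study.

```python
def rw_update(expected, reward, alpha=0.1):
    """One Rescorla-Wagner step, returning the new value estimate
    together with the RPE's valence and surprise components."""
    rpe = reward - expected             # reward prediction error
    valence = 1 if rpe >= 0 else -1     # positive vs negative RPE
    surprise = abs(rpe)                 # unsigned deviation from expectation
    return expected + alpha * rpe, valence, surprise

# Example: the value estimate drifts toward the delivered rewards over trials
value = 0.5
for reward in [1.0, 0.0, 1.0, 1.0, 0.0]:
    value, valence, surprise = rw_update(value, reward)
    print(f"value={value:.3f}  valence={valence:+d}  surprise={surprise:.3f}")
```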

  15. A development process meta-model for Web based expert systems: The Web engineering point of view

    DEFF Research Database (Denmark)

    Dokas, I.M.; Alapetite, Alexandre

    2006-01-01

    raised their complexity. Unfortunately, there is so far no clear answer to the question: How may the methods and experience of Web engineering and expert systems be combined and applied in order to develop effective and successful Web based expert systems? In an attempt to answer this question... on Web based expert systems – will be presented. The idea behind the presentation of the accessibility evaluation and its conclusions is to show to Web based expert system developers, who typically have little Web engineering background, that Web engineering issues must be considered when developing Web... Similar to many legacy computer systems, expert systems can be accessed via the Web, forming a set of Web applications known as Web based expert systems. The tough Web competition, the way people and organizations rely on Web applications and the increasing user requirements for better services have...

  16. 'Surprise': Outbreak of Campylobacter infection associated with chicken liver pâté at a surprise birthday party, Adelaide, Australia, 2012.

    Science.gov (United States)

    Parry, Amy; Fearnley, Emily; Denehy, Emma

    2012-10-01

    In July 2012, an outbreak of Campylobacter infection was investigated by the South Australian Communicable Disease Control Branch and Food Policy and Programs Branch. The initial notification identified illness at a surprise birthday party held at a restaurant on 14 July 2012. The objective of the investigation was to identify the potential source of infection and institute appropriate intervention strategies to prevent further illness. A guest list was obtained and a retrospective cohort study undertaken. A combination of paper-based and telephone questionnaires was used to collect exposure and outcome information. An environmental investigation was conducted by the Food Policy and Programs Branch at the implicated premises. All 57 guests completed the questionnaire (100% response rate), and 15 met the case definition. Analysis showed a significant association between illness and consumption of chicken liver pâté (relative risk: 16.7, 95% confidence interval: 2.4-118.6). No other food or beverage served at the party was associated with illness. Three guests submitted stool samples; all were positive for Campylobacter. The environmental investigation identified that the cooking process used in the preparation of chicken liver pâté may have been inconsistent, resulting in some portions not cooked adequately to inactivate potential Campylobacter contamination. Chicken liver products are a known source of Campylobacter infection; therefore, education of food handlers remains a high priority. To better identify outbreaks among the large number of Campylobacter notifications, routine typing of Campylobacter isolates is recommended.

  17. Surprising transformation of a block copolymer into a high performance polystyrene ultrafiltration membrane with a hierarchically organized pore structure

    KAUST Repository

    Shevate, Rahul

    2018-02-08

    We describe the preparation of hierarchical polystyrene nanoporous membranes with a very narrow pore size distribution and an extremely high porosity. The nanoporous structure is formed as a result of unusual degradation of the poly(4-vinyl pyridine) block from self-assembled poly(styrene)-b-poly(4-vinyl pyridine) (PS-b-P4VP) membranes through the formation of an unstable pyridinium intermediate in an alkaline medium. During this process, the confined swelling and controlled degradation produced a tunable pore size. We unequivocally confirmed the successful elimination of the P4VP block from a PS-b-P4VP membrane using 1D/2D NMR spectroscopy and other characterization techniques. Surprisingly, the long range ordered surface porosity was preserved even after degradation of the P4VP block from the main chain of the diblock copolymer, as revealed by SEM. Aside from a drastically improved water flux (∼67% increase) compared to the PS-b-P4VP membrane, the hydraulic permeability measurements validated pH independent behaviour of the isoporous PS membrane over a wide pH range from 3 to 10. The effect of the pore size on protein transport rate and selectivity was investigated for lysozyme (Lys), bovine serum albumin (BSA) and globulin-γ (IgG). A high selectivity of 42 (Lys/IgG) and 30 (BSA/IgG) was attained, making the membranes attractive for size selective separation of biomolecules from their synthetic model mixture solutions.

  18. How to reach clients of female sex workers: a survey by surprise in brothels in Dakar, Senegal.

    Science.gov (United States)

    Espirito Santo, M. E. Gomes do; Etheredge, G. D.

    2002-01-01

    OBJECTIVE: To describe the sampling techniques and survey procedures used in identifying male clients who frequent brothels to buy sexual services from female sex workers in Dakar, Senegal, with the aim of measuring the prevalence of human immunodeficiency virus (HIV) infection and investigating related risk behaviours. METHODS: Surveys were conducted in seven brothels in Dakar, Senegal. Clients were identified "by surprise" and interviewed and requested to donate saliva for HIV testing. RESULTS: Of the 1450 clients of prostitutes who were solicited to enter the study, 1140 (79.8%) agreed to be interviewed; 1083 (95%) of these clients provided saliva samples for testing. Of the samples tested, 47 were positive for HIV-1 or HIV-2, giving an HIV prevalence of 4.4%. CONCLUSION: The procedures adopted were successful in reaching the target population. Men present in the brothels could not deny being there, and it proved possible to explain the purpose of the study and to gain their confidence. Collection of saliva samples was shown to be an excellent method for performing HIV testing in difficult field conditions where it is hard to gain access to the population under study. The surveying of prostitution sites is recommended as a means of identifying core groups for HIV infection with a view to targeting education programmes more effectively. In countries such as Senegal, where the prevalence of HIV infection is still low, interventions among commercial sex workers and their clients may substantially delay the onset of a larger epidemic in the general population. PMID:12378288

  19. The influence of the surprising decay properties of element 108 on search experiments for new elements

    International Nuclear Information System (INIS)

    Hofmann, S.; Armbruster, P.; Muenzenberg, G.; Reisdorf, W.; Schmidt, K.H.; Burkhard, H.G.; Hessberger, F.P.; Schoett, H.J.; Agarwal, Y.K.; Berthes, G.; Gollerthan, U.; Folger, H.; Hingmann, J.G.; Keller, J.G.; Leino, M.E.; Lemmertz, P.; Montoya, M.; Poppensieker, K.; Quint, B.; Zychor, I.

    1986-01-01

    Results of experiments to synthesize the heaviest elements are reported. Surprising is the high stability against fission not only of the odd and odd-odd nuclei but also of even isotopes of even elements. Alpha decay data gave an increasing stability of nuclei by shell effects up to ²⁶⁶109, the heaviest known element. Theoretically, the high stability is explained by an island of nuclei with large quadrupole and hexadecapole deformations around Z=109 and N=162. Future experiments will be planned to prove the island character of these heavy nuclei. (orig.)

  20. Surprisal analysis of Glioblastoma Multiform (GBM) microRNA dynamics unveils tumor specific phenotype.

    Science.gov (United States)

    Zadran, Sohila; Remacle, Francoise; Levine, Raphael

    2014-01-01

    Glioblastoma multiforme (GBM) is the most fatal form of all brain cancers in humans. Currently there are limited diagnostic tools for GBM detection. Here, we applied surprisal analysis, a theory grounded in thermodynamics, to unveil how biomolecule energetics, specifically a redistribution of free energy amongst microRNAs (miRNAs), results in a system deviating from a non-cancer state to the GBM cancer-specific phenotypic state. Utilizing global miRNA microarray expression data from normal and GBM patient tumors, surprisal analysis characterizes a miRNA system response capable of distinguishing GBM samples from normal tissue biopsy samples. We indicate that the miRNAs contributing to this system behavior define a disease phenotypic state specific to GBM and therefore constitute a unique GBM-specific thermodynamic signature. MiRNAs implicated in the regulation of stochastic signaling processes crucial in the hallmarks of human cancer dominate this GBM-cancer phenotypic state. With this theory, we were able to distinguish with high fidelity GBM patients solely by monitoring the dynamics of miRNAs present in patients' biopsy samples. We anticipate that the GBM-specific thermodynamic signature will provide a critical translational tool in better characterizing cancer types and in the development of future therapeutics for GBM.
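    As a rough illustration of what surprisal analysis does with expression data, the sketch below follows the usual implementation route: an SVD of the log-transformed expression matrix, whose leading pattern approximates the balanced (free-energy minimum) state and whose later patterns act as constraint/phenotype signatures that can separate sample groups. The toy data, matrix sizes and variable names are assumptions; this is not the authors' pipeline.

```python
# Hedged sketch of the surprisal-analysis decomposition applied to a toy miRNA matrix.
import numpy as np

rng = np.random.default_rng(1)
expr = rng.lognormal(mean=5.0, sigma=1.0, size=(200, 20))  # 200 miRNAs x 20 samples (toy data)

log_expr = np.log(expr)
U, s, Vt = np.linalg.svd(log_expr, full_matrices=False)

# lambda_alpha(k): sample-dependent "constraint potentials"; G_i_alpha: miRNA weights
lambdas = np.diag(s) @ Vt          # one row per pattern, one column per sample
G = U                              # per-miRNA participation in each pattern

# The first constraint beyond the steady state (alpha = 1) is the candidate
# disease-specific signature; its sign/amplitude per sample is what separates groups.
print("alpha=1 potential per sample:", np.round(lambdas[1], 2))
```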

  1. Cloud Surprises in Moving NASA EOSDIS Applications into Amazon Web Services

    Science.gov (United States)

    Mclaughlin, Brett

    2017-01-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected is a number of issues that were beyond purely technical application re-architectures. We ran into surprising network policy limitations, billing challenges in a government-based cost model, and difficulty in obtaining certificates in a NASA security-compliant manner. On the other hand, this approach has allowed us to move a number of applications from local hosting to the cloud in a matter of hours (yes, hours!!), and our CMR application now services 95% of granule searches and an astonishing 99% of all collection searches in under a second. And most surprising of all, well, you'll just have to wait and see the realization that caught our entire team off guard!

  2. Successful Aging

    Directory of Open Access Journals (Sweden)

    Taufiqurrahman Nasihun

    2015-06-01

    Full Text Available The emerging concept of successful aging is based on evidence that in healthy individuals, as they age, there are considerable variations in the alteration of physiological functions. Some people exhibit substantial age-related alterations, but others very few or none. The first is called poor aging and the latter the successful pattern of aging (Lambert SW, 2008). Thus, in simple words, the successful aging concept is defined as the opportunity for old people to stay in an active and productive condition despite aging chronologically. Aging itself might be defined as the progressive accumulation of changes with time, associated with or responsible for the ever-increasing susceptibility to disease and death which accompanies advancing age (Harman D, 1981). The time needed to accumulate changes is attributable to the aging process. The marked emerging questions are: how does aging happen, and where does aging start? To answer these questions, and because of the complexity of the aging process, more than 300 aging theories have been proposed to explain how and where aging occurs and starts, respectively. The theories and classifications of the aging process are too many to enumerate. In summary, all of these aging theories can be grouped into three clusters: 1. Genetic program theory, which suggests that aging results from a program directed by the genes; 2. Epigenetic theory, in which aging results from random environmental events not determined by the genes; 3. Evolutionary theory, which proposes that aging is a means of disposing of the mortal soma in order to avoid competition between organisms and their progeny for food and space; it does not try to explain how aging occurs, but possibly answers why aging occurs (De la Fuente, 2009). Among the three groups of aging theories, the epigenetic theory is useful for explaining and trying to solve the enigma of aging, which is prominently caused by internal and external environmental influences

  3. Vascular legacy: HOPE ADVANCEs to EMPA-REG and LEADER: A Surprising similarity

    Directory of Open Access Journals (Sweden)

    Sanjay Kalra

    2017-01-01

    Full Text Available Recently reported cardiovascular outcome studies on empagliflozin (EMPA-REG) and liraglutide (LEADER) have spurred interest in this field of diabetology. This commentary compares and contrasts these studies with two equally important outcome trials conducted using blood pressure lowering agents. A comparison with the MICROHOPE (using ramipril) and ADVANCE (using perindopril + indapamide) blood pressure arms throws up interesting facts. The degree of blood pressure lowering, dissociation between cardiovascular and cerebrovascular benefits, and discordance between renal and retinal outcomes are surprisingly similar in these trials, conducted using disparate molecules. The time taken to achieve such benefits is similar for all drugs except empagliflozin. Such discussion helps inform rational and evidence-based choice of therapy and forms the framework for future research.

  4. Self-organizing weights for Internet AS-graphs and surprisingly simple routing metrics

    DEFF Research Database (Denmark)

    Scholz, Jan Carsten; Greiner, Martin

    2011-01-01

    The transport capacity of Internet-like communication networks and hence their efficiency may be improved by a factor of 5–10 through the use of highly optimized routing metrics, as demonstrated previously. The numerical determination of such routing metrics can be computationally demanding...... to an extent that prohibits both investigation of and application to very large networks. In an attempt to find a numerically less expensive way of constructing a metric with a comparable performance increase, we propose a local, self-organizing iteration scheme and find two surprisingly simple and efficient...... metrics. The new metrics have negligible computational cost and result in an approximately 5-fold performance increase, providing distinguished competitiveness with the computationally costly counterparts. They are applicable to very large networks and easy to implement in today's Internet routing...

  5. Probing Critical Point Energies of Transition Metal Dichalcogenides: Surprising Indirect Gap of Single Layer WSe2

    KAUST Repository

    Zhang, Chendong

    2015-09-21

    By using a comprehensive form of scanning tunneling spectroscopy, we have revealed detailed quasi-particle electronic structures in transition metal dichalcogenides, including the quasi-particle gaps, critical point energy locations, and their origins in the Brillouin zones. We show that single layer WSe2 surprisingly has an indirect quasi-particle gap with the conduction band minimum located at the Q-point (instead of K), albeit the two states are nearly degenerate. We have further observed rich quasi-particle electronic structures of transition metal dichalcogenides as a function of atomic structures and spin-orbit couplings. Such a local probe for detailed electronic structures in conduction and valence bands will be ideal to investigate how electronic structures of transition metal dichalcogenides are influenced by variations of local environment.

  6. Probing Critical Point Energies of Transition Metal Dichalcogenides: Surprising Indirect Gap of Single Layer WSe2

    KAUST Repository

    Zhang, Chendong; Chen, Yuxuan; Johnson, Amber; Li, Ming-yang; Li, Lain-Jong; Mende, Patrick C.; Feenstra, Randall M.; Shih, Chih Kang

    2015-01-01

    By using a comprehensive form of scanning tunneling spectroscopy, we have revealed detailed quasi-particle electronic structures in transition metal dichalcogenides, including the quasi-particle gaps, critical point energy locations, and their origins in the Brillouin zones. We show that single layer WSe2 surprisingly has an indirect quasi-particle gap with the conduction band minimum located at the Q-point (instead of K), albeit the two states are nearly degenerate. We have further observed rich quasi-particle electronic structures of transition metal dichalcogenides as a function of atomic structures and spin-orbit couplings. Such a local probe for detailed electronic structures in conduction and valence bands will be ideal to investigate how electronic structures of transition metal dichalcogenides are influenced by variations of local environment.

  7. Surprising judgments about robot drivers: Experiments on rising expectations and blaming humans

    Directory of Open Access Journals (Sweden)

    Peter Danielson

    2015-05-01

    Full Text Available N-Reasons is an experimental Internet survey platform designed to enhance public participation in applied ethics and policy. N-Reasons encourages individuals to generate reasons to support their judgments, and groups to converge on a common set of reasons pro and con various issues. In the Robot Ethics Survey some of the reasons contributed surprising judgments about autonomous machines. Presented with a version of the trolley problem with an autonomous train as the agent, participants gave unexpected answers, revealing high expectations for the autonomous machine and shifting blame from the automated device to the humans in the scenario. Further experiments with a standard pair of human-only trolley problems refine these results. While showing the high expectations even when no autonomous machine is involved, human bystanders are only blamed in the machine case. A third experiment explicitly aimed at responsibility for driverless cars confirms our findings about shifting blame in the case of autonomous machine agents. We conclude methodologically that both results point to the power of an experimental survey based approach to public participation to explore surprising assumptions and judgments in applied ethics. However, both results also support using caution when interpreting survey results in ethics, demonstrating the importance of qualitative data to provide further context for evaluating judgments revealed by surveys. On the ethics side, the result about shifting blame to humans interacting with autonomous machines suggests caution about the unintended consequences of intuitive principles requiring human responsibility. http://dx.doi.org/10.5324/eip.v9i1.1727

  8. SPEM: Software Process Engineering Metamodel

    OpenAIRE

    Víctor Hugo Menéndez Domínguez; María Enriqueta Castellanos Bolaños

    2015-01-01

    All organizations involved in software development need to establish, manage and support the development work. The term "software development process" tends to unify all the activities and practices that cover those needs. Modelling the software process is one way to improve the development and the quality of the resulting applications. Among all the existing languages for process modelling, those based on work products are the...

  9. Hillslope, river, and Mountain: some surprises in Landscape evolution (Ralph Alger Bagnold Medal Lecture)

    Science.gov (United States)

    Tucker, G. E.

    2012-04-01

    Geomorphology, like the rest of geoscience, has always had two major themes: a quest to understand the earth's history and 'products' - its landscapes and seascapes - and, in parallel, a quest to understand its formative processes. This dualism is manifest in the remarkable career of R. A. Bagnold, who was inspired by landforms such as dunes, and dedicated to understanding the physical processes that shaped them. His legacy inspires us to emulate two principles at the heart of his contributions: the benefits of rooting geomorphic theory in basic physics, and the importance of understanding geomorphic systems in terms of simple equations framed around energy or force. Today, following Bagnold's footsteps, the earth-surface process community is engaged in a quest to build, test, and refine an ever-improving body of theory to describe our planet's surface and its evolution. In this lecture, I review a small sample of some of the fruits of that quest, emphasizing the value of surprises encountered along the way. The first example involves models of long-term river incision into bedrock. When the community began to grapple with how to represent this process mathematically, several different ideas emerged. Some were based on the assumption that sediment transport is the limiting factor; others assumed that hydraulic stress on rock is the key, while still others treated rivers as first-order 'reactors.' Thanks in part to advances in digital topography and numerical computing, the predictions of these models can be tested using natural-experiment case studies. Examples from the King Range, USA, the Central Apennines, Italy, and the fold-thrust belt of Taiwan, illustrate that independent knowledge of history and/or tectonics makes it possible to quantify how the rivers have responded to external forcing. Some interesting surprises emerge, such as: that the relief-uplift relationship can be highly nonlinear in a steady-state landscape because of grain-entrainment thresholds
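    One of the competing incision rules alluded to above, the detachment-limited stream power family, can be written compactly; the sketch below evaluates E = K * A^m * S^n with an erosion threshold of the kind invoked for the nonlinear relief-uplift relation. All parameter values are assumed for illustration, not calibrated to the King Range, Apennine or Taiwan case studies.

```python
# Hedged illustration of a detachment-limited stream power incision rule with a threshold.
import numpy as np

K, m, n = 1e-5, 0.5, 1.0          # erodibility and area/slope exponents (assumed)
threshold = 2e-4                  # critical stream-power term (assumed)

A = np.logspace(6, 9, 4)                       # drainage area along a profile, m^2
S = np.array([0.20, 0.10, 0.05, 0.02])         # local channel slope

stream_power = K * A**m * S**n
E = np.maximum(stream_power - threshold, 0.0)  # incision rate, zero below threshold
print(np.round(E, 6))
```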

  10. Successful and unsuccessful psychopaths: a neurobiological model.

    Science.gov (United States)

    Gao, Yu; Raine, Adrian

    2010-01-01

    Despite increasing interest in psychopathy research, surprisingly little is known about the etiology of non-incarcerated, successful psychopaths. This review provides an analysis of current knowledge on the similarities and differences between successful and unsuccessful psychopaths derived from five population sources: community samples, individuals from employment agencies, college students, industrial psychopaths, and serial killers. An initial neurobiological model of successful and unsuccessful psychopathy is outlined. It is hypothesized that successful psychopaths have intact or enhanced neurobiological functioning that underlies their normal or even superior cognitive functioning, which in turn helps them to achieve their goals using more covert and nonviolent methods. In contrast, in unsuccessful, caught psychopaths, brain structural and functional impairments together with autonomic nervous system dysfunction are hypothesized to underlie cognitive and emotional deficits and more overt violent offending.

  11. Cerebral metastasis masquerading as cerebritis: A case of misguiding history and radiological surprise!

    Directory of Open Access Journals (Sweden)

    Ashish Kumar

    2013-01-01

    Full Text Available Cerebral metastases usually have a characteristic radiological appearance. They can be differentiated rather easily from any infective etiology. Similarly, a positive medical history also guides the neurosurgeon towards the possible diagnosis and adds to the diagnostic armamentarium. However, occasionally, similarities on imaging may be encountered where even the history could lead us in the wrong direction and tends to bias the clinician. We report a case of a 40-year-old female with a history of mastoidectomy for otitis media presenting to us with a space occupying lesion in the right parietal region, which was thought pre-operatively to be an abscess with surrounding cerebritis. Surprisingly, the histopathology proved it to be a metastatic adenocarcinoma. Hence, while a ring enhancing lesion may be a high grade neoplasm/metastasis/abscess, significant gyral enhancement, a feature of cerebritis, is less often linked with a neoplastic etiology. This may lead to delayed diagnosis, incorrect prognostication and treatment in patients having a coincidental suggestive history of infection. We review the literature and highlight the key points helping to differentiate an infective from a neoplastic pathology, which may look similar at times.

  12. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high-relative to low-cognitive load. Our findings suggest that statistical learning is not a capacity limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
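    A minimal way to see why a physically identical outlier is more surprising under the narrow distribution is to score each tone by its negative log-likelihood under the generating Gaussian. The sketch below does exactly that with assumed frequencies and widths; it illustrates the statistical idea, not the study's EEG analysis.

```python
# Surprise as negative log-likelihood of a tone under a Gaussian stimulus distribution.
import math

def surprise(x, mu, sigma):
    # -log N(x | mu, sigma^2)
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)

outlier_hz, mu = 700.0, 500.0
print(surprise(outlier_hz, mu, sigma=50.0))   # narrow distribution -> larger surprise
print(surprise(outlier_hz, mu, sigma=150.0))  # broad distribution  -> smaller surprise
```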

  13. Pseudohalide (SCN(-))-Doped MAPbI3 Perovskites: A Few Surprises.

    Science.gov (United States)

    Halder, Ansuman; Chulliyil, Ramya; Subbiah, Anand S; Khan, Tuhin; Chattoraj, Shyamtanu; Chowdhury, Arindam; Sarkar, Shaibal K

    2015-09-03

    Pseudohalide thiocyanate anion (SCN(-)) has been used as a dopant in a methylammonium lead tri-iodide (MAPbI3) framework, aiming for its use as an absorber layer for photovoltaic applications. The substitution of SCN(-) pseudohalide anion, as verified using Fourier transform infrared (FT-IR) spectroscopy, results in a comprehensive effect on the optical properties of the original material. Photoluminescence measurements at room temperature reveal a significant enhancement in the emission quantum yield of MAPbI3-x(SCN)x as compared to MAPbI3, suggestive of suppression of nonradiative channels. This increased intensity is attributed to a highly edge specific emission from MAPbI3-x(SCN)x microcrystals as revealed by photoluminescence microscopy. Fluorescence lifetime imaging measurements further established contrasting carrier recombination dynamics for grain boundaries and the bulk of the doped material. Spatially resolved emission spectroscopy on individual microcrystals of MAPbI3-x(SCN)x reveals that the optical bandgap and density of states at various (local) nanodomains are also nonuniform. Surprisingly, several (local) emissive regions within MAPbI3-x(SCN)x microcrystals are found to be optically unstable under photoirradiation, and display unambiguous temporal intermittency in emission (blinking), which is extremely unusual and intriguing. We find diverse blinking behaviors for the undoped MAPbI3 crystals as well, which leads us to speculate that blinking may be a common phenomenon for most hybrid perovskite materials.

  14. Surprises from a Deep ASCA Spectrum of the Broad Absorption Line Quasar PHL 5200

    Science.gov (United States)

    Mathur, Smita; Matt, G.; Green, P. J.; Elvis, M.; Singh, K. P.

    2002-01-01

    We present a deep (approx. 85 ks) ASCA observation of the prototype broad absorption line quasar (BALQSO) PHL 5200. This is the best X-ray spectrum of a BALQSO yet. We find the following: (1) The source is not intrinsically X-ray weak. (2) The line-of-sight absorption is very strong, with N_H = 5 x 10^23 cm^-2. (3) The absorber does not cover the source completely; the covering fraction is approx. 90%. This is consistent with the large optical polarization observed in this source, implying multiple lines of sight. The most surprising result of this observation is that (4) the spectrum of this BALQSO is not exactly similar to other radio-quiet quasars. The hard X-ray spectrum of PHL 5200 is steep, with the power-law spectral index alpha approx. 1.5. This is similar to the steepest hard X-ray slopes observed so far. At low redshifts, such steep slopes are observed in narrow-line Seyfert 1 (NLS1) galaxies, believed to be accreting at a high Eddington rate. This observation strengthens the analogy between BALQSOs and NLS1 galaxies and supports the hypothesis that BALQSOs represent an early evolutionary state of quasars. While it is well accepted that the orientation to the line of sight determines the appearance of a quasar, age seems to play a significant role as well.

  15. From Lithium-Ion to Sodium-Ion Batteries: Advantages, Challenges, and Surprises.

    Science.gov (United States)

    Nayak, Prasant Kumar; Yang, Liangtao; Brehm, Wolfgang; Adelhelm, Philipp

    2018-01-02

    Mobile and stationary energy storage by rechargeable batteries is a topic of broad societal and economic relevance. Lithium-ion battery (LIB) technology is at the forefront of the development, but a massively growing market will likely put severe pressure on resources and supply chains. Recently, sodium-ion batteries (SIBs) have been reconsidered with the aim of providing a lower-cost alternative that is less susceptible to resource and supply risks. On paper, the replacement of lithium by sodium in a battery seems straightforward at first, but unpredictable surprises are often found in practice. What happens when replacing lithium by sodium in electrode reactions? This review provides a state-of-the-art overview on the redox behavior of materials when used as electrodes in lithium-ion and sodium-ion batteries, respectively. Advantages and challenges related to the use of sodium instead of lithium are discussed. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Effect of Temperature Shock and Inventory Surprises on Natural Gas and Heating Oil Futures Returns

    Science.gov (United States)

    Hu, John Wei-Shan; Lin, Chien-Yu

    2014-01-01

    The aim of this paper is to examine the impact of temperature shock on both near-month and far-month natural gas and heating oil futures returns by extending the weather and storage models of the previous study. Several notable findings from the empirical studies are presented. First, the expected temperature shock significantly and positively affects both the near-month and far-month natural gas and heating oil futures returns. Next, temperature shock has a significant effect on both the conditional mean and volatility of natural gas and heating oil prices. The results indicate that expected inventory surprises significantly and negatively affect the far-month natural gas futures returns. Moreover, volatility of natural gas futures returns is higher on Thursdays and that of near-month heating oil futures returns is higher on Wednesdays than on other days. Finally, it is found that the storage announcement for natural gas significantly affects near-month and far-month natural gas futures returns. Furthermore, both natural gas and heating oil futures returns are affected more by the weighted average temperature reported by multiple weather reporting stations than by that reported by a single weather reporting station. PMID:25133233
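    The conditional-mean part of the specification described above can be sketched as a plain regression of futures returns on the temperature shock, the inventory surprise and a day-of-week dummy. The synthetic data, coefficient values and OLS-only treatment (no GARCH volatility equation) are simplifying assumptions for illustration, not the paper's estimates.

```python
# Hedged sketch: OLS regression of futures returns on weather/storage regressors.
import numpy as np

rng = np.random.default_rng(2)
T = 500
temp_shock   = rng.normal(size=T)        # expected temperature shock (synthetic)
inv_surprise = rng.normal(size=T)        # inventory announcement surprise (synthetic)
thursday     = (np.arange(T) % 5 == 3).astype(float)

returns = 0.002 * temp_shock - 0.001 * inv_surprise + 0.001 * thursday \
          + rng.normal(scale=0.01, size=T)            # synthetic returns

X = np.column_stack([np.ones(T), temp_shock, inv_surprise, thursday])
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
print(dict(zip(["const", "temp_shock", "inv_surprise", "thursday"], np.round(beta, 4))))
```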

  17. Surprises from the resolution of operator mixing in N=4 SYM

    International Nuclear Information System (INIS)

    Bianchi, Massimo; Rossi, Giancarlo; Stanev, Yassen S.

    2004-01-01

    We reexamine the problem of operator mixing in N=4 SYM. Particular attention is paid to the correct definition of composite gauge invariant local operators, which is necessary for the computation of their anomalous dimensions beyond lowest order. As an application we reconsider the case of operators with naive dimension Δ₀=4, already studied in the literature. Stringent constraints from the resummation of logarithms in power behaviours are exploited and the role of the generalized N=4 Konishi anomaly in the mixing with operators involving fermions is discussed. A general method for the explicit (numerical) resolution of the operator mixing and the computation of anomalous dimensions is proposed. We then resolve the order g² mixing for the 15 (purely scalar) singlet operators of naive dimension Δ₀=6. Rather surprisingly we find one isolated operator which has a vanishing anomalous dimension up to order g⁴, belonging to an apparently long multiplet. We also solve the order g² mixing for the 26 operators belonging to the representation 20' of SU(4). We find an operator with the same one-loop anomalous dimension as the Konishi multiplet

  18. The Ultraviolet Surprise. Efficient Soft X-Ray High Harmonic Generation in Multiply-Ionized Plasmas

    International Nuclear Information System (INIS)

    Popmintchev, Dimitar; Hernandez-Garcia, Carlos; Dollar, Franklin; Mancuso, Christopher; Perez-Hernandez, Jose A.; Chen, Ming-Chang; Hankla, Amelia; Gao, Xiaohui; Shim, Bonggu; Gaeta, Alexander L.; Tarazkar, Maryam; Romanov, Dmitri A.; Levis, Robert J.; Gaffney, Jim A.; Foord, Mark; Libby, Stephen B.; Jaron-Becker, Agnieszka; Becker, Andreas; Plaja, Luis; Murnane, Margaret M.; Kapteyn, Henry C.; Popmintchev, Tenio

    2015-01-01

    High-harmonic generation is a universal response of matter to strong femtosecond laser fields, coherently upconverting light to much shorter wavelengths. Optimizing the conversion of laser light into soft x-rays typically demands a trade-off between two competing factors. Reduced quantum diffusion of the radiating electron wave function results in emission from each species which is highest when a short-wavelength ultraviolet driving laser is used. But, phase matching - the constructive addition of x-ray waves from a large number of atoms - favors longer-wavelength mid-infrared lasers. We identified a regime of high-harmonic generation driven by 40-cycle ultraviolet lasers in waveguides that can generate bright beams in the soft x-ray region of the spectrum, up to photon energies of 280 electron volts. Surprisingly, the high ultraviolet refractive indices of both neutral atoms and ions enabled effective phase matching, even in a multiply ionized plasma. We observed harmonics with very narrow linewidths, while calculations show that the x-rays emerge as nearly time-bandwidth-limited pulse trains of ~100 attoseconds

  19. Ensuring a successful family business management succession

    OpenAIRE

    Desbois, Joris

    2016-01-01

    Succession is the biggest long-term challenge that most family businesses face. Indeed, leaders' disposition to plan for their succession is frequently the key factor defining whether their family business survives or ceases. The research seeks to find out how to manage the family business management succession successfully, based on its main principles. This work project aims at researching the key points relevant to almost all family firms, to have a viable succession transition and positioni...

  20. A conceptual geochemical model of the geothermal system at Surprise Valley, CA

    Science.gov (United States)

    Fowler, Andrew P. G.; Ferguson, Colin; Cantwell, Carolyn A.; Zierenberg, Robert A.; McClain, James; Spycher, Nicolas; Dobson, Patrick

    2018-03-01

    Characterizing the geothermal system at Surprise Valley (SV), northeastern California, is important for determining the sustainability of the energy resource, and mitigating hazards associated with hydrothermal eruptions that last occurred in 1951. Previous geochemical studies of the area attempted to reconcile different hot spring compositions on the western and eastern sides of the valley using scenarios of dilution, equilibration at low temperatures, surface evaporation, and differences in rock type along flow paths. These models were primarily supported using classical geothermometry methods, and generally assumed that fluids in the Lake City mud volcano area on the western side of the valley best reflect the composition of a deep geothermal fluid. In this contribution, we address controls on hot spring compositions using a different suite of geochemical tools, including optimized multicomponent geochemistry (GeoT) models, hot spring fluid major and trace element measurements, mineralogical observations, and stable isotope measurements of hot spring fluids and precipitated carbonates. We synthesize the results into a conceptual geochemical model of the Surprise Valley geothermal system, and show that high-temperature (quartz, Na/K, Na/K/Ca) classical geothermometers fail to predict maximum subsurface temperatures because fluids re-equilibrated at progressively lower temperatures during outflow, including in the Lake City area. We propose a model where hot spring fluids originate as a mixture between a deep thermal brine and modern meteoric fluids, with a seasonally variable mixing ratio. The deep brine has deuterium values at least 3 to 4‰ lighter than any known groundwater or high-elevation snow previously measured in and adjacent to SV, suggesting it was recharged during the Pleistocene when meteoric fluids had lower deuterium values. The deuterium values and compositional characteristics of the deep brine have only been identified in thermal springs and
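    For context on the classical geothermometers being critiqued, the sketch below evaluates two commonly cited solute-geothermometer forms (a quartz, no-steam-loss calibration and a Na/K calibration of the Fournier type). Published coefficients vary between calibrations, and the input concentrations are hypothetical rather than measured Surprise Valley analyses; both formulas assume the fluid last equilibrated at depth and did not re-equilibrate during outflow, which is exactly the assumption the multicomponent (GeoT) approach relaxes.

```python
# Hedged sketch of classical solute geothermometers (coefficients are commonly cited
# forms; check the original calibrations before quantitative use).
import math

def quartz_no_steam_loss(sio2_mg_per_kg):
    # apparent reservoir temperature (deg C) from dissolved silica
    return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

def na_k(na_mg_per_kg, k_mg_per_kg):
    # apparent reservoir temperature (deg C) from the Na/K ratio
    return 1217.0 / (math.log10(na_mg_per_kg / k_mg_per_kg) + 1.483) - 273.15

# hypothetical hot-spring analysis (mg/kg), not a measured sample
print(round(quartz_no_steam_loss(150.0)), round(na_k(400.0, 20.0)))
```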

  1. Explanatory models of health and disease: surprises from within the former Soviet Union

    Directory of Open Access Journals (Sweden)

    Tatiana I Andreeva

    2013-06-01

    Full Text Available Extract The review of anthropological theories as applied to public health by Jennifer J. Carroll (Carroll, 2013) published in this issue of TCPHEE made me recollect my first and most surprising discoveries of how differently the same things can be understood in different parts of the world. Probably less unexpectedly, these impressions concern substance abuse and addiction behaviors, similarly to many examples deployed by Jennifer J. Carroll. The first of these events happened soon after the break-up of the Soviet Union, when some of the most active people from the West rushed to discover what was going on behind the opening iron curtain. A director of an addiction clinic, who had just come into contact with a Dutch counterpart, invited me to join the collaboration and the innovation process he planned to launch. Being a participant of the exchange program started within this collaboration, I had an opportunity to discover how addictive behaviors were understood and explained in books (English, 1961; Kooyman, 1992; Viorst, 1986) recommended by the colleagues in the Netherlands and, as I could observe with my own eyes, addressed in everyday practice. This was a jaw-dropping contrast to what I learnt at the Soviet medical university and some post-graduate courses, where all the diseases related to alcohol, tobacco, or drug abuse were considered predominantly a result of the substance intake. In the Soviet discourse, the intake itself was understood as 'willful and deliberate' or immoral behavior which, in some cases, was to be rectified in prison-like treatment facilities. In the West, quite oppositely, substance abuse was seen rather as a consequence of a constellation of life-course adversities thoroughly considered by developmental psychology. This approach was obviously deeply ingrained in how practitioners diagnosed and treated their patients.

  2. Metaproteomics of cellulose methanisation under thermophilic conditions reveals a surprisingly high proteolytic activity.

    Science.gov (United States)

    Lü, Fan; Bize, Ariane; Guillot, Alain; Monnet, Véronique; Madigou, Céline; Chapleur, Olivier; Mazéas, Laurent; He, Pinjing; Bouchez, Théodore

    2014-01-01

    Cellulose is the most abundant biopolymer on Earth. Optimising energy recovery from this renewable but recalcitrant material is a key issue. The metaproteome expressed by thermophilic communities during cellulose anaerobic digestion was investigated in microcosms. By multiplying the analytical replicates (65 protein fractions analysed by MS/MS) and relying solely on public protein databases, more than 500 non-redundant protein functions were identified. The taxonomic community structure as inferred from the metaproteomic data set was in good overall agreement with 16S rRNA gene tag pyrosequencing and fluorescent in situ hybridisation analyses. Numerous functions related to cellulose and hemicellulose hydrolysis and fermentation catalysed by bacteria related to Caldicellulosiruptor spp. and Clostridium thermocellum were retrieved, indicating their key role in the cellulose-degradation process and also suggesting their complementary action. Despite the abundance of acetate as a major fermentation product, key methanogenesis enzymes from the acetoclastic pathway were not detected. In contrast, enzymes from the hydrogenotrophic pathway affiliated to Methanothermobacter were almost exclusively identified for methanogenesis, suggesting a syntrophic acetate oxidation process coupled to hydrogenotrophic methanogenesis. Isotopic analyses confirmed the high dominance of the hydrogenotrophic methanogenesis. Very surprising was the identification of an abundant proteolytic activity from Coprothermobacter proteolyticus strains, probably acting as scavenger and/or predator performing proteolysis and fermentation. Metaproteomics thus appeared as an efficient tool to unravel and characterise metabolic networks as well as ecological interactions during methanisation bioprocesses. More generally, metaproteomics provides direct functional insights at a limited cost, and its attractiveness should increase in the future as sequence databases are growing exponentially.

  3. [Fall from height--surprising autopsy diagnosis in primarily unclear initial situations].

    Science.gov (United States)

    Schyma, Christian; Doberentz, Elke; Madea, Burkhard

    2012-01-01

    External post-mortem examination and first police assessments are often not consistent with subsequent autopsy results. This is all the more surprising the more serious the injuries found at autopsy are. Such discrepancies result especially from an absence of gross external injuries, as demonstrated by four examples. A 42-year-old, externally uninjured male was found at night in a helpless condition in the street and died in spite of resuscitation. Autopsy showed severe polytrauma with traumatic brain injury and lesions of the thoracic and abdominal organs. A jump from the third floor was identified as the cause. At dawn, a twenty-year-old male was found dead on the grounds of the adjacent house. Because of the blood-covered head, the police assumed a traumatic head injury caused by a blow. The external examination revealed only abrasions on the forehead and to a minor extent on the back. At autopsy a midfacial fracture, a trauma of the thorax and abdomen and fractures of the spine and pelvis were detected. Subsequent investigations showed that the man, intoxicated by alcohol, had fallen from the flat roof of a multistoried house. A 77-year-old man was found unconscious on his terrace in the daytime; a cerebral seizure was assumed. He was transferred to emergency care where he died. The corpse was externally inconspicuous. Autopsy revealed serious traumatic injuries of the brain, thorax, abdomen and pelvis, which could be explained by a fall from the balcony. A 47-year-old homeless person without any external injuries was found dead in a barn. An alcohol intoxication was assumed. At autopsy severe injuries of the brain and cervical spine were found, which were the result of a fall from a height of 5 m. On the basis of an external post-mortem examination alone, gross blunt force trauma cannot be reliably excluded.

  4. Virtual Volatility, an Elementary New Concept with Surprising Stock Market Consequences

    Science.gov (United States)

    Prange, Richard; Silva, A. Christian

    2006-03-01

    Textbook investors start by predicting the future price distribution, PDF, of a candidate stock (or portfolio) at horizon T, e.g. a year hence. A (log)normal PDF with center (= drift = expected return) μT and width (= volatility) σT is often assumed on Central Limit Theorem grounds, i.e. by a random walk of daily (log)price increments δs. The standard deviation, stdev, of historical (ex post) δs's is usually a fair predictor of the coming year's (ex ante) stdev(δs) = σ_daily, but the historical mean E(δs) at best roughly limits the true, to be predicted, drift by μ_true·T ≈ μ_hist·T ± σ_hist·T. Textbooks take a PDF with σ ≈ σ_daily and μ as somehow known, as if accurate predictions of μ were possible. It is elementary and presumably new to argue that an average of PDFs over a range of μ values should be taken, e.g. an average over forecasts by different analysts. We estimate that this leads to a PDF with a 'virtual' volatility σ ≈ 1.3 σ_daily. It is indeed clear that uncertainty in the value of the expected gain parameter increases the risk of investment in that security by most measures, e.g. Sharpe's ratio μT/σT will be 30% smaller because of this effect. It is significant and surprising that there are investments which benefit from this 30% virtual increase in the volatility
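    The widening effect claimed above follows from mixing normal distributions with different drifts: the variance of the mixture equals the mean variance plus the variance of the drifts. The sketch below reproduces a roughly 1.3x "virtual" volatility numerically under the assumption that drift uncertainty is about 0.85 of the volatility; the specific numbers are illustrative, not the authors' calibration.

```python
# Hedged numerical illustration: drift uncertainty widens the horizon-T return PDF.
import numpy as np

rng = np.random.default_rng(3)
sigma = 0.20                                          # annual volatility from daily increments
mu_forecasts = rng.normal(0.08, 0.17, size=10_000)    # uncertain drift across "analysts" (assumed)

# horizon-T (1 year) log-returns: draw a drift, then a normal return around it
returns = rng.normal(mu_forecasts, sigma)

virtual_sigma = returns.std()
print(round(virtual_sigma / sigma, 2))   # ~1.3 when drift dispersion ~ 0.85 * sigma
```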

  5. Technological monitoring radar: a weak signals interpretation tool for the identification of strategic surprises

    Directory of Open Access Journals (Sweden)

    Adalton Ozaki

    2011-07-01

    Full Text Available In the current competitive scenario, marked by rapid and constant changes, it is vital that companies actively monitor the business environment in search of signs which might anticipate changes. This study proposes and discusses a tool called the Technological Monitoring Radar, which endeavours to address the following query: "How can a company systematically monitor the environment and capture signs that anticipate opportunities and threats concerning a particular technology?". The literature review covers Competitive Intelligence, Technological Intelligence, Environmental Analysis and Anticipative Monitoring. Based on the critical analysis of the literature, a tool called the Technological Monitoring Radar is proposed, comprising five environments to be monitored (political, economic, technological, social and competition), each with key topics for analysis. To exemplify the use of the tool, it is applied to the smartphone segment in an exclusively reflexive manner, and without the participation of a specific company. One of the suggestions for future research is precisely the application of the proposed methodology in an actual company. Despite the limitation of this being a theoretical study, the example demonstrated the tool's applicability. The radar proved to be very useful for a company that needs to monitor the environment in search of signs of change. This study's main contribution is to relate different fields of study (technological intelligence, environmental analysis and anticipative monitoring) and different approaches to provide a practical tool that allows a manager to identify and better visualize opportunities and threats, thus avoiding strategic surprises in the technological arena. Key words: Technological monitoring. Technological intelligence. Competitive intelligence. Weak signals.

  6. The genome of Pelobacter carbinolicus reveals surprising metabolic capabilities and physiological features

    Energy Technology Data Exchange (ETDEWEB)

    Aklujkar, Muktak [University of Massachusetts, Amherst; Haveman, Shelley [University of Massachusetts, Amherst; DiDonatoJr, Raymond [University of Massachusetts, Amherst; Chertkov, Olga [Los Alamos National Laboratory (LANL); Han, Cliff [Los Alamos National Laboratory (LANL); Land, Miriam L [ORNL; Brown, Peter [University of Massachusetts, Amherst; Lovley, Derek [University of Massachusetts, Amherst

    2012-01-01

    Background: The bacterium Pelobacter carbinolicus is able to grow by fermentation, syntrophic hydrogen/formate transfer, or electron transfer to sulfur from short-chain alcohols, hydrogen or formate; it does not oxidize acetate and is not known to ferment any sugars or grow autotrophically. The genome of P. carbinolicus was sequenced in order to understand its metabolic capabilities and physiological features in comparison with its relatives, acetate-oxidizing Geobacter species. Results: Pathways were predicted for catabolism of known substrates: 2,3-butanediol, acetoin, glycerol, 1,2-ethanediol, ethanolamine, choline and ethanol. Multiple isozymes of 2,3-butanediol dehydrogenase, ATP synthase and [FeFe]-hydrogenase were differentiated and assigned roles according to their structural properties and genomic contexts. The absence of asparagine synthetase and the presence of a mutant tRNA for asparagine encoded among RNA-active enzymes suggest that P. carbinolicus may make asparaginyl-tRNA in a novel way. Catabolic glutamate dehydrogenases were discovered, implying that the tricarboxylic acid (TCA) cycle can function catabolically. A phosphotransferase system for uptake of sugars was discovered, along with enzymes that function in 2,3-butanediol production. Pyruvate: ferredoxin/flavodoxin oxidoreductase was identified as a potential bottleneck in both the supply of oxaloacetate for oxidation of acetate by the TCA cycle and the connection of glycolysis to production of ethanol. The P. carbinolicus genome was found to encode autotransporters and various appendages, including three proteins with similarity to the geopilin of electroconductive nanowires. Conclusions: Several surprising metabolic capabilities and physiological features were predicted from the genome of P. carbinolicus, suggesting that it is more versatile than anticipated.

  7. A surprisingly simple correlation between the classical and quantum structural networks in liquid water

    Science.gov (United States)

    Hamm, Peter; Fanourgakis, George S.; Xantheas, Sotiris S.

    2017-08-01

    Nuclear quantum effects in liquid water have profound implications for several of its macroscopic properties related to the structure, dynamics, spectroscopy, and transport. Although several of water's macroscopic properties can be reproduced by classical descriptions of the nuclei using interaction potentials effectively parameterized for a narrow range of its phase diagram, a proper account of the nuclear quantum effects is required to ensure that the underlying molecular interactions are transferable across a wide temperature range covering different regions of that diagram. When performing an analysis of the hydrogen-bonded structural networks in liquid water resulting from the classical (class) and quantum (qm) descriptions of the nuclei with two interaction potentials that are at the two opposite ends of the range in describing quantum effects, namely the flexible, pair-wise additive q-TIP4P/F, and the flexible, polarizable TTM3-F, we found that the (class) and (qm) results can be superimposed over the temperature range T = 250-350 K using a surprisingly simple, linear scaling of the two temperatures according to T(qm) = α T(class) + ΔT, where α = 0.99 and ΔT = -6 K for q-TIP4P/F and α = 1.24 and ΔT = -64 K for TTM3-F. This simple relationship suggests that the structural networks resulting from the quantum and classical treatment of the nuclei with those two very different interaction potentials are essentially similar to each other over this extended temperature range once a model-dependent linear temperature scaling law is applied.
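    The quoted scaling law is simple enough to apply directly; the sketch below maps a classical simulation temperature onto the quantum-simulation temperature it best matches, T_qm = alpha * T_class + Delta_T, using the two parameter sets reported above.

```python
# Apply the reported linear temperature-scaling law for the two water models.
def t_quantum(t_classical_K, model):
    alpha, dT = {"q-TIP4P/F": (0.99, -6.0), "TTM3-F": (1.24, -64.0)}[model]
    return alpha * t_classical_K + dT

for T in (250.0, 300.0, 350.0):
    print(T, round(t_quantum(T, "q-TIP4P/F"), 1), round(t_quantum(T, "TTM3-F"), 1))
```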

  8. A post-genomic surprise. The molecular reinscription of race in science, law and medicine.

    Science.gov (United States)

    Duster, Troy

    2015-03-01

    The completion of the first draft of the Human Genome Map in 2000 was widely heralded as the promise and future of genetics-based medicines and therapies - so much so that pundits began referring to the new century as 'The Century of Genetics'. Moreover, definitive assertions about the overwhelming similarities of all humans' DNA (99.9 per cent) by the leaders of the Human Genome Project were trumpeted as the end of racial thinking about racial taxonomies of human genetic differences. But the first decade of the new century brought unwelcome surprises. First, gene therapies turned out to be far more complicated than anyone had anticipated - and instead the pharmaceutical industry turned to a focus on drugs that might be 'related' to population differences based upon genetic markers. While the language of 'personalized medicine' dominated this frame, research on racially and ethnically designated populations' differential responsiveness to drugs dominated the empirical work in the field. Ancestry testing and 'admixture research' would play an important role in a new kind of molecular reification of racial categories. Moreover, the capacity of the super-computer to map differences reverberated into personal identification that would affect both the criminal justice system and forensic science, and generate new levels of concern about personal privacy. Social scientists in general, and sociologists in particular, have been caught short by these developments - relying mainly on assertions that racial categories are socially constructed, regionally and historically contingent, and politically arbitrary. While these assertions are true, the imprimatur of scientific legitimacy has shifted the burden, since now 'admixture research' can claim that its results get at the 'reality' of human differentiation, not the admittedly flawed social constructions of racial categories. Yet what was missing from this framing of the problem: 'admixture research' is itself based upon socially

  9. Driving Technological Surprise: DARPA’s Mission in a Changing World

    Science.gov (United States)

    2013-04-01

    fundamental ways. Our research, innovation, and entrepreneurial capacity is the envy of the world, but others are building universities, labs, and...through deep engagement with companies, universities, and DoD and other labs. Our success hinges on having a healthy U.S. R&D ecosystem . Within

  10. Conundrums, paradoxes, and surprises: a brave new world of biodiversity conservation

    Science.gov (United States)

    A.E. Lugo

    2012-01-01

    Anthropogenic activity is altering the global disturbance regime through such processes as urbanization, deforestation, and climate change. These disturbance events alter the environmental conditions under which organisms live and adapt and trigger succession, thus setting the biota in motion in both ecological and evolutionary space. The result is the mixing of...

  11. For Catholic Colleges, an Important Goal: Don't Surprise the Bishop

    Science.gov (United States)

    Supiano, Beckie

    2009-01-01

    Every college president's success depends on building good relationships with outside groups, whether donors, alumni, or legislators. Presidents of Roman Catholic colleges have one more party to please: the local bishop. In recent months, the bishop of Scranton, Pennsylvania, asked colleges in his diocese to assure him that they were not providing…

  12. Surprising Ripple Effects: How Changing the SAT Score-Sending Policy for Low-Income Students Impacts College Access and Success

    Science.gov (United States)

    Hurwitz, Michael; Mbekeani, Preeya P.; Nipson, Margaret M.; Page, Lindsay C.

    2017-01-01

    Subtle policy adjustments can induce relatively large "ripple effects." We evaluate a College Board initiative that increased the number of free SAT score reports available to low-income students and changed the time horizon for using these score reports. Using a difference-in-differences analytic strategy, we estimate that targeted…

  13. Potential success factors in brand development

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Grunert, Klaus G.; Poulsen, Carsten Stig

    2005-01-01

    to the marketing of the brand." The branding literature mentions many important aspects, factors, issues, brand requirements, steps, building blocks or guidelines for building strong brands. However, these are all quite general and abstract. Given the substantial body of literature on branding, surprisingly few......? This is the question we want to answer. More specifically, we want to identify potential success factors in building strong brands, understood as brands with high consumer-based brand equity. Keller (1993, p. 2) defined customer-based brand equity as "the differential effect of brand knowledge on consumer response...... of this paper is to identify potential success factors in developing strong brands and to test whether these factors can be used to discriminate between strong and weak brands. It does so through a review of the literature for potential success factors. Furthermore, to ensure that important factors have...

  14. Factors favorable to public participation success

    International Nuclear Information System (INIS)

    Peelle, E.; Schweitzer, M.; Munro, J.; Carnes, S.; Wolfe, A.

    1996-01-01

    Categories of factors linked to successful public participation (PP) program outcomes include PP process, organizational context, sociopolitical context, strategic considerations and unique (special circumstances) factors. We re-order the long list of factors according to how essential, important, and unique they are and discuss their significance and interrelationships. It is argued that bureaucratic structure and operational modes are basically in conflict with features of successful PP programs (openness, two-way education, communication with nonexpert outsiders). If this is so, then it is not surprising that the factors essential for PP success in bureaucracies involve extraordinary management efforts by agencies to bypass, compensate for, or overcome structural constraints. We conclude by speculating about the long-term viability of PP practices in the agency setting as well as the consequences for agencies that attempt the problematic task of introducing PP into their complex, mission-oriented organizations

  15. Factors favorable to public participation success

    Energy Technology Data Exchange (ETDEWEB)

    Peelle, E.; Schweitzer, M.; Munro, J.; Carnes, S.; Wolfe, A.

    1996-05-01

    Categories of factors linked to successful public participation (PP) program outcomes include PP process, organizational context, sociopolitical context, strategic considerations and unique (special circumstances) factors. We re-order the long list of factors according to how essential, important, and unique they are and discuss their significance and interrelationships. It is argued that bureaucratic structure and operational modes are basically in conflict with features of successful PP programs (openness, two-way education, communication with nonexpert outsiders). If this is so, then it is not surprising that the factors essential for PP success in bureaucracies involve extraordinary management efforts by agencies to bypass, compensate for, or overcome structural constraints. We conclude by speculating about the long-term viability of PP practices in the agency setting as well as the consequences for agencies that attempt the problematic task of introducing PP into their complex, mission-oriented organizations.

  16. Modern Sedimentation along the SE Bangladesh Coast Reveal Surprisingly Low Accumulation Rates

    Science.gov (United States)

    McHugh, C.; Mustaque, S.; Mondal, D. R.; Akhter, S. H.; Iqbal, M.

    2016-12-01

    Recent sediments recovered along the SE coast of Bangladesh, from Teknaf to Cox's Bazar, together with drainage basin analyses, reveal the sediment sources and very low sedimentation rates of 1 mm/year. These rates are surprisingly low given that this coast is adjacent to the Ganges-Brahmaputra Delta, which has a yearly sediment discharge of 1 GT. The Teknaf anticline (elevation 200 m), part of the western Burma fold-thrust belt, dominates the topography extending across and along the Teknaf peninsula. It is thought to have begun evolving since the Miocene (Alam et al. 2003 & Allen et al. 2008). Presently the anticline foothills on the west are flanked by uplifted terraces, the youngest linked to coseismic displacement during the 1762 earthquake (Mondal et al. 2015), and a narrow beach 60-200 m in width. Petrography, semi-quantitative bulk mineralogy and SEM/EDX analyses were conducted on sediments recovered along the west coast from 1-4 m deep trenches and three 4-8 m deep drill holes. GIS mapping of drainage basins and quartz-feldspar-lithic (QFL) ternary plots based on grain counting show mixing of sediments from multiple sources: Himalayan provenance of metamorphic and igneous origin (garnet-mostly almandine, tourmaline, rutile, kyanite, zircon, sillimanite and clinopyroxene) similar to Uddin et al. (2007); Brahmaputra provenance of igneous and metamorphic origin (amphibole, epidote, plagioclase 40% Na and 60% Ca, apatite, ilmenite, magnetite, Cr-spinel and garnet-mostly grossular) as indicated by Garzanti et al. (2010) & Rahman et al. (2016); and Burmese sources (cassiterite and wolframite) (Zaw 1990 & Searle et al. 2007). Low sedimentation rates are the result of two main factors: 1. Strong longshore currents from the south-east that interact with high tidal ranges, as evidenced by the morphology of sand waves and ridge and runnel landforms along the beach. 2. Streams draining the Teknaf anticline are dry during the winter, and during summer monsoon rains the sediments bypass the narrow

  17. Farmers Insures Success

    Science.gov (United States)

    Freifeld, Lorri

    2012-01-01

    Farmers Insurance claims the No. 2 spot on the Training Top 125 with a forward-thinking training strategy linked to its primary mission: FarmersFuture 2020. It's not surprising an insurance company would have an insurance policy for the future. But Farmers takes that strategy one step further, setting its sights on 2020 with a far-reaching plan to…

  18. The Most Distant Mature Galaxy Cluster - Young, but surprisingly grown-up

    Science.gov (United States)

    2011-03-01

    Astronomers have used an armada of telescopes on the ground and in space, including the Very Large Telescope at ESO's Paranal Observatory in Chile to discover and measure the distance to the most remote mature cluster of galaxies yet found. Although this cluster is seen when the Universe was less than one quarter of its current age it looks surprisingly similar to galaxy clusters in the current Universe. "We have measured the distance to the most distant mature cluster of galaxies ever found", says the lead author of the study in which the observations from ESO's VLT have been used, Raphael Gobat (CEA, Paris). "The surprising thing is that when we look closely at this galaxy cluster it doesn't look young - many of the galaxies have settled down and don't resemble the usual star-forming galaxies seen in the early Universe." Clusters of galaxies are the largest structures in the Universe that are held together by gravity. Astronomers expect these clusters to grow through time and hence that massive clusters would be rare in the early Universe. Although even more distant clusters have been seen, they appear to be young clusters in the process of formation and are not settled mature systems. The international team of astronomers used the powerful VIMOS and FORS2 instruments on ESO's Very Large Telescope (VLT) to measure the distances to some of the blobs in a curious patch of very faint red objects first observed with the Spitzer space telescope. This grouping, named CL J1449+0856 [1], had all the hallmarks of being a very remote cluster of galaxies [2]. The results showed that we are indeed seeing a galaxy cluster as it was when the Universe was about three billion years old - less than one quarter of its current age [3]. Once the team knew the distance to this very rare object they looked carefully at the component galaxies using both the NASA/ESA Hubble Space Telescope and ground-based telescopes, including the VLT. They found evidence suggesting that most of the

  19. Surprising synthesis of nanodiamond from single-walled carbon nanotubes by the spark plasma sintering process

    Science.gov (United States)

    Mirzaei, Ali; Ham, Heon; Na, Han Gil; Kwon, Yong Jung; Kang, Sung Yong; Choi, Myung Sik; Bang, Jae Hoon; Park, No-Hyung; Kang, Inpil; Kim, Hyoun Woo

    2016-10-01

    Nanodiamond (ND) was successfully synthesized using single-walled carbon nanotubes (SWCNTs) as a pure solid carbon source by means of a spark plasma sintering (SPS) process. Raman spectra and X-ray diffraction patterns revealed the generation of the cubic diamond phase by means of the SPS process. Lattice-resolved TEM images confirmed that diamond nanoparticles with a diameter of about 10 nm existed in the products. The NDs were generated mainly through the gas-phase nucleation of carbon atoms evaporated from the SWCNTs.

  20. Diagnostic reasoning strategies and diagnostic success.

    Science.gov (United States)

    Coderre, S; Mandin, H; Harasym, P H; Fick, G H

    2003-08-01

    Cognitive psychology research supports the notion that experts use mental frameworks or "schemes", both to organize knowledge in memory and to solve clinical problems. The central purpose of this study was to determine the relationship between problem-solving strategies and the likelihood of diagnostic success. Think-aloud protocols were collected to determine the diagnostic reasoning used by experts and non-experts when attempting to diagnose clinical presentations in gastroenterology. Using logistic regression analysis, the study found that there is a relationship between diagnostic reasoning strategy and the likelihood of diagnostic success. Compared to hypothetico-deductive reasoning, the odds of diagnostic success were significantly greater when subjects used the diagnostic strategies of pattern recognition and scheme-inductive reasoning. Two other factors emerged as independent determinants of diagnostic success: expertise and clinical presentation. Not surprisingly, experts outperformed novices, while the content area of the clinical cases in each of the four clinical presentations demonstrated varying degrees of difficulty and thus diagnostic success. These findings have significant implications for medical educators. They support the introduction of "schemes" as a means of enhancing memory organization and improving diagnostic success.
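
    The logistic regression described above models the odds of a correct diagnosis as a function of reasoning strategy, with hypothetico-deductive reasoning as the reference category and expertise as a covariate. A minimal sketch of such a model on simulated data (the variable names, codings, and effect sizes are illustrative assumptions, not the authors' data or code):

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        # Simulate think-aloud outcomes: strategy used, expertise, and whether
        # the diagnosis was correct. Success probabilities are invented.
        rng = np.random.default_rng(0)
        n = 120
        df = pd.DataFrame({
            "strategy": rng.choice(
                ["pattern", "scheme", "hypothetico_deductive"], size=n),
            "expert": rng.integers(0, 2, size=n),
        })
        base_p = {"pattern": 0.80, "scheme": 0.75, "hypothetico_deductive": 0.45}
        p = df["strategy"].map(base_p) + 0.10 * df["expert"]
        df["success"] = rng.binomial(1, p.clip(0, 1))

        # Dummy-code strategy with hypothetico-deductive reasoning as reference.
        X = pd.get_dummies(df[["strategy", "expert"]], columns=["strategy"])
        X = X.drop(columns="strategy_hypothetico_deductive")
        clf = LogisticRegression().fit(X, df["success"])

        # Exponentiated coefficients approximate odds ratios versus the reference.
        for name, beta in zip(X.columns, clf.coef_[0]):
            print(f"{name}: odds ratio ~ {np.exp(beta):.2f}")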

  1. Cloud Surprises Discovered in Moving NASA EOSDIS Applications into Amazon Web Services… and #6 Will Shock You!

    Science.gov (United States)

    McLaughlin, B. D.; Pawloski, A. W.

    2017-12-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected was a number of issues that went beyond purely technical application re-architecture. From surprising network policy limitations, billing challenges in a government-based cost model, and obtaining certificates in a NASA security-compliant manner, to working with multiple applications in a shared and resource-constrained AWS account, these have been the relevant challenges in taking advantage of a cloud model. And most surprising of all… well, you'll just have to wait and see the "gotcha" that caught our entire team off guard!

  2. The influence of psychological resilience on the relation between automatic stimulus evaluation and attentional breadth for surprised faces.

    Science.gov (United States)

    Grol, Maud; De Raedt, Rudi

    2015-01-01

    The broaden-and-build theory relates positive emotions to resilience and cognitive broadening. The theory proposes that the broadening effects underlie the relation between positive emotions and resilience, suggesting that resilient people can benefit more from positive emotions at the level of cognitive functioning. Research has investigated the influence of positive emotions on attentional broadening, but the stimulus that is the target of attention may also influence attentional breadth, depending on affective stimulus evaluation. Surprised faces are particularly interesting as they are valence ambiguous; therefore, we investigated the relation between affective evaluation--using an affective priming task--and attentional breadth for surprised faces, and how this relation is influenced by resilience. Results show that more positive evaluations are related to more attentional broadening at high levels of resilience, while this relation is reversed at low levels. This indicates that resilient individuals can benefit more from attending to positively evaluated stimuli at the level of attentional broadening.

  3. Educational Attainment: Success to the Successful

    Science.gov (United States)

    Anthony, Peter; Gould, David; Smith, Gina

    2013-01-01

    Systems archetypes are patterns of structure found in systems that are helpful in understanding some of the dynamics within them. The intent of this study was to examine educational attainment data using the success-to-the-successful archetype as a model to see if it helps to explain the inequality observed in the data. Data covering 1990 to 2009…

  4. College Success Courses: Success for All

    Science.gov (United States)

    Coleman, Sandra Lee; Skidmore, Susan Troncoso; Weller, Carol Thornton

    2018-01-01

    College success courses (CSCs), or orientation courses, are offered by community colleges and universities to facilitate the success of first-time-in-college students. Primarily, these courses are designed to address students' nonacademic deficiencies, such as weak study habits and poor organizational skills, and to familiarize students with…

  5. Team play with a powerful and independent agent: operational experiences and automation surprises on the Airbus A-320

    Science.gov (United States)

    Sarter, N. B.; Woods, D. D.

    1997-01-01

    Research and operational experience have shown that one of the major problems with pilot-automation interaction is a lack of mode awareness (i.e., awareness of the current and future status and behavior of the automation). As a result, pilots sometimes experience so-called automation surprises when the automation takes an unexpected action or fails to behave as anticipated. A lack of mode awareness and automation surprises can be viewed as symptoms of a mismatch between human and machine properties and capabilities. Changes in automation design can therefore be expected to affect the likelihood and nature of problems encountered by pilots. Previous studies have focused exclusively on early generation "glass cockpit" aircraft that were designed based on a similar automation philosophy. To find out whether similar difficulties with maintaining mode awareness are encountered on more advanced aircraft, a corpus of automation surprises was gathered from pilots of the Airbus A-320, an aircraft characterized by high levels of autonomy, authority, and complexity. To understand the underlying reasons for reported breakdowns in human-automation coordination, we also asked pilots about their monitoring strategies and their experiences with and attitude toward the unique design of flight controls on this aircraft.

  6. Attitudes of Success.

    Science.gov (United States)

    Pendarvis, Faye

    This document investigates the attitudes of successful individuals, citing the achievement of established goals as the criteria for success. After offering various definitions of success, the paper focuses on the importance of self-esteem to success and considers ways by which the self-esteem of students can be improved. Theories of human behavior…

  7. Biodiesel from Mandarin Seed Oil: A Surprising Source of Alternative Fuel

    Directory of Open Access Journals (Sweden)

    A. K. Azad

    2017-10-01

    Full Text Available Mandarin (Citrus reticulata) is one of the most popular fruits in tropical and sub-tropical countries around the world. It contains about 22–34 seeds per fruit. This study investigated the potential of non-edible mandarin seed oil as an alternative fuel in Australia. The seeds were prepared after drying in the oven for 20 h to attain an optimum moisture content of around 13.22%. The crude oil was extracted from the crushed seed using 98% n-hexane solution. The biodiesel conversion reaction (transesterification) was designed according to the acid value (mg KOH/g) of the crude oil. The study also critically examined the effect of various reaction parameters (such as the methanol:oil molar ratio, the % of catalyst concentration, etc.) on the biodiesel conversion yield. After successful conversion of the bio-oil into biodiesel, the physico-chemical fuel properties of the virgin biodiesel were measured according to relevant ASTM standards and compared with ultra-low sulphur diesel (ULSD) and the standard biodiesel ASTM D6751. The fatty acid methyl esters (FAMEs) were analysed by gas chromatography (GC) using the EN 14103 standard. The behaviour of the biodiesel (variation of density and kinematic viscosity) at various temperatures (10–40 °C) was obtained and compared with that of diesel fuel. Finally, mass and energy balances were conducted for both the oil extraction and biodiesel conversion processes to analyse the total process losses of the system. The study found a 49.23 wt % oil yield from mandarin seed and a 96.82% conversion efficiency for converting oil to biodiesel using the designated transesterification reaction. The GC test identified eleven FAMEs. The biodiesel mainly contains palmitic acid (C16:0, 26.80 vol %), stearic acid (C18:0, 4.93 vol %), oleic acid (C18:1, 21.43 vol %, including cis and trans), linoleic acid (C18:2, 4.07 vol %), and less than one percent each of other fatty acids. It is an important source of energy because it has a higher
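
    Chaining the two headline figures above (a 49.23 wt % oil yield from seed and a 96.82% oil-to-biodiesel conversion) gives an overall seed-to-biodiesel mass yield of roughly 48%. A back-of-the-envelope check in Python, assuming a 100 kg seed basis and treating the conversion efficiency as a simple mass fraction:

        # Rough mass balance for the mandarin-seed biodiesel figures quoted above.
        # The 100 kg dry-seed basis is an assumption for illustration; the
        # conversion efficiency is treated here as a simple mass fraction.
        seed_mass_kg = 100.0
        oil_yield_wt = 0.4923   # crude oil recovered per unit mass of seed
        conversion = 0.9682     # fraction of oil converted to biodiesel (FAME)

        oil_kg = seed_mass_kg * oil_yield_wt
        biodiesel_kg = oil_kg * conversion

        print(f"crude oil extracted: {oil_kg:.2f} kg")        # ~49.23 kg
        print(f"biodiesel produced:  {biodiesel_kg:.2f} kg")  # ~47.66 kg
        print(f"overall seed-to-biodiesel yield: {biodiesel_kg / seed_mass_kg:.1%}")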

  8. Long-term surveillance of zinc implant in murine artery: Surprisingly steady biocorrosion rate.

    Science.gov (United States)

    Drelich, Adam J; Zhao, Shan; Guillory, Roger J; Drelich, Jaroslaw W; Goldman, Jeremy

    2017-08-01

    Metallic zinc implanted into the abdominal aorta of rats out to 6 months has been demonstrated to degrade while avoiding responses commonly associated with the restenosis of vascular implants. However, major questions remain regarding whether a zinc implant would ultimately passivate through the production of stable corrosion products or via a cell mediated fibrous encapsulation process that prevents the diffusion of critical reactants and products at the metal surface. Here, we have conducted clinically relevant long term in vivo studies in order to characterize late stage zinc implant biocorrosion behavior and products to address these critical questions. We found that zinc wires implanted in the murine artery exhibit steady corrosion without local toxicity for up to at least 20 months post-implantation, despite a steady buildup of passivating corrosion products and intense fibrous encapsulation of the wire. Although fibrous encapsulation was not able to prevent continued implant corrosion, it may be related to the reduced chronic inflammation observed between 10 and 20 months post-implantation. X-ray elemental and infrared spectroscopy analyses confirmed zinc oxide, zinc carbonate, and zinc phosphate as the main components of corrosion products surrounding the Zn implant. These products coincide with stable phases concluded from Pourbaix diagrams of a physiological solution and in vitro electrochemical impedance tests. The results support earlier predictions that zinc stents could become successfully bio-integrated into the arterial environment and safely degrade within a time frame of approximately 1-2 years. Previous studies have shown zinc to be a promising candidate material for bioresorbable endovascular stenting applications. An outstanding question, however, is whether a zinc implant would ultimately passivate through the production of stable corrosion products or via a cell mediated tissue encapsulation process that prevented the diffusion of critical

  9. Collaborative Resilience to Episodic Shocks and Surprises: A Very Long-Term Case Study of Zanjera Irrigation in the Philippines 1979–2010

    Directory of Open Access Journals (Sweden)

    Ruth Yabes

    2015-07-01

    Full Text Available This thirty-year case study uses surveys, semi-structured interviews, and content analysis to examine the adaptive capacity of Zanjera San Marcelino, an indigenous irrigation management system in the northern Philippines. This common pool resource (CPR) system exists within a turbulent social-ecological system (SES) characterized by episodic shocks such as large typhoons as well as novel surprises, such as national political regime change and the construction of large dams. The Zanjera nimbly responded to these challenges, although sometimes in ways that left its structure and function substantially altered. While a partial integration with the Philippine National Irrigation Agency was critical to the Zanjera’s success, this relationship required on-going improvisation and renegotiation. Over time, the Zanjera showed an increasing capacity to learn and adapt. A core contribution of this analysis is the integration of a CPR study within an SES framework to examine resilience, made possible by the occurrence of a wide range of challenges to the Zanjera’s function and survival over the long period of study. Long-term analyses like this one, however rare, are particularly useful for understanding the adaptive and transformative dimensions of resilience.

  10. Communication Management and Trust: Their Role in Building Resilience to "Surprises" Such As Natural Disasters, Pandemic Flu, and Terrorism

    Directory of Open Access Journals (Sweden)

    P. H. Longstaff

    2008-06-01

    Full Text Available In times of public danger such as natural disasters and health emergencies, a country's communication systems will be some of its most important assets because access to information will make individuals and groups more resilient. Communication by those charged with dealing with the situation is often critical. We analyzed reports from a wide variety of crisis incidents and found a direct correlation between trust and an organization's preparedness and internal coordination of crisis communication and the effectiveness of its leadership. Thus, trust is one of the most important variables in effective communication management in times of "surprise."

  11. Success in Science, Success in Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Mariann R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-25

    This is a series of four different scientific problems which were resolved through collaborations. They are: "Better flow cytometry through novel focusing technology", "Take Off®: Helping the Agriculture Industry Improve the Viability of Sustainable, Large-Production Crops", "The National Institutes of Health's Models of Infectious Disease Agent Study (MIDAS)", and "Expanding the capabilities of SOLVE/RESOLVE through the PHENIX Consortium." For each one, the problem is listed, the solution, advantages, bottom line, then information about the collaboration including: developing the technology, initial success, and continued success.

  12. Successful Internet Entrepreneurs Don't Have to Be College Dropouts: A Model for Nurturing College Students to Become Successful Internet Entrepreneurs

    Science.gov (United States)

    Zhang, Sonya

    2014-01-01

    Some of today's most successful Internet entrepreneurs didn't graduate from college. Many young people today have followed the same path to pursue their dreams but ended up failing, which is not surprising because 80% of startups fail in the first 5 years. As technology innovation and market competition on the Internet continue to accelerate, college students…

  13. The Project of Success

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    more complicated matter than meeting targets. While success may ultimately be justified in terms of a correspondence between aims and achievements, the understanding of both aspects is highly dependent on the project process. An example of a successful project that did not meet the original performance...... targets will serve to show that success is a matter of perspective as much as it is a matter of achievement. Other types of research, e.g. social psychology, have addressed the issue of success more explicitly. I draw on such literature to conceptualize project success anew and to reestablish...

  14. Surprising finding on colonoscopy.

    Science.gov (United States)

    Griglione, Nicole; Naik, Jahnavi; Christie, Jennifer

    2010-02-01

    A 48-year-old man went to his primary care physician for his annual physical. He told his physician that for the past few years, he had intermittent, painless rectal bleeding consisting of small amounts of blood on the toilet paper after defecation. He also mentioned that he often spontaneously awoke, very early in the morning. His past medical history was unremarkable. The patient was born in Cuba but had lived in the United States for more than 30 years. He was divorced, lived alone, and had no children. He had traveled to Latin America-including Mexico, Brazil, and Cuba-off and on over the past 10 years. His last trip was approximately 2 years ago. His physical exam was unremarkable. Rectal examination revealed no masses or external hemorrhoids; stool was brown and Hemoccult negative. Labs were remarkable for eosinophilia ranging from 10% to 24% over the past several years (the white blood cell count ranged from 5200 to 5900/mcL). A subsequent colonoscopy revealed many white, thin, motile organisms dispersed throughout the colon. The organisms were most densely populated in the cecum. Of note, the patient also had nonbleeding internal hemorrhoids. An aspiration of the organisms was obtained and sent to the microbiology lab for further evaluation. What is your diagnosis? How would you manage this condition?

  15. Surprising quantum bounces

    CERN Document Server

    Nesvizhevsky, Valery

    2015-01-01

    This unique book demonstrates the undivided unity and infinite diversity of quantum mechanics using a single phenomenon: quantum bounces of ultra-cold particles. Various examples of such "quantum bounces" are: gravitational quantum states of ultra-cold neutrons (the first observed quantum states of matter in a gravitational field), the neutron whispering gallery (an observed matter-wave analog of the whispering gallery effect well known in acoustics and for electromagnetic waves), and gravitational and whispering gallery states for anti-matter atoms that remain to be observed. These quantum states are an invaluable tool in the search for additional fundamental short-range forces, for exploring the gravitational interaction and quantum effects of gravity, for probing physics beyond the standard model, and for furthering studies into the foundations of quantum mechanics, quantum optics, and surface science.

  16. More Supernova Surprises

    Science.gov (United States)

    2010-09-24

    originated in South America. Everyone appreciates the beauty of daisies, chrysanthemums, and sunflowers, and many of us enjoy eating lettuce ...few fossils. On page 1621 of this issue, Barreda et al. (1) describe an unusually well-preserved new fossil that sheds light on the history of

  17. More statistics, less surprise

    CERN Multimedia

    Antonella Del Rosso & the LHCb collaboration

    2013-01-01

    The LHCb collaboration has recently announced new results for a parameter that measures the CP violation effect in particles containing charm quarks. The new values, obtained with a larger data set and with a new independent method, show that the effect is smaller than previous measurements had suggested. The parameter is back in the Standard Model picture. The search for CP violation signals in particles containing charm quarks, such as the D0 particle, is a powerful probe of new physics. Indeed, such effects could result in unexpected values of parameters whose expectation values in the Standard Model are known. Although less precise than similar approaches used in particles made of b quarks, the investigation of the charm system has proven to be intriguing. The LHCb collaboration has reported new measurements of ΔACP, the difference in CP violation between the D0→K+K– and D0→π+π– decays. The results are ob...

  18. Business Intelligence Success Factors

    DEFF Research Database (Denmark)

    Gaardboe, Rikke; Jonasen, Tanja Svarre

    2018-01-01

    Business intelligence (BI) is a strategically important practice in many organizations. Several studies have investigated the factors that contribute to BI success; however, an overview of the critical success factors (CSFs) involved is lacking in the extant literature. We have integrated...... 34 CSFs related to BI success. The distinct CSFs identified in the extant literature relate to project management skills (13 papers), management support (20 papers), and user involvement (11 papers). In the articles with operationalized BI success, we found several distinct factors: system quality...

  19. Examining Management Success Potential.

    Science.gov (United States)

    Quatrano, Louis A.

    The derivation of a model of management success potential in hospitals or health services administration is described. A questionnaire developed to assess management success potential in health administration students was voluntarily completed by approximately 700 incoming graduate students in 35 university health services administration programs…

  20. Ingredients for successful partnerships

    NARCIS (Netherlands)

    S.M. Pfisterer (Stella)

    2011-01-01

    For the development of new cross-sector partnerships, it is necessary to know what the essence of successful partnership projects is. Which factors influence the success or failure of partnerships depends strongly on the specific context in which the partnerships operate. The literature on critical

  1. Human Resource Outsourcing Success

    OpenAIRE

    Hasliza Abdul-Halim; Elaine Ee; T. Ramayah; Noor Hazlina Ahmad

    2014-01-01

    The existing literature on partnership seems to take the relationship between partnership quality and outsourcing success for granted. Therefore, this article aims at examining the role of service quality in strengthening the relationship between partnership quality and human resource (HR) outsourcing success. The samples were obtained from 96 manufacturing organizations in Penang, Malaysia. The results showed that par...

  2. Planning for College Success

    Science.gov (United States)

    PEPNet, 2009

    2009-01-01

    "Planning for College Success" (PCS) is a curriculum model designed by Sharon Downs, M.S., for a course intended to assist deaf and hard of hearing students during their initial introduction to college life. This program allows students to work one-on-one with a counselor to plan for their college success. The program includes short-term goals and…

  3. Mergers: Success versus failure

    International Nuclear Information System (INIS)

    Carley, G. R.

    1997-01-01

    Successful mergers in the context of long-term value creation, as measured by the return realized on investor-provided capital, were discussed. In essence, a successful merger is motivated by a sound business reason and strategy, involves a reasonable price, and is soundly executed. The acquiror's pre-merger success in managing a company is a good indicator of future success. Poorly managed companies that acquire other companies generally continue to be poorly managed, with no significant increase in shareholder value. Prior to the acquisition, identification of the potential target, assessment of the people involved on both sides of the transaction, thorough knowledge of the target's potential for value creation, financial implications (debt, equity, terms and demand, tax implications, the potential effect of the proposed acquisition on the acquiror's business plan) and, finally, the execution of the process itself are the important determinants of successful mergers.

  4. Academic Innovation in the Commercial Domain: Case Studies of Successful Transfers of University-Developed Technologies.

    Science.gov (United States)

    Powers, Joshua B.

    In recent years, considerable attention has been directed toward higher education's role as a driver of economic reform. Yet, surprisingly little is known about the processes and mechanisms by which academic innovations are successfully commercialized. The specific question is, what factors explain why some licensed innovations become bona fide…

  5. Successful removable partial dentures.

    Science.gov (United States)

    Lynch, Christopher D

    2012-03-01

    Removable partial dentures (RPDs) remain a mainstay of prosthodontic care for partially dentate patients. Appropriately designed, they can restore masticatory efficiency, improve aesthetics and speech, and help secure overall oral health. However, challenges remain in providing such treatments, including maintaining adequate plaque control, achieving adequate retention, and facilitating patient tolerance. The aim of this paper is to review the successful provision of RPDs. Removable partial dentures are a successful form of treatment for replacing missing teeth, and can be successfully provided with appropriate design and fabrication concepts in mind.

  6. ACTS – SUCCESS STORY

    Indian Academy of Sciences (India)

    In total, 103 experiments were conducted and the programme succeeded in the areas of Medicine; Education; Defence; Emergency Response; Maritime and Aeronautical Mobile Communications; and Science and Astronomy.

  7. Goodbye Career, Hello Success.

    Science.gov (United States)

    Komisar, Randy

    2000-01-01

    Success in today's economy means throwing out the old career rules. The "noncareer" career is driven by passion for the work and has the fluidity and flexibility needed in the contemporary workplace. (JOW)

  8. Human Resource Outsourcing Success

    Directory of Open Access Journals (Sweden)

    Hasliza Abdul-Halim

    2014-07-01

    Full Text Available The existing literature on partnership seems to take the relationship between partnership quality and outsourcing success for granted. Therefore, this article aims at examining the role of service quality in strengthening the relationship between partnership quality and human resource (HR) outsourcing success. The samples were obtained from 96 manufacturing organizations in Penang, Malaysia. The results showed that partnership quality variables such as trust, business understanding, and communication have significant positive impact on HR outsourcing success, whereas in general, service quality was found to partially moderate these relationships. Therefore, comprehending the HR outsourcing relationship in the context of service quality may assist the organizations to accomplish HR outsourcing success by identifying areas of expected benefits and improvements.

  9. Fertility Clinic Success Rates

    Science.gov (United States)

    2013 Assisted Reproductive Technology Fertility Clinic Success Rates Report. Additional Information About ART in the United States. Fertility Clinic Tables: Introduction to Fertility Clinic Tables [PDF - ...

  10. Successful project management

    CERN Document Server

    Young, Trevor L

    2016-01-01

    Successful Project Management, 5th edition, is an essential guide for anyone who wants to improve the success rate of their projects. It will help managers to maintain a balance between the demands of the customer, the project, the team and the organization. Covering the more technical aspects of a project from start to completion, it contains practised and tested techniques covering project conception and start-up, how to manage stakeholders, effective risk management, and project planning, launch and execution. Also including a brand new glossary of key terms, it provides help with evaluating your project as well as practical checklists and templates to ensure success for any ambitious project manager. With over one million copies sold, the hugely popular Creating Success series covers a wide variety of topics, with the latest editions including new chapters such as Tough Conversations and Treating People Right. This indispensable business skills collection is suited to a variety of roles, from someone look...

  11. Definition of successful defibrillation

    NARCIS (Netherlands)

    Koster, Rudolph W.; Walker, Robert G.; van Alem, Anouk P.

    2006-01-01

    OBJECTIVES: The definition of defibrillation shock "success" endorsed by the International Liaison Committee on Resuscitation since the publication of Guidelines 2000 for Cardiopulmonary Resuscitation and Emergency Cardiac Care has been removal of ventricular fibrillation at 5 secs after shock

  12. Succession planning : phase II.

    Science.gov (United States)

    2008-06-01

    Succession planning is an organizational investment in the future. Institutional knowledge is a critical ingredient in the culture of an organization, and its intangible value becomes significant when an organization is faced with the need to pas...

  13. Research into Success

    Directory of Open Access Journals (Sweden)

    Bogomir Novak

    1997-12-01

    Full Text Available As competition is becoming ever more fierce, research into the prerequisites for success is gaining ground. By most people, success is perceived as an external phenomenon, but it is in fact the consequence of a person's readiness to perform in the world (of business). In the paper, Novak distinguishes between internal, external and group success. The essence of internal success, which is the condition for the other two types of success, is assuming responsibility for, and exercising self-control over, one's psychic phenomena. This in fact means that one needs to "reprogramme" the old patterns of behaviour and substitute them with new ones, which leads to personality changes based on the understanding and acceptance of the self and others as they are. In realizing personal abilities, motives and goals, mental guiding laws must also be taken into account. Nowadays, the overall success of an organization is an important indicator of the quality of group work. The working patterns of individuals comply with the patterns used by their colleagues. When we do something for ourselves, we do it for others. In certain organizations, through accepted ways of communication all people become successful, and nobody needs to be paid off. Employees wholly identify themselves with their organization, and vice versa. This three-part paradigm (I-Others-Community) is the basis for various models of practical training for success, which are often idealized, but are primarily aimed at abolishing passivity and flaws in the system and its wider environment.

  14. Surprisingly high specificity of the PPD skin test for M. tuberculosis infection from recent exposure in The Gambia.

    Science.gov (United States)

    Hill, Philip C; Brookes, Roger H; Fox, Annette; Jackson-Sillah, Dolly; Lugos, Moses D; Jeffries, David J; Donkor, Simon A; Adegbola, Richard A; McAdam, Keith P W J

    2006-12-20

    Options for intervention against Mycobacterium tuberculosis infection are limited by the diagnostic tools available. The Purified Protein Derivative (PPD) skin test is thought to be non-specific, especially in tropical settings. We compared the PPD skin test with an ELISPOT test in The Gambia. Household contacts over six months of age of sputum smear positive TB cases and community controls were recruited. They underwent a PPD skin test and an ELISPOT test for the T cell response to PPD and ESAT-6/CFP10 antigens. Responsiveness to M. tuberculosis exposure was analysed according to sleeping proximity to an index case using logistic regression. 615 household contacts and 105 community controls were recruited. All three tests increased significantly in positivity with increasing M. tuberculosis exposure, the PPD skin test most dramatically (OR 15.7; 95% CI 6.6-35.3). While the PPD skin test positivity continued to trend downwards in the community with increasing distance from a known case (61.9% to 14.3%), the PPD and ESAT-6/CFP-10 ELISPOT positivity did not. The PPD skin test was more in agreement with the ESAT-6/CFP-10 ELISPOT (75%, p = 0.01) than with the PPD ELISPOT (53%). With increasing exposure, the proportion of individuals who were PPD skin test positive increased and the proportion who were PPD skin test negative decreased. The PPD skin test has surprisingly high specificity for M. tuberculosis infection from recent exposure in The Gambia. In this setting, anti-tuberculous prophylaxis in PPD skin test positive individuals should be revisited.
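
    The odds ratio quoted above (15.7; 95% CI 6.6-35.3) is the kind of quantity that comes out of a two-by-two exposure-by-test-result analysis. A minimal sketch of an odds ratio with a Wald 95% confidence interval, using invented counts that are not the study's data:

        import math

        # Invented 2x2 table: rows = test result, columns = exposure status.
        a, b = 90, 30    # PPD skin test positive: exposed, unexposed
        c, d = 25, 130   # PPD skin test negative: exposed, unexposed

        odds_ratio = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

        print(f"odds ratio = {odds_ratio:.1f}, 95% CI ({lower:.1f}, {upper:.1f})")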

  15. Successful ageing for psychiatrists.

    Science.gov (United States)

    Peisah, Carmelle

    2016-04-01

    This paper aims to explore the concept and determinants of successful ageing as they apply to psychiatrists as a group, and as they can be applied specifically to individuals. Successful ageing is a heterogeneous, inclusive concept that is subjectively defined. No longer constrained by the notion of "super-ageing", successful ageing can still be achieved in the face of physical and/or mental illness. Accordingly, it remains within the reach of most of us. It can, and should be, person-specific and individually defined, specific to one's bio-psycho-social and occupational circumstances, and importantly, reserves. Successful professional ageing is predicated upon insight into signature strengths, with selection of realistic goal setting and substitution of new goals, given the dynamic nature of these constructs as we age. Other essential elements are generativity and self-care. Given that insight is key, taking a regular stock or inventory of our reserves across bio-psycho-social domains might be helpful. Importantly, for successful ageing, this needs to be suitably matched to the professional task and load. This lends itself to a renewable personal ageing plan, which should be systemically adopted with routine expectations of self-care and professional responsibility. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  16. The Impact of a Surprise Dividend Increase on a Stocks Performance : the Analysis of Companies Listed on the Warsaw Stock Exchange

    Directory of Open Access Journals (Sweden)

    Tomasz Słoński

    2012-01-01

    Full Text Available The reaction of marginal investors to the announcement of a surprise dividend increase has been measured. Although the field research is performed on companies listed on the Warsaw Stock Exchange, the paper has important theoretical implications. Valuation theory gives many clues for the interpretation of changes in dividends. At the start of the literature review, the assumption of the irrelevance of dividends (to investment decisions) is described. This assumption is the basis for up-to-date valuation procedures leading to fundamental and fair market valuation of equity (shares). The paper is designed to verify whether the market value of stock is immune to the surprise announcement of a dividend increase. This study of the effect of a surprise dividend increase gives the chance to partially isolate such an event from dividend changes based on long-term expectations. The result of the research explicitly shows that a surprise dividend increase is on average welcomed by investors (an average abnormal return of 2.24% with an associated p-value of 0.001). Abnormal returns are realized by investors when there is a surprise increase in a dividend payout. The subsample of relatively high increases in a dividend payout enables investors to gain a 3.2% return on average. The results show that valuation models should be revised to take into account a possible impact of dividend changes on investor behavior. (original abstract)
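
    The 2.24% figure above is an average abnormal return tested against zero. A minimal event-study sketch on simulated data follows; the market-model framing and all numbers are assumptions, not the authors' sample:

        import numpy as np
        from scipy import stats

        # Simulated abnormal returns around surprise dividend-increase
        # announcements; in a real event study these would be actual returns
        # minus returns predicted by a market model over an estimation window.
        rng = np.random.default_rng(1)
        abnormal_returns = rng.normal(loc=0.02, scale=0.05, size=60)

        aar = abnormal_returns.mean()                      # average abnormal return
        t_stat, p_value = stats.ttest_1samp(abnormal_returns, popmean=0.0)

        print(f"average abnormal return: {aar:.2%}")
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")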

  17. Bangladesh becomes "success story".

    Science.gov (United States)

    1999-01-01

    The State Minister for Health and Family of Bangladesh, Dr. Mohammed Amanullah, highlighted some of the successes being achieved by his country in lowering fertility and improving the lives of the people since the 1994 International Conference on Population and Development. Some of these successes include practical measures to eliminate violence against women; introduction of a quota for women in public sector employment; and launching of the Health and Population Sector Program to provide a one-stop, full range of essential reproductive health, family planning and child health services through an integrated delivery mechanism. Moreover, the Minister informed the Forum participants that their success is attributable to many factors which include support from the government, from non-governmental organizations, civil society, mass media, religious and other community leaders, intersectoral collaboration, microcredit and income-generation activities.

  18. A new in vivo model of pantothenate kinase-associated neurodegeneration reveals a surprising role for transcriptional regulation in pathogenesis.

    Directory of Open Access Journals (Sweden)

    Varun ePandey

    2013-09-01

    Full Text Available Pantothenate Kinase-Associated Neurodegeneration (PKAN) is a neurodegenerative disorder with a poorly understood molecular mechanism. It is caused by mutations in Pantothenate Kinase, the first enzyme in the Coenzyme A (CoA) biosynthetic pathway. Here, we developed a Drosophila model of PKAN (tim-fbl flies) that allows us to continuously monitor the modeled disease in the brain. In tim-fbl flies, downregulation of fumble, the Drosophila PanK homologue, in the cells containing a circadian clock results in characteristic features of PKAN such as developmental lethality, hypersensitivity to oxidative stress, and diminished life span. Despite quasi-normal circadian transcriptional rhythms, tim-fbl flies display brain-specific aberrant circadian locomotor rhythms, and a unique transcriptional signature. Comparison with expression data from flies exposed to paraquat demonstrates that, as previously suggested, pathways other than oxidative stress are affected by PANK downregulation. Surprisingly, we found a significant decrease in the expression of key components of the photoreceptor recycling pathways, which could lead to retinal degeneration, a hallmark of PKAN. Importantly, these defects are not accompanied by changes in structural components in eye genes, suggesting that changes in gene expression in the eye precede and may cause the retinal degeneration. Indeed tim-fbl flies have a diminished response to light transitions, and their altered day/night patterns of activity demonstrate defects in light perception. This suggests that retinal lesions are not solely due to oxidative stress and demonstrates a role for the transcriptional response to CoA deficiency underlying the defects observed in dPanK deficient flies. Moreover, in the present study we developed a new fly model that can be applied to other diseases and that allows the assessment of neurodegeneration in the brains of living flies.

  19. SUPERCOLLIDER: String test success

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    On 14 August at the Superconducting Supercollider (SSC) Laboratory in Ellis County, Texas, the Accelerator Systems String Test (ASST) successfully met its objective by operating a half-cell of five collider dipole magnets, one quadrupole magnet, and two spool pieces at the design current of 6500 amperes

  20. Mindfulness and Student Success

    Science.gov (United States)

    Leland, Matt

    2015-01-01

    Mindfulness has long been practiced in Eastern spiritual traditions for personal improvement, and educators and educational institutions have recently begun to explore its usefulness in schools. Mindfulness training can be valuable for helping students be more successful learners and more connected members of an educational community. To determine…

  1. International Student Success

    Science.gov (United States)

    Smith, Clayton

    2016-01-01

    This article, with a focus on North American postsecondary education, identifies international students as a strategic enrollment management institutional priority; presents themes in the international student retention, satisfaction, and success research literature; and describes related best practices. It also presents the findings from an…

  2. America's Success Syndrome

    Science.gov (United States)

    Duplisea, Eric A.

    1974-01-01

    America's earliest schools taught career awareness and job skills, but for 200 years it was a speciality curriculum--cultivating a classical heritage predominated. Recently the hard sell message is that schooling and credentialism ensure entry into the "successful life". Vocational educators must become leaders, explode this myth, and redefine…

  3. Focus on Success

    Science.gov (United States)

    Frey, Susan

    2011-01-01

    Successful middle schools do not happen by accident--they happen through leadership. Principals promote a shared vision that empowers school staffs to set high standards and continuously improve student achievement. And these middle grade educators also try to help their adolescent students see the connection between their work in school and their…

  4. Successful international negotiations

    International Nuclear Information System (INIS)

    Gerry, G.

    1997-01-01

    These remarks on successful international trade negotiations deal with the following topics: culture and differences in psychology; building friendly relationships and letting both sides appear to win; well written proposals; security of negotiating information; the complexity and length of nuclear negotiations

  5. Success in Entrepreneurship

    DEFF Research Database (Denmark)

    Iversen, Jens; Malchow-Møller, Nikolaj; Sørensen, Anders

    2016-01-01

    What makes a successful entrepreneur? Using Danish register data, we find strong support for the hypothesis that theoretical skills from schooling and practical skills acquired through wage-work are complementary inputs in the human capital earnings function of entrepreneurs. In fact, we find tha...

  6. Successfully Adapting to Change.

    Science.gov (United States)

    Baird, James R.

    1989-01-01

    Describes methods used to successfully adapt to reductions in budget allocations in the University of Utah's Instructional Media Services Department. Three main areas of concern are addressed: morale and staff development; adapting to change in the areas of funding, control, media priorities, and technology; and planning for the future. (LRW)

  7. Beyond Success and Failure

    NARCIS (Netherlands)

    Etalle, Sandro; Jaffar, Joxan; van Raamsdonk, Femke

    We study a new programming framework based on logic programming where success and failure are replaced by predicates for adequacy and inadequacy. Adequacy allows one to extract a result from a partial computation, and inadequacy allows one to flexibly constrain the search space. In this parameterized

  8. Predicting Commissary Store Success

    Science.gov (United States)

    2014-12-01

    stores or if it is possible to predict that success. Multiple studies of private commercial grocery consumer preferences, habits and demographics have...appropriate number of competitors due to the nature of international cultures and consumer preferences. 2. Missing Data Four of the remaining stores

  9. Characteristics of Successful Entrepreneurs.

    Science.gov (United States)

    McClelland, David C.

    1987-01-01

    Comparison of characteristics of 12 average and 12 superior small business people in three developing nations (India, Malawi, and Ecuador) found proactive qualities such as initiative and assertiveness, achievement orientation, and commitment to others characteristic of successful entrepreneurs. Other expected qualities (self-confidence,…

  10. Measuring strategic success.

    Science.gov (United States)

    Gish, Ryan

    2002-08-01

    Strategic triggers and metrics help healthcare providers achieve financial success. Metrics help assess progress toward long-term goals. Triggers signal market changes requiring a change in strategy. All metrics may not move in concert. Organizations need to identify indicators, monitor performance.

  11. Passport to Success

    Science.gov (United States)

    Tomlinson, Nigel

    2002-01-01

    Looks at the "Passport to Success" scheme introduced by the Sheffield Chamber of Commerce in a bid to address the employability skills problem among young people. States that the scheme was launched in September 2001 in partnership with a local comprehensive school with the intention of helping pupils make the transition from school into…

  12. Leading to Success

    Science.gov (United States)

    Koballa, Thomas R., Jr.; Bradbury, Leslie U.

    2009-01-01

    Teacher mentoring has its unique challenges that are often associated with the teachers' content specialties. For this reason, the involvement and support of school leaders is essential to teachers' mentoring success. Regardless of content specialty, all teachers face challenges that should be considered when organizing and implementing mentoring.…

  13. Successful introduction of innovations

    International Nuclear Information System (INIS)

    Schoots, K.; Jeeninga, H.

    2008-01-01

    The introduction of new technology is sometimes troubled by discontinuity in incentive schemes. By assessing in advance the necessary means and the realistic time span for the incentive scheme, and by maintaining this scheme until the technology is mature enough to enter the market, the success of innovation trajectories can be increased significantly.

  14. Successful introduction of innovation

    International Nuclear Information System (INIS)

    Schoots, K.; Jeeninga, H.

    2008-01-01

    The introduction of new technology sometimes proceeds sluggishly due to discontinuity in incentive schemes. Estimating in advance which means are required and what a realistic time span for the incentive scheme is, and continuing this scheme until the technology is marketable, can significantly increase the success of innovation trajectories.

  15. Successfully combating prejudice

    Indian Academy of Sciences (India)

    Lawrence

    Sir Jagdish Chandra Bose, and fascinated by his work that showed that plants were ... U.S., in 1972, I was invited to take up a faculty position at the newly established ... success because of their different social commitments. Today when I look ...

  16. Designing for success

    Energy Technology Data Exchange (ETDEWEB)

    Altounyan, P.; Hurt, K.; Bigby, D. [Rock Mechanics Technology, Stanhope Bretby (United Kingdom)

    1999-07-01

    Successful underground coal mining is dependent on a number of key factors, particularly geotechnical suitability. The impact of rock mechanics on underground mine design and mining methods is discussed in this article. Methods on minimising stress effects in room and pillar mining, and longwall mining are outlined. The use of computer numerical modelling in mine design is mentioned. 8 figs.

  17. A Metamodeling Approach for Reasoning about Requirements

    NARCIS (Netherlands)

    Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; Schieferdecker, I.; Hartman, A.

    In requirements engineering, there are several approaches for requirements modeling such as goal-oriented, aspect-driven, and system requirements modeling. In practice, companies often customize a given approach to their specific needs. Thus, we seek a solution that allows customization in a

  18. The surprisingly small but increasing role of international agricultural trade on the European Union’s dependence on mineral phosphorus fertiliser

    Science.gov (United States)

    Nesme, Thomas; Roques, Solène; Metson, Geneviève S.; Bennett, Elena M.

    2016-02-01

    Phosphorus (P) is subject to global management challenges due to its importance to both food security and water quality. The European Union (EU) has promoted policies to limit fertiliser over-application and protect water quality for more than 20 years, helping to reduce European P use. Over this time period, the EU has, however, become more reliant on imported agricultural products. These imported products require fertiliser to be used in distant countries to grow crops that will ultimately feed European people and livestock. As such, these imports represent a displacement of European P demand, possibly allowing Europe to decrease its apparent P footprint by moving P use to locations outside the EU. We investigated the effect of EU imports on the European P fertiliser footprint to better understand whether the EU’s decrease in fertiliser use over time resulted from P demand being ‘outsourced’ to other countries or whether it truly represented a decline in P demand. To do this, we quantified the ‘virtual P flow’ defined as the amount of mineral P fertiliser applied to agricultural soils in non-EU countries to support agricultural product imports to the EU. We found that the EU imported a virtual P flow of 0.55 Tg P/yr in 1995 that, surprisingly, decreased to 0.50 Tg P/yr in 2009. These results were contrary to our hypothesis that trade increases would be used to help the EU reduce its domestic P fertiliser use by outsourcing its P footprint abroad. Still, the contribution of virtual P flows to the total P footprint of the EU has increased by 40% from 1995 to 2009 due to a dramatic decrease in domestic P fertiliser use in Europe: in 1995, virtual P was equivalent to 32% of the P used as fertiliser domestically to support domestic consumption but jumped to 53% in 2009. Soybean and palm tree products from South America and South East Asia contributed most to the virtual P flow. These results demonstrate that, although policies in the EU have successfully
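
    Dividing each virtual P flow quoted above by its share of domestic fertiliser use back-calculates the implied domestic use: roughly 1.7 Tg P/yr in 1995 falling to roughly 0.9 Tg P/yr in 2009. A small arithmetic check (this back-calculation is an inference from the quoted figures, not a number reported in the abstract):

        # Back-calculate implied EU domestic mineral P fertiliser use from the
        # virtual-P figures quoted above. Illustrative inference only.
        virtual_p = {1995: 0.55, 2009: 0.50}           # Tg P/yr of "virtual" P imports
        share_of_domestic = {1995: 0.32, 2009: 0.53}   # virtual P as share of domestic use

        for year in (1995, 2009):
            domestic = virtual_p[year] / share_of_domestic[year]
            print(f"{year}: implied domestic P fertiliser use ~ {domestic:.2f} Tg P/yr")
        # -> ~1.72 Tg P/yr in 1995 and ~0.94 Tg P/yr in 2009, i.e. a steep decline
        #    in domestic use, which is why the relative weight of virtual P grew.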

  19. Styles of success

    DEFF Research Database (Denmark)

    Dahlgaard, Jens Jørn; Nørgaard, Anders; Jakobsen, Søren

    1997-01-01

    Corporate success stories tend to emphasize the "great men" theory of history. But now a European research project established the managerial attributes that can turn an ordinary leader into one ideal for the pursuit of business excellence. The emergence of five leadership styles as crucial drivers...... of business excellence points to a clear agenda for success. Setting clear strategic goals and the ability to take a long-term view of an organization's direction, combined with other leadership attributes such as creativity, teambuilding and learning, are principal keys to creating an excellent organization....... Leaders seeking to achive business excellence must view the high-level attainment of these sets of leadership competencies as their paramount objective. In striving for business excellence, European leaders may encounter resistance among their employees. Crucially, European employees place a markedly...

  20. Small(pox) success?

    Science.gov (United States)

    Birn, Anne-Emanuelle

    2011-02-01

    The 30th anniversary of the World Health Organization's (WHO) official certification of smallpox eradication was marked by a slew of events hailing the campaign's dramatic tale of technological and organizational triumph against an ancient scourge. Yet commemorations also serve as moments of critical reflection. This article questions the acclaim showered upon smallpox eradication as the single greatest public health success in history. It examines how and why smallpox eradication and WHO's concurrent social justice-oriented primary health care approach (following from the Declaration of Alma-Ata) became competing paradigms. It synthesizes critiques of eradication's shortcomings and debunks some of the myths surrounding the global eradication campaign as a public health priority and necessity, and as a Cold War victory of cooperation. The article concludes with thoughts on integrating technical and social-political aspects of health within the context of welfare states as the means to achieving widespread and enduring global public health success.

  1. Small(pox) success?

    Directory of Open Access Journals (Sweden)

    Anne-Emanuelle Birn

    Full Text Available The 30th anniversary of the World Health Organization's (WHO) official certification of smallpox eradication was marked by a slew of events hailing the campaign's dramatic tale of technological and organizational triumph against an ancient scourge. Yet commemorations also serve as moments of critical reflection. This article questions the acclaim showered upon smallpox eradication as the single greatest public health success in history. It examines how and why smallpox eradication and WHO's concurrent social justice-oriented primary health care approach (following from the Declaration of Alma-Ata) became competing paradigms. It synthesizes critiques of eradication's shortcomings and debunks some of the myths surrounding the global eradication campaign as a public health priority and necessity, and as a Cold War victory of cooperation. The article concludes with thoughts on integrating technical and social-political aspects of health within the context of welfare states as the means to achieving widespread and enduring global public health success.

  2. Profile of success

    DEFF Research Database (Denmark)

    Dahlgaard, Jens Jørn; Nørgaard, Anders; Jakobsen, Søren

    1998-01-01

    What management skills must Europe's business leaders improve to achieve business excellence? Which country's leaders are best placed for success? Does the next generation have what it takes to compete? In the second half of their study of the leadership styles that drive business excellence, Jens Dahlgaard, Anders Nørgaard and Søren Jakobsen describe an excellent leadership profile that provides the answers....

  3. Successful time management

    CERN Document Server

    Forsyth, Patrick

    2016-01-01

    Packed with tips and techniques, Successful Time Management serves as a guide to reviewing and assessing new work practices to improve time management. It includes great time-saving ideas, practical solutions, checklists, and advice on controlling paperwork, delegating and working with others, prioritizing to focus on key issues, and getting and staying organized. This new third edition contains new practical tips on using email in a time effective manner and dealing with other internet-based tools and apps to help productivity.

  4. Hypopituitarism and successful pregnancy

    OpenAIRE

    Du, Xue; Yuan, Qing; Yao, Yanni; Li, Zengyan; Zhang, Huiying

    2014-01-01

    Hypopituitarism is a disorder characterized by the deficiency of one or more of the hormones secreted by the pituitary gland. Hypopituitarism patients may present the symptoms of amenorrhea, poor pregnancy potential, infertility, and no production of milk after delivery. Successful pregnancy in hypopituitarism patient is rare because hypopituitarism is associated with an increased risk of pregnancy complications, such as abortion, anemia, pregnancy-induced hypertension, placental abruption, p...

  5. Semantic relation vs. surprise: the differential effects of related and unrelated co-verbal gestures on neural encoding and subsequent recognition.

    Science.gov (United States)

    Straube, Benjamin; Meyer, Lea; Green, Antonia; Kircher, Tilo

    2014-06-03

    Speech-associated gesturing leads to memory advantages for spoken sentences. However, unexpected or surprising events are also likely to be remembered. With this study we test the hypothesis that different neural mechanisms (semantic elaboration and surprise) lead to memory advantages for iconic and unrelated gestures. During fMRI-data acquisition participants were presented with video clips of an actor verbalising concrete sentences accompanied by iconic gestures (IG; e.g., circular gesture; sentence: "The man is sitting at the round table"), unrelated free gestures (FG; e.g., unrelated up down movements; same sentence) and no gestures (NG; same sentence). After scanning, recognition performance for the three conditions was tested. Videos were evaluated regarding semantic relation and surprise by a different group of participants. The semantic relationship between speech and gesture was rated higher for IG (IG>FG), whereas surprise was rated higher for FG (FG>IG). Activation of the hippocampus correlated with subsequent memory performance of both gesture conditions (IG+FG>NG). For the IG condition we found activation in the left temporal pole and middle cingulate cortex (MCC; IG>FG). In contrast, for the FG condition posterior thalamic structures (FG>IG) as well as anterior and posterior cingulate cortices were activated (FG>NG). Our behavioral and fMRI-data suggest different mechanisms for processing related and unrelated co-verbal gestures, both of them leading to enhanced memory performance. Whereas activation in MCC and left temporal pole for iconic co-verbal gestures may reflect semantic memory processes, memory enhancement for unrelated gestures relies on the surprise response, mediated by anterior/posterior cingulate cortex and thalamico-hippocampal structures. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Successful innovation by motivation

    Directory of Open Access Journals (Sweden)

    Petra Koudelková

    2015-10-01

    Full Text Available Innovation is one of the most important factors for business growth. Human capital plays a significant role in the successful process of innovation. This article deals with employee motivation in the innovation process and the main scientific aim of this study is to present results of research that was undertaken in the Czech Republic at the beginning of 2013. Questionnaires were used for the survey and statistical analyses such as Chi square test or Hierarchical cluster analysis were used for data processing. This study also provides a theoretical and practical overview of business innovation in the Czech Republic.

  7. LEIR commissioning successfully completed

    CERN Multimedia

    2006-01-01

    An important milestone has been passed in the preparation of the injector complex to supply ions to the LHC experiments. The LEIR lead-ion beam, seen on one of the control screens just before the PS injection region. The Low-Energy Ion Ring - LEIR for short - has passed its first tests with flying colours. On 12 May, the ring that will accumulate lead ions for the LHC was shut down after seven months of tests (see Bulletin 44/2005). 'The commissioning phase was a resounding success,' enthuses a satisfied Michel Chanel, head of the LEIR construction project. After several months of fine-tuning, the LEIR team has achieved its aim of producing the kind of beam required for first lead-ion collisions in the LHC in 2008. This involved creating bunches containing 230 million ions, in line with the specifications for those first beams. This success can be put down to the machine's outstanding design and components. 'It's a great achievement by all the teams involved in the machine's construction,' underlines Christian...

  8. LHC synchronization test successful

    CERN Multimedia

    The synchronization of the LHC's clockwise beam transfer system and the rest of CERN's accelerator chain was successfully achieved last weekend. Tests began on Friday 8 August when a single bunch of a few particles was taken down the transfer line from the SPS accelerator to the LHC. After a period of optimization, one bunch was kicked up from the transfer line into the LHC beam pipe and steered about 3 kilometres around the LHC itself on the first attempt. On Saturday, the test was repeated several times to optimize the transfer before the operations group handed the machine back for hardware commissioning to resume on Sunday. The anti-clockwise synchronization systems will be tested over the weekend of 22 August.Picture:http://lhc-injection-test.web.cern.ch/lhc-injection-test/

  9. Iridium: failures & successes

    Science.gov (United States)

    Christensen, Carissa Bryce; Beard, Suzette

    2001-03-01

    This paper will provide an overview of the Iridium business venture in terms of the challenges faced, the successes achieved, and the causes of the ultimate failure of the venture — bankruptcy and system de-orbit. The paper will address technical, business, and policy issues. The intent of the paper is to provide a balanced and accurate overview of the Iridium experience, to aid future decision-making by policy makers, the business community, and technical experts. Key topics will include the history of the program, the objectives and decision-making of Motorola, the market research and analysis conducted, partnering strategies and their impact, consumer equipment availability, and technical issues — target performance, performance achieved, technical accomplishments, and expected and unexpected technical challenges. The paper will use as sources trade media and business articles on the Iridium program, technical papers and conference presentations, Wall Street analyst's reports, and, where possible, interviews with participants and close observers.

  10. Decommissioning. Success with preparation

    International Nuclear Information System (INIS)

    Klasen, Joerg; Schulz, Rolf; Wilhelm, Oliver

    2017-01-01

    The decommissioning of a nuclear power plant poses a significant challenge for the operating company. The business model is turned upside down and a working culture developed for power operation has to be adapted while necessary know-how for the upcoming tasks has to be built up. The trauma for the employees induced by the final plant shut-down has to be considered and respected. The change of working culture in the enterprise has to be managed and the organization has to be prepared for the future. Here the methods of Change-Management offer a systematic and effective approach. Confidence in the employees' competencies is one of the key success factors for the change into the future.

  11. Success in geothermal development

    International Nuclear Information System (INIS)

    Stefansson, V.

    1992-01-01

    Success in geothermal development can be defined as the ability to produce geothermal energy at prices comparable to those of other energy sources. Drilling usually comprises the largest cost in geothermal development, and the results of drilling largely influence the final price of geothermal energy. For 20 geothermal fields with operating power plants, the ratio between installed capacity and the total number of wells in the field is 1.9 MWe/well. The drilling histories of 30 geothermal fields are analyzed by plotting the average cumulative well output as a function of the number of wells drilled in the field. The range of the average well output is 1-10 MWe/well, with a mean value of 4.2 MWe/well for the 30 geothermal fields studied. A learning curve is defined as the number of wells drilled in each field before the average output per well reaches a fairly constant value, which is characteristic for the geothermal reservoir. The range for this learning time is 4-36 wells and the average is 13 wells. In general, the average well output in a given field is fairly constant after some 10-20 wells have been drilled in the field. The asymptotic average well output is considered to be a reservoir parameter when it is normalized to the average drilling depth. On average, this reservoir parameter can be expressed as 3.3 MWe per drilled km for the 30 geothermal fields studied. The lifetime of the resource, or the depletion time of the geothermal reservoir, should also be considered as a parameter influencing the success of geothermal development. Stepwise development, where the reservoir response to utilization of the first step is used to determine the timing of the installation of the next step, is considered an appropriate method to minimize the risk of overinvestment in a geothermal field

  12. Colour for Behavioural Success

    Science.gov (United States)

    Reeves, Adam

    2018-01-01

    Colour information not only helps sustain the survival of animal species by guiding sexual selection and foraging behaviour but also is an important factor in the cultural and technological development of our own species. This is illustrated by examples from the visual arts and from state-of-the-art imaging technology, where the strategic use of colour has become a powerful tool for guiding the planning and execution of interventional procedures. The functional role of colour information in terms of its potential benefits to behavioural success across the species is addressed in the introduction here to clarify why colour perception may have evolved to generate behavioural success. It is argued that evolutionary and environmental pressures influence not only colour trait production in the different species but also their ability to process and exploit colour information for goal-specific purposes. We then leap straight to the human primate with insight from current research on the facilitating role of colour cues on performance training with precision technology for image-guided surgical planning and intervention. It is shown that local colour cues in two-dimensional images generated by a surgical fisheye camera help individuals become more precise rapidly across a limited number of trial sets in simulator training for specific manual gestures with a tool. This facilitating effect of a local colour cue on performance evolution in a video-controlled simulator (pick-and-place) task can be explained in terms of colour-based figure-ground segregation facilitating attention to local image parts when more than two layers of subjective surface depth are present, as in all natural and surgical images. PMID:29770183

  13. Would you be surprised if this patient died?: Preliminary exploration of first and second year residents' approach to care decisions in critically ill patients

    Directory of Open Access Journals (Sweden)

    Armstrong John D

    2003-01-01

    Full Text Available Abstract Background How physicians approach decision-making when caring for critically ill patients is poorly understood. This study aims to explore how residents think about prognosis and approach care decisions when caring for seriously ill, hospitalized patients. Methods Qualitative study where we conducted structured discussions with first and second year internal medicine residents (n = 8) caring for critically ill patients during Medical Intensive Care Unit Ethics and Discharge Planning Rounds. Residents were asked to respond to questions beginning with "Would you be surprised if this patient died?" Results An equal number of residents responded that they would (n = 4) or would not (n = 4) be surprised if their patient died. Reasons for being surprised included the rapid onset of an acute illness, reversible disease, improving clinical course and the patient's prior survival under similar circumstances. Residents reported no surprise with worsening clinical course. Based on the realization that their patient might die, residents cited potential changes in management that included clarifying treatment goals, improving communication with families, spending more time with patients and ordering fewer laboratory tests. Perceived or implied barriers to changes in management included limited time, competing clinical priorities, "not knowing" a patient, limited knowledge and experience, presence of diagnostic or prognostic uncertainty and unclear treatment goals. Conclusions These junior-level residents appear to rely on clinical course, among other factors, when assessing prognosis and the possibility for death in severely ill patients. Further investigation is needed to understand how these factors impact decision-making and whether perceived barriers to changes in patient management influence approaches to care.

  14. On predicting quantal cross sections by interpolation: Surprisal analysis of j_z CCS and statistical j_z results

    International Nuclear Information System (INIS)

    Goldflam, R.; Kouri, D.J.

    1976-01-01

    New methods for predicting the full matrix of integral cross sections are developed by combining the surprisal analysis of Bernstein and Levine with the j_z-conserving coupled states method (j_z CCS) of McGuire, Kouri, and Pack and with the statistical j_z approximation (Sj_z) of Kouri, Shimoni, and Heil. A variety of approaches is possible and only three are studied in the present work. These are (a) a surprisal fit of the j=0→j' column of the j_z CCS cross section matrix (thereby requiring only a solution of the λ=0 set of j_z CCS equations), (b) a surprisal fit of the λ̄=0 Sj_z cross section matrix (again requiring solution of the λ=0 set of j_z CCS equations only), and (c) a surprisal fit of a λ̄≠0 Sj_z submatrix (involving input cross sections for j, j' ≥ λ̄ transitions only). The last approach requires the solution of the λ=λ̄ set of j_z CCS equations only, which requires less computation effort than the effective potential method. We explore three different choices for the prior and two-parameter (i.e., linear) and three-parameter (i.e., parabolic) fits as applied to Ar–N₂ collisions. The results are in general very encouraging and for one choice of prior give results which are within 20% of the exact j_z CCS results.
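
    The surprisal-fit interpolation described above can be illustrated generically: compute surprisals I = -ln(sigma/sigma_prior) for the transitions one can afford to calculate, fit them with a two-parameter (linear) or three-parameter (parabolic) form, and invert the fit to predict the remaining cross sections. The Python sketch below uses invented cross sections and a crude flat prior purely for illustration; none of the numbers or the choice of fitting variable come from the paper.

        import numpy as np

        # Hypothetical j=0 -> j' integral cross sections (arbitrary units) and a flat prior.
        j_final = np.array([0, 2, 4, 6, 8])
        sigma = np.array([12.0, 7.5, 3.9, 1.6, 0.55])
        sigma_prior = np.full_like(sigma, sigma.sum() / sigma.size)  # crude statistical prior

        # Surprisal I(j') = -ln(sigma / sigma_prior), fitted linearly (2 parameters)
        # and parabolically (3 parameters) in j'.
        surprisal = -np.log(sigma / sigma_prior)
        linear_fit = np.polyfit(j_final, surprisal, 1)
        parabolic_fit = np.polyfit(j_final, surprisal, 2)

        # Predict a cross section that was not "computed", e.g. j' = 10, from each fit.
        for name, fit in (("linear", linear_fit), ("parabolic", parabolic_fit)):
            predicted_sigma = sigma_prior[0] * np.exp(-np.polyval(fit, 10))
            print(name, round(float(predicted_sigma), 4))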

  15. ‘Surprise’: Outbreak of Campylobacter infection associated with chicken liver pâté at a surprise birthday party, Adelaide, Australia, 2012

    OpenAIRE

    Emma Denehy; Amy Parry; Emily Fearnley

    2012-01-01

    Objective: In July 2012, an outbreak of Campylobacter infection was investigated by the South Australian Communicable Disease Control Branch and Food Policy and Programs Branch. The initial notification identified illness at a surprise birthday party held at a restaurant on 14 July 2012. The objective of the investigation was to identify the potential source of infection and institute appropriate intervention strategies to prevent further illness. Methods: A guest list was obtained and a retro...

  16. Success and adaptation

    CERN Multimedia

    2013-01-01

    Yesterday morning, the last colliding proton beams of 2013 were extracted from the LHC, heralding the start of the machine’s first long shutdown (LS1) and crowning its first three glorious years of running. I hardly need to tell the CERN community what a fabulous performance all the people running the machine, the experiments, the computing and all supporting infrastructures put in. Those people are you, and you all know very well what a great job everyone did.   Nevertheless, I would like to express my thanks to all the people who made this first LHC run such a success. Re-measuring the whole Standard Model in such a short period, and then completing it with the discovery of what looks increasingly like the Higgs boson, is no mean feat. What I’d like to focus on today is another aspect of our field: its remarkable ability to adapt. When I started out in research, experiments involved a handful of people and lasted a few years at most. The timescale for the development of ...

  17. Assessing call centers’ success:

    Directory of Open Access Journals (Sweden)

    Hesham A. Baraka

    2013-07-01

    This paper introduces a model to evaluate the performance of call centers based on the DeLone and McLean Information Systems success model. A number of indicators are identified to track the call center's performance. Mapping of the proposed indicators to the six dimensions of the D&M model is presented. A Weighted Call Center Performance Index is proposed to assess call center performance; the index is used to analyze the effect of the identified indicators. A policy-weighted approach was used to assume the weights, with an analysis of different weights for each dimension. The analysis of the different weight cases gave priority to the user satisfaction and net benefits dimensions as the two outcomes from the system. For the input dimensions, higher priority was given to the system quality and the service quality dimensions. Call center decision makers can use the tool to tune the different weights in order to reach the objectives set by the organization. Multiple linear regression analysis was used to provide a linear formula for the user satisfaction dimension and the net benefits dimension, so that the values of these two dimensions can be forecast as a function of the other dimensions.
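
    As a rough illustration of the two computations mentioned in this abstract (a policy-weighted composite index over the D&M dimensions, and a least-squares forecast of an outcome dimension from the input dimensions), the NumPy sketch below uses invented dimension names, weights and monthly scores; none of the values are taken from the paper.

        import numpy as np

        # Six DeLone & McLean dimensions with illustrative policy weights (summing to 1).
        dimensions = ["system_quality", "information_quality", "service_quality",
                      "use", "user_satisfaction", "net_benefits"]
        weights = np.array([0.25, 0.10, 0.25, 0.05, 0.20, 0.15])

        # Hypothetical monthly scores (rows = months, columns = dimensions) on a 0-100 scale.
        scores = np.array([
            [72, 65, 70, 60, 68, 64],
            [75, 66, 74, 62, 71, 66],
            [78, 70, 77, 63, 75, 70],
            [80, 71, 79, 65, 78, 73],
            [77, 69, 76, 64, 74, 69],
            [82, 73, 81, 66, 80, 75],
        ])

        # Weighted Call Center Performance Index for each month.
        wccpi = scores @ weights
        print("index per month:", wccpi.round(1))

        # Least-squares fit: forecast user_satisfaction from the four input dimensions.
        X = np.column_stack([np.ones(len(scores)), scores[:, :4]])  # intercept + 4 predictors
        y = scores[:, 4]
        coefficients, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("regression coefficients:", coefficients.round(3))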

  18. Auditing Marketing Strategy Implementation Success

    OpenAIRE

    Herhausen, Dennis; Egger, Thomas; Oral, Cansu

    2014-01-01

    What makes a marketing strategy implementation successful and how can managers measure this success? To answer these questions, we developed a two-step audit approach. First, managers should measure the implementation success regarding effectiveness, efficiency, performance outcomes, and strategic embeddedness. Second, they should explore the reasons that have led to success or failure by regarding managerial, leadership, and environmental traps. Doing so will also provide corrective action p...

  19. Analysis of successive data sets

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Breeuwer, Marcel; Haselhoff, Eltjo Hans

    2008-01-01

    The invention relates to the analysis of successive data sets. A local intensity variation is formed from such successive data sets, that is, from data values in successive data sets at corresponding positions in each of the data sets. A region of interest is localized in the individual data sets on

  20. Analysis of successive data sets

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Breeuwer, Marcel; Haselhoff, Eltjo Hans

    2002-01-01

    The invention relates to the analysis of successive data sets. A local intensity variation is formed from such successive data sets, that is, from data values in successive data sets at corresponding positions in each of the data sets. A region of interest is localized in the individual data sets on

  1. Efficient reconfigurable hardware architecture for accurately computing success probability and data complexity of linear attacks

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Kavun, Elif Bilge; Tischhauser, Elmar

    2012-01-01

    An accurate estimation of the success probability and data complexity of linear cryptanalysis is a fundamental question in symmetric cryptography. In this paper, we propose an efficient reconfigurable hardware architecture to compute the success probability and data complexity of Matsui's Algorithm...... block lengths ensures that any empirical observations are not due to differences in statistical behavior for artificially small block lengths. Rather surprisingly, we observed in previous experiments a significant deviation between the theory and practice for Matsui's Algorithm 2 for larger block sizes...
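
    For orientation, the quantities in question are often estimated in software with the normal-approximation formula attributed to Selçuk for Matsui's Algorithm 2; the paper's point is precisely that such estimates can deviate from practice for larger block sizes, so the sketch below should be read as the theoretical baseline only, with illustrative parameters that are not taken from the paper.

        from math import sqrt
        from statistics import NormalDist

        phi = NormalDist().cdf
        phi_inv = NormalDist().inv_cdf

        def success_probability(n_texts, bias, advantage_bits):
            """Normal-approximation estimate of the success probability of Matsui's Algorithm 2."""
            return phi(2 * sqrt(n_texts) * abs(bias) - phi_inv(1 - 2 ** (-advantage_bits - 1)))

        def data_complexity(target_ps, bias, advantage_bits):
            """Known plaintexts needed to reach a target success probability, under the same model."""
            return ((phi_inv(target_ps) + phi_inv(1 - 2 ** (-advantage_bits - 1))) / (2 * abs(bias))) ** 2

        # Illustrative parameters: bias 2^-10 and an 8-bit advantage.
        eps = 2 ** -10
        print(success_probability(n_texts=2 ** 22, bias=eps, advantage_bits=8))
        print(data_complexity(target_ps=0.95, bias=eps, advantage_bits=8))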

  2. Presentation of an umbilical cord cyst with a surprising jet: a case report of a patent urachus [v1; ref status: indexed, http://f1000r.es/xx

    Directory of Open Access Journals (Sweden)

    John Svigos

    2013-02-01

    Full Text Available We report a baby with an unusual true umbilical cord cyst detected at 12 weeks gestation which, as the pregnancy progressed, became increasingly difficult to distinguish from a pseudocyst of the umbilical cord. Concern about the possibility of cord compression/cord accident led to an elective caesarean section being performed at 35+ weeks' gestation, with delivery of a healthy female infant weighing 2170 g. At birth the cyst ruptured and the resultant thickened elongated cord was clamped accordingly. After the cord clamp fell off at 5 days post delivery, an elongated umbilical stump was left behind from which a stream of urine surprisingly jetted out from the umbilicus each time the baby cried. A patent urachus was confirmed on ultrasound and the umbilical jet of urine resolved at 4 weeks post delivery after treatment of an Escherichia coli urinary tract infection. At 11 weeks post delivery a laparoscopic excision of the urachus was successfully performed. The baby, now 18 months of age, continues to thrive without incident.

  3. Building Successful Information Systems – a Key for Successful Organization

    Directory of Open Access Journals (Sweden)

    Doina ROSCA

    2010-12-01

    Full Text Available Building Successful Information Systems – a Key for Successful Organization. Abstract: An Information System (IS) can have a major impact on corporate strategy and organizational success. The involvement of managers and decision makers in all aspects of information systems is a major factor for organizational success, including higher profits and lower costs. Some of the benefits business organizations seek to achieve through information systems include: better safety, competitive advantage, fewer errors, greater accuracy, higher quality products, improved communications, increased efficiency and productivity, more efficient administration, superior financial and managerial decision making.

  4. Post-phacoemulsification refractive surprise in a posterior amorphous corneal dystrophy patient

    Directory of Open Access Journals (Sweden)

    Giuliano de Oliveira Freitas

    2010-02-01

    Full Text Available One case of post-phacoemulsification refractive surprise in a posterior amorphous corneal dystrophy patient is reported herein. Its likely causative factor, as well as our approach once it was recognized, are discussed in this report.

  5. Tracing the origins of success: implications for successful aging.

    Science.gov (United States)

    Peterson, Nora M; Martin, Peter

    2015-02-01

    This paper addresses the debate about the use of the term "successful aging" from a humanistic, rather than behavioral, perspective. It attempts to uncover what success, a term frequently associated with aging, is: how can it be defined and when did it first come into use? In this paper, we draw from a number of humanistic perspectives, including the historical and linguistic, in order to explore the evolution of the term "success." We believe that words and concepts have deep implications for how concepts (such as aging) are culturally and historically perceived. We take a comparative approach, turning to the etymological roots of this term in British, French, and German literature. According to the earliest entries of the term in the Oxford English Dictionary, events can have good or bad success. Another definition marks success as outcome oriented. Often used in the context of war, religion, and medicine, the neutral, but often negative, use of "success" in literature of the Renaissance demonstrates the tensions that surround the word, and suggests that success is something to be approached carefully. Ignoring the ambiguous origins of success erases the fact that aging in earlier centuries echoes much of the same ambivalence with which many people discuss it today. Attending to the origins of success can help gerontologists understand the humanistic tradition behind their inquiry into what successful aging means today. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Succession Planning for Library Instruction

    Science.gov (United States)

    Sobel, Karen; Drewry, Josiah

    2015-01-01

    Detailed succession planning helps libraries pass information from one employee to the next. This is crucial in preparing for hiring, turnover, retirements, training of graduate teaching assistants in academic libraries, and other common situations. The authors of this article discuss succession planning for instruction programs in academic…

  7. Success and Women's Career Adjustment.

    Science.gov (United States)

    Russell, Joyce E. A.; Burgess, Jennifer R. D.

    1998-01-01

    Women still face barriers to career success and satisfaction: stereotypes, assumptions, organizational culture, human resource practices, and lack of opportunities. Despite individual and organizational strategies, many women leave to become entrepreneurs. There is a need to investigate how women define career success. (SK)

  8. Does Happiness Promote Career Success?

    Science.gov (United States)

    Boehm, Julia K.; Lyubomirsky, Sonja

    2008-01-01

    Past research has demonstrated a relationship between happiness and workplace success. For example, compared with their less happy peers, happy people earn more money, display superior performance, and perform more helpful acts. Researchers have often assumed that an employee is happy and satisfied because he or she is successful. In this article,…

  9. Succession management: a human resource planning effort toward a successful corporation

    Directory of Open Access Journals (Sweden)

    Rini Kuswati

    2010-06-01

    create a more flexible and dynamic approach for preparing future executives and have the necessary leadership ready to meet the business challenges of the remainder of the decade and beyond. Succession management allows the corporate leadership to instill a more dynamic process that is easier to integrate with the firm's strategic initiatives. It better aligns organizational thinking with the external environment, where the discontinuities make it possible to anticipate the full spectrum of change that a corporation will confront. It is a leadership and succession philosophy that focuses on developing the creativity and flexibility that allow for a more rapid response to change. Succession management is thus one way to become a successful corporation.

  10. Centrifugal force: a few surprises

    International Nuclear Information System (INIS)

    Abramowicz, M.A.; Max-Planck-Institut fuer Physik und Astrophysik, Garching

    1990-01-01

    The need for a rather fundamental revision in understanding of the nature of the centrifugal force is discussed. It is shown that in general relativity (and contrary to the situation in Newtonian theory) rotation of a reference frame is a necessary but not sufficient condition for the centrifugal force to appear. A sufficient condition for its appearance, in the instantaneously corotating reference frame of a particle, is that the particle motion in space (observed in the global rest frame) differs from a photon trajectory. The direction of the force is the same as that of the gradient of the effective potential for photon motion. In some cases, the centrifugal force will attract towards the axis of rotation. (author)

  11. Cartel surprised by quota reduction

    International Nuclear Information System (INIS)

    Debnar, M.

    2003-01-01

    The development of world petroleum prices in April - September 2003 is analysed. OPEC decided to cut oil production by 900 thousand barrels per day, to 24.5 million barrels per day, from 1 November 2003. Although the Russian Federation reports higher oil production every month, it does not represent a danger for OPEC for the present; the problem lies in getting the oil to importers. The situation will change dramatically from 2008. The Russian Federation in any case plans to build a gigantic pipeline to the port of Murmansk on the Barents Sea, thanks to which Russian oil exports to the USA should expand sharply. This was reported by the Russian Minister of Economy, German Gref, who noted that construction of the pipeline is a priority for the Russian Federation, which wants to diversify its oil exports away from traditional Europe. The project for a pipeline from Western Siberia to the deep-water Murmansk terminal was published by five oil companies led by Lukoil and Yukos at the end of last year. The pipeline should cost 4 billion USD and transport up to 2.4 million barrels of oil per day. In the meantime, Caspian oil should start to flow to Europe

  12. Conversation Simulation and Sensible Surprises

    Science.gov (United States)

    Hutchens, Jason L.

    I have entered the Loebner Prize five times, winning the "most humanlike program" category in 1996 with a surly ELIZA-clone named HeX, but failed to repeat the performance in subsequent years with more sophisticated techniques. Whether this is indicative of an unanticipated improvement in "conversation simulation" technology, or whether it highlights the strengths of ELIZA-style trickery, is left as an exercise for the reader. In 2000, I was invited to assume the role of Chief Scientist at Artificial Intelligence Ltd. (Ai) on a project inspired by the advice given by Alan Turing in the final section of his classic paper - our quest was to build a "child machine" that could learn and use language from scratch. In this chapter, I will discuss both of these experiences, presenting my thoughts regarding the Chinese Room argument and Artificial Intelligence (AI) in between.

  13. Some new surprises in chaos.

    Science.gov (United States)

    Bunimovich, Leonid A; Vela-Arevalo, Luz V

    2015-09-01

    "Chaos is found in greatest abundance wherever order is being sought.It always defeats order, because it is better organized"Terry PratchettA brief review is presented of some recent findings in the theory of chaotic dynamics. We also prove a statement that could be naturally considered as a dual one to the Poincaré theorem on recurrences. Numerical results demonstrate that some parts of the phase space of chaotic systems are more likely to be visited earlier than other parts. A new class of chaotic focusing billiards is discussed that clearly violates the main condition considered to be necessary for chaos in focusing billiards.

  14. Experimental results surprise quantum theory

    International Nuclear Information System (INIS)

    White, C.

    1986-01-01

    Interest in results from Darmstadt that positron-electron pairs are created in nuclei with high atomic numbers (in the Z range from 180-188) lies in the occurrence of a quantized positron kinetic energy peak at 300. The results lend substance to the contention of Erich Bagge that the traditionally accepted symmetries in positron-electron emission do not exist and, therefore, there is no need to posit the existence of the neutrino. The search is on for the decay of a previously unknown boson to account for the findings, which also points to the need for a major revision in quantum theory. 1 figure

  15. Surprises and omissions in toxicology

    Czech Academy of Sciences Publication Activity Database

    Rašková, H.; Zídek, Zdeněk

    2004-01-01

    Roč. 12, - (2004), S94-S96 ISSN 1210-7778. [Interdisciplinary Czech-Slovak Toxicological Conference /8./. Praha, 03.09.2004-05.09.2004] Institutional research plan: CEZ:AV0Z5008914 Keywords: bacterial toxins Subject RIV: FR - Pharmacology; Medicinal Chemistry

  16. Succession Planning in Australian Farming

    Directory of Open Access Journals (Sweden)

    John Hicks

    2012-11-01

    Full Text Available The theme of this paper is that succession planning in Australian farming is under-developed. It may be linked to economic and social change which suggests that farmers need to adapt to generational change but this is being resisted or ignored. The implications of this are the slow decline of family farming, a poor transfer of skills and knowledge to subsequent generations of farmers in some parts of the agricultural sector and the potential for an extension of the financial services industry to develop a more effective raft of succession planning measures to mitigate the effects of a traditional approach to succession in agriculture.

  17. A review of the Nearctic genus Prostoia (Ricker) (Plecoptera, Nemouridae), with the description of a new species and a surprising range extension for P. hallasi Kondratieff & Kirchner.

    Science.gov (United States)

    Grubbs, Scott A; Baumann, Richard W; DeWalt, R Edward; Tweddale, Tari

    2014-01-01

    The Nearctic genus Prostoia (Plecoptera: Nemouridae) is reviewed. Prostoia ozarkensis sp. n. is described from the male and female adult stages mainly from the Interior Highland region encompassing portions of Arkansas, Missouri, and Oklahoma. Prostoia ozarkensis sp. n. appears most closely related to two species, one distributed broadly across the western Nearctic region, P. besametsa (Ricker), and one found widely throughout the central and eastern Nearctic regions, P. completa (Walker). A surprising range extension is noted for P. hallasi Kondratieff & Kirchner, a species once known only from the Great Dismal Swamp, from small upland streams in southern Illinois. Additional new state records are documented for P. besametsa, P. completa, P. hallasi and P. similis (Hagen). Taxonomic keys to Prostoia males and females are provided, and scanning electron micrographs of adult genitalia of all species are given.

  18. Crystal structure of di-μ-chlorido-bis[dichloridobis(methanol-κO)iridium(III)] dihydrate: a surprisingly simple chloridoiridium(III) dinuclear complex with methanol ligands

    Directory of Open Access Journals (Sweden)

    Joseph S. Merola

    2015-05-01

    Full Text Available The reaction of IrCl3·xH2O in methanol led to the formation of small amounts of the title compound, [Ir2Cl6(CH3OH)4]·2H2O, which consists of two IrCl4O2 octahedra sharing an edge via chloride bridges. The molecule lies across an inversion center. Each octahedron can be envisioned as being comprised of four chloride ligands in the equatorial plane with methanol ligands in the axial positions. A lattice water molecule is strongly hydrogen-bonded to the coordinating methanol ligands, and weak interactions with the coordinating chloride ligands lead to the formation of a three-dimensional network. This is a surprising structure given that, while many reactions of iridium chloride hydrate are carried out in alcoholic solvents, especially methanol and ethanol, this is the first structure of a chloridoiridium compound with only methanol ligands.

  19. Organizational Climate for Successful Aging

    Science.gov (United States)

    Zacher, Hannes; Yang, Jie

    2016-01-01

    Research on successful aging at work has neglected contextual resources such as organizational climate, which refers to employees’ shared perceptions of their work environment. We introduce the construct of organizational climate for successful aging (OCSA) and examine it as a buffer of the negative relationship between employee age and focus on opportunities (i.e., beliefs about future goals and possibilities at work). Moreover, we expected that focus on opportunities, in turn, positively predicts job satisfaction, organizational commitment, and motivation to continue working after official retirement age. Data came from 649 employees working in 120 companies (M age = 44 years, SD = 13). We controlled for organizational tenure, psychological climate for successful aging (i.e., individuals’ perceptions), and psychological and organizational age discrimination climate. Results of multilevel analyses supported our hypotheses. Overall, our findings suggest that OCSA is an important contextual resource for successful aging at work. PMID:27458405

  20. Building a Successful Technology Cluster

    Science.gov (United States)

    Silicon Valley is the iconic cluster—a dense regional network of companies, universities, research institutions, and other stakeholders involved in a single industry. Many regions have sought to replicate the success of Silicon Valley, which has produced technological innov...

  1. Organizational Climate for Successful Aging.

    Science.gov (United States)

    Zacher, Hannes; Yang, Jie

    2016-01-01

    Research on successful aging at work has neglected contextual resources such as organizational climate, which refers to employees' shared perceptions of their work environment. We introduce the construct of organizational climate for successful aging (OCSA) and examine it as a buffer of the negative relationship between employee age and focus on opportunities (i.e., beliefs about future goals and possibilities at work). Moreover, we expected that focus on opportunities, in turn, positively predicts job satisfaction, organizational commitment, and motivation to continue working after official retirement age. Data came from 649 employees working in 120 companies (M age = 44 years, SD = 13). We controlled for organizational tenure, psychological climate for successful aging (i.e., individuals' perceptions), and psychological and organizational age discrimination climate. Results of multilevel analyses supported our hypotheses. Overall, our findings suggest that OCSA is an important contextual resource for successful aging at work.

  2. Success Teaching Spelling with Music.

    Science.gov (United States)

    Martin, Mariellen

    1983-01-01

    A spelling approach which incorporates music on a cassette with spelling, pronunciation, and definition of specific words was successful in improving junior high learning disabled students' spelling performance, self-esteem, and sequential memories. (CL)

  3. Success Neurosis in College Seniors

    Science.gov (United States)

    Roulet, Norman L.

    1976-01-01

    The incidence and causes of success neurosis in undergraduate students are examined and the suggestion is made that while therapy of the manifest problem is often relatively easy, the longer-term fate is still problematic. (MB)

  4. Human capital and career success

    DEFF Research Database (Denmark)

    Frederiksen, Anders; Kato, Takao

    capital formally through schooling for career success, as well as the gender gap in career success rates. Second, broadening the scope of human capital by experiencing various occupations (becoming a generalist) is found to be advantageous for career success. Third, initial human capital earned through......Denmark’s registry data provide accurate and complete career history data along with detailed personal characteristics (e.g., education, gender, work experience, tenure and others) for the population of Danish workers longitudinally. By using such data from 1992 to 2002, we provide rigorous...... formal schooling and subsequent human capital obtained informally on the job are found to be complements in the production of career success. Fourth, though there is a large body of the literature on the relationship between firm-specific human capital and wages, the relative value of firm-specific human...

  5. Enterprise architecture for business success

    CERN Document Server

    Wijegunaratne, Inji; Evans-Greenwood, Peter

    2014-01-01

    Enterprise Architecture (EA) has evolved to become a prominent presence in today's information systems and technology landscape. The EA discipline is rich in frameworks, methodologies, and the like. However, the question of 'value' for business professionals remains largely unanswered - that is, how best can Enterprise Architecture and Enterprise Architects deliver value to the enterprise? Enterprise Architecture for Business Success answers this question. Enterprise Architecture for Business Success is primarily intended for IT professionals working in the area of Enterprise Architectu

  6. Tacit acceptance of the succession

    Directory of Open Access Journals (Sweden)

    Ioana NICOLAE

    2012-01-01

    Full Text Available This paper examines some essential and contradictory aspects of the issue of tacit acceptance of succession, in terms of the distinction between documents that have the value of tacit acceptance of succession and other acts that would not justify such a solution. The documents expressly indicated by the legislator as having tacit acceptance value, as well as those which do not have such value, are presented, and their most important legal effects are examined and discussed.

  7. The characteristics of successful entrepreneurs

    OpenAIRE

    Pokrajčić Dragana M.

    2004-01-01

    This paper examines the economic, psychological and social-behavioral theories of the entrepreneur in order to determine the characteristics of a successful entrepreneur. The major contribution of economic theories of the entrepreneur is better understanding of the entrepreneur and his/her role in economic development. The psychological characteristic theory of entrepreneur argues that successful entrepreneurs possess certain personality traits that mark them out as special, and tries to dete...

  8. Fast Success and Slow Failure

    DEFF Research Database (Denmark)

    Mors, Louise; Waguespack, David

    Full Abstract: Do the benefits of cross boundary collaborations outweigh the costs? We seek to answer this question by examining 5079 collaborations in the Internet Engineering Task Force (IETF). Our findings suggest that crossing formal boundaries is positively related to success and efficiency ...... of success, suggesting that firms are better off investing in nondiverse projects. This finding has important implications for how we think about the benefits of seeking novelty....

  9. Biosphere reserves: Attributes for success.

    Science.gov (United States)

    Van Cuong, Chu; Dart, Peter; Hockings, Marc

    2017-03-01

    Biosphere reserves established under the UNESCO Man and the Biosphere Program aim to harmonise biodiversity conservation and sustainable development. Concerns over the extent to which the reserve network was living up to this ideal led to the development of a new strategy in 1995 (the Seville Strategy) to enhance the operation of the network of reserves. An evaluation of effectiveness of management of the biosphere reserve network was called for as part of this strategy. Expert opinion was assembled through a Delphi Process to identify successful and less successful reserves and investigate common factors influencing success or failure. Ninety biosphere reserves including sixty successful and thirty less successful reserves in 42 countries across all five Man and the Biosphere Program regions were identified. Most successful sites are the post-Seville generation while the majority of unsuccessful sites are pre-Seville that are managed as national parks and have not been amended to conform to the characteristics that are meant to define a biosphere reserve. Stakeholder participation and collaboration, governance, finance and resources, management, and awareness and communication are the most influential factors in the success or failure of the biosphere reserves. For success, the biosphere reserve concept needs to be clearly understood and applied through landscape zoning. Designated reserves then need a management system with inclusive good governance, strong participation and collaboration, adequate finance and human resource allocation and stable and responsible management and implementation. All rather obvious but it is difficult to achieve without commitment to the biosphere reserve concept by the governance authorities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. The characteristics of successful entrepreneurs

    Directory of Open Access Journals (Sweden)

    Pokrajčić Dragana M.

    2004-01-01

    Full Text Available This paper examines the economic, psychological and social-behavioral theories of the entrepreneur in order to determine the characteristics of a successful entrepreneur. The major contribution of economic theories of the entrepreneur is better understanding of the entrepreneur and his/her role in economic development. The psychological characteristic theory of entrepreneur argues that successful entrepreneurs possess certain personality traits that mark them out as special, and tries to determine and to evaluate these special traits. The social-behavioral theories stress the influence of experience, knowledge, social environment and ability to learn on the entrepreneur’s success as well as his/her personality traits. Neither of the examined theories of entrepreneur gives a satisfactory explanation of the entrepreneur’s success, but taken as a whole, they can explain key factors of entrepreneur’s success. The entrepreneur’s success comes about as a result of his/her personality traits, ability to learn from experience and ability to adjust to his/her environment.

  11. The International Successful School Principalship Project: Success Sustained?

    Science.gov (United States)

    Moos, Lejf; Johansson, Olof

    2009-01-01

    Purpose: The purpose of this paper is to synthesize the findings of the follow-up studies of successful school principals in six countries: Australia, Denmark, England, Norway, Sweden, and the USA. Design/methodology/approach: Data were categorized according to stakeholder expectations, the concept and practice of leadership, and the…

  12. Statistical factors affecting the success of nuclear operations

    International Nuclear Information System (INIS)

    Sunder, S.; Stephenson, J.R.; Hochman, D.

    1999-01-01

    In this article, the authors present a statistical analysis to determine the operational, financial, technical, and managerial factors that most significantly affect the success of nuclear operations. The study analyzes data for over 70 nuclear plants and 40 operating companies over a period of five years in order to draw conclusions that they hope will be of interest to utility companies and public utility commissions as they seek ways to improve rates of success in nuclear operations. Some of these conclusions will not be surprising--for example, that older plants have heavier maintenance requirements--but others are less intuitive. For instance, the observation that operators of fewer plants have lower costs suggests that any experience curve benefits associated with managing multiple nuclear facilities is overshadowed by the logistic problems of multiple facilities. After presenting a brief history of nuclear power in America, the authors outline the motivations of the study and the methodology of their analysis. They end the article with the results of the study and discuss some of the managerial implications of these findings

  13. Staggering successes amid controversy in California water management

    Science.gov (United States)

    Lund, J. R.

    2012-12-01

    Water in California has always been important and controversial, and it probably always will be. California has a large, growing economy and population in a semi-arid climate. But California's aridity, hydrologic variability, and water controversies have not precluded considerable economic successes. The successes of California's water system have stemmed from the decentralization of water management with historically punctuated periods of more centralized strategic decision-making. Decentralized management has allowed California's water users to efficiently explore incremental solutions to water problems, ranging from early local development of water systems (such as Hetch Hetchy, Owens Valley, and numerous local irrigation projects) to more contemporary efforts at water conservation, water markets, wastewater reuse, and conjunctive use of surface and groundwater. In the cacophony of local and stakeholder interests, strategic decisions have been more difficult, and consequently occur less frequently. California state water projects and Sacramento Valley flood control are examples where decades of effort, crises, floods and droughts were needed to mobilize local interests to agree to major strategic decisions. Currently, the state is faced with making strategic environmental and water management decisions regarding its deteriorating Sacramento-San Joaquin Delta. Not surprisingly, human uncertainties and physical and fiscal non-stationarities dominate this process.

  14. Success factors in technology development

    Science.gov (United States)

    Preston, John T.

    1995-01-01

    Universities in the U.S. have a significant impact on business through the transfer of technology. This paper describes the goals and philosophy of the Technology Licensing Office at the Massachusetts Institute of Technology. This paper also relates the critical factors for successful technology transfer, particularly relating to new business formation. These critical factors include the quality of the technology, the quality of the management, the quality of the investor, the passion for success, and the image of the company. Descriptions of three different levels of investment are also given and the most successful level of investment for starting a new company is reviewed. Licensing to large companies is also briefly reviewed, as this type of licensing requires some different strategies than that of licensing to start-up companies. High quality critical factors and intelligent investment create rewards for the parties and successful ventures.

  15. Information Systems Success: An empirical study on the appropriate success criteria and the real value of critical success factors

    OpenAIRE

    Skovly, Jørgen

    2013-01-01

    Success is a complex concept, that people have been trying to understand for some time. Extensive research has been conducted in order to improve our understanding, and thus increase our chances for achieving success. However, as projects still continue to fail, the real value of this research seems unclear. This thesis emphasizes the distinction between variables that may cause success (success factors), and variables that are part of success (success criteria). Success is not a 'black and w...

  16. The successful conclusion of the Deep Space 1 Mission: important results without a flashy title

    Science.gov (United States)

    Rayman, M. D.

    2002-01-01

    In September 2001, Deep Space 1 (DS1) completed a high-risk and flawless encounter with comet 19P/Borrelly. Its data provide a detailed view of this comet and offer surprising and exciting insights. With this successful conclusion of its extended mission, DS1 undertook a hyperextended mission. Following this period of extremely aggressive testing, with no further technology or science objectives, the mission was terminated on December 18, 2001, with the powering off of the spacecraft's transmitter, although the receiver was left on. By the end of its mission, DS1 had returned a wealth of important science data and engineering data for future missions.

  17. Organizational Scale and School Success.

    Science.gov (United States)

    Guthrie, James W.

    1979-01-01

    The relationship between the organizational scale of schooling (school and school district size) and school success is examined. The history of the movement toward larger school units, the evidence of the effects of that movement, and possible research strategies for further investigation of the issue are discussed. (JKS)

  18. of the Intestate Succession Act

    African Journals Online (AJOL)

    10332324

    many different shapes and sizes and it has stressed that one form of family cannot .... same-sex couples the same status and benefits that marriage afforded ..... that his or her share was inherited by the co-heirs, including the surviving spouse, and he ..... Jamneck J et al The Law of Succession in South Africa 2nd ed (Oxford.

  19. Successive Standardization of Rectangular Arrays

    Directory of Open Access Journals (Sweden)

    Richard A. Olshen

    2012-02-01

    Full Text Available In this note we illustrate and develop further, with mathematics and examples, the work on successive standardization (or normalization) that is studied earlier by the same authors in [1] and [2]. Thus, we deal with successive iterations applied to rectangular arrays of numbers, where to avoid technical difficulties an array has at least three rows and at least three columns. Without loss of generality, an iteration begins with operations on columns: first subtract the mean of each column; then divide by its standard deviation. The iteration continues with the same two operations done successively for rows. These four operations applied in sequence complete one iteration. One then iterates again, and again, and again, ... In [1] it was argued that if arrays are made up of real numbers, then the set for which convergence of these successive iterations fails has Lebesgue measure 0. The limiting array has row and column means 0, row and column standard deviations 1. A basic result on convergence given in [1] is true, though the argument in [1] is faulty. The result is stated in the form of a theorem here, and the argument for the theorem is correct. Moreover, many graphics given in [1] suggest that except for a set of entries of any array with Lebesgue measure 0, convergence is very rapid, eventually exponentially fast in the number of iterations. Because we learned this set of rules from Bradley Efron, we call it “Efron’s algorithm”. More importantly, the rapidity of convergence is illustrated by numerical examples.
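
    The iteration described above is straightforward to state in code. The following is a minimal NumPy sketch of the column-then-row standardization cycle on a random array; the array shape, the use of the population standard deviation, the tolerance and the iteration cap are arbitrary choices for illustration, not taken from the paper.

        import numpy as np

        def successive_standardization(x, max_iter=100, tol=1e-12):
            """Repeat column- then row-standardization until the array stops changing."""
            x = np.asarray(x, dtype=float).copy()
            for iteration in range(1, max_iter + 1):
                previous = x.copy()
                # Column step: subtract each column mean, then divide by each column standard deviation.
                x = (x - x.mean(axis=0)) / x.std(axis=0)
                # Row step: subtract each row mean, then divide by each row standard deviation.
                x = (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)
                if np.max(np.abs(x - previous)) < tol:
                    break
            return x, iteration

        rng = np.random.default_rng(0)
        a = rng.normal(size=(5, 4))  # at least three rows and three columns, as required above
        standardized, iterations = successive_standardization(a)
        print(iterations)
        print(standardized.mean(axis=0).round(12), standardized.std(axis=0).round(12))  # ~0 and ~1 per column
        print(standardized.mean(axis=1).round(12), standardized.std(axis=1).round(12))  # ~0 and ~1 per row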

  20. 2011 Critical Success Factors Report

    Science.gov (United States)

    North Carolina Community College System (NJ1), 2011

    2011-01-01

    The Critical Success Factors Report is the North Carolina Community College System's major accountability document. This annual performance report serves to inform colleges and the public on the performance of North Carolina's 58 community colleges. In 1993, the State Board of Community Colleges began monitoring performance data on specific…

  1. 2012 Critical Success Factors Report

    Science.gov (United States)

    North Carolina Community College System (NJ1), 2012

    2012-01-01

    The Critical Success Factors Report is the North Carolina Community College System's major accountability document. This annual performance report is based on data compiled from the previous year and serves to inform colleges and the public on the performance of North Carolina's 58 community colleges. In 1993, the State Board of Community Colleges…

  2. Parents, School Success, and Health

    Centers for Disease Control (CDC) Podcasts

    2009-08-03

    In this podcast, Dr. William Jeynes, CDC Parenting Speaker Series guest, discusses the importance of parental involvement in children's academic success and lifelong health.  Created: 8/3/2009 by National Center on Birth Defects and Developmental Disabilities (NCBDDD).   Date Released: 8/3/2009.

  3. Successful Components of Interdisciplinary Education.

    Science.gov (United States)

    Shepard, Katherine; And Others

    1985-01-01

    This article presents 10 ideas for developing successful interdisciplinary curricula as suggested in the allied health literature. Implementation of the ideas is illustrated by examples from a clinical geriatric course involving physical therapy, physician assistant, nurse practitioner, and medical students. (Author/CT)

  4. Relationships and project marketing success

    DEFF Research Database (Denmark)

    Skaates, Maria Anne; Tikkanen, Henrikki; Lindblom, Jarno

    2002-01-01

    Project operations are a dominating mode of international business. Managing relationships and networks is crucial to project marketing success both at the level of the individual project and at the level of multiple projects. This article first defines key characteristics of project business, id...

  5. ‘Surprise’: Outbreak of Campylobacter infection associated with chicken liver pâté at a surprise birthday party, Adelaide, Australia, 2012

    Science.gov (United States)

    Fearnley, Emily; Denehy, Emma

    2012-01-01

    Objective In July 2012, an outbreak of Campylobacter infection was investigated by the South Australian Communicable Disease Control Branch and Food Policy and Programs Branch. The initial notification identified illness at a surprise birthday party held at a restaurant on 14 July 2012. The objective of the investigation was to identify the potential source of infection and institute appropriate intervention strategies to prevent further illness. Methods A guest list was obtained and a retrospective cohort study undertaken. A combination of paper-based and telephone questionnaires was used to collect exposure and outcome information. An environmental investigation was conducted by Food Policy and Programs Branch at the implicated premises. Results All 57 guests completed the questionnaire (100% response rate), and 15 met the case definition. Analysis showed a significant association between illness and consumption of chicken liver pâté (relative risk: 16.7, 95% confidence interval: 2.4–118.6). No other food or beverage served at the party was associated with illness. Three guests submitted stool samples; all were positive for Campylobacter. The environmental investigation identified that the cooking process used in the preparation of chicken liver pâté may have been inconsistent, resulting in some portions not cooked adequately to inactivate potential Campylobacter contamination. Discussion Chicken liver products are a known source of Campylobacter infection; therefore, education of food handlers remains a high priority. To better identify outbreaks among the large number of Campylobacter notifications, routine typing of Campylobacter isolates is recommended. PMID:23908933
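
    As a worked illustration of the cohort arithmetic reported above, the sketch below shows one standard way to compute a relative risk and its large-sample 95% confidence interval from a 2x2 exposure table. The function name and the counts are hypothetical placeholders for illustration, not the outbreak data.

        import math

        def relative_risk(a, b, c, d):
            """a, b: cases / non-cases among the exposed; c, d: cases / non-cases among the unexposed."""
            rr = (a / (a + b)) / (c / (c + d))
            # Standard large-sample 95% confidence interval computed on the log scale.
            se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
            lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
            upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
            return rr, (lower, upper)

        # Hypothetical counts for illustration only (not the party data).
        print(relative_risk(a=14, b=16, c=1, d=26))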

  6. Duplicating Research Success at Xerox

    Science.gov (United States)

    Hays, Dan A.

    2003-03-01

    The genesis of Xerox is rooted in the invention of xerography by physicist Chester Carlson in 1938. The initial research by Carlson can be viewed as the first of four successful xerographic research eras that have contributed to the growth of Xerox. The second era began in 1944 when Carlson established a working relationship with Battelle Memorial Institute in Columbus, OH. Due to many research advances at Battelle, the Haloid Corporation in Rochester, NY acquired a license to the xerographic process in 1947. The name of the company was changed to Xerox Corporation in 1961 following the wide market acceptance of the legendary Xerox 914 copier. Rapid revenue growth of Xerox in the mid-'60s provided the foundation for a third successful research era in the '70s and '80s. A research center was established in Webster, NY for the purpose of improving the design of xerographic subsystems and materials. These research efforts contributed to the commercial success of the DocuTech family of digital production printers. The fourth successful research era was initiated in the '90s with the objective of identifying a high-speed color xerographic printing process. A number of research advances contributed to the design of a 100 page per minute printer recently introduced as the Xerox DocuColor iGen3 Digital Production Press. To illustrate the role of research in enabling these waves of successful xerographic products, the physics of photoreceptors, light exposure and development subsystems will be discussed. Since the annual worldwide revenue of the xerographic industry exceeds 100 billion dollars, the economic return on Carlson's initial research investment in the mid-'30s is astronomical. The future for xerography remains promising since the technology enables high-speed digital printing of high-quality color documents with variable information.

  7. Ending the CEO succession crisis.

    Science.gov (United States)

    Charan, Ram

    2005-02-01

    The CEO succession process is broken. Many companies have no meaningful succession plans, and few of the ones that do are happy with them. CEO tenure is shrinking; in fact, two out of five CEOs fail in their first 18 months. It isn't just that more CEOs are being replaced; it's that they're being replaced badly. The problems extend to every aspect of CEO succession: internal development programs, board supervision, and outside recruitment. While many organizations do a decent job of nurturing middle managers, few have set up the comprehensive programs needed to find the half-dozen true CEO candidates out of the thousands of leaders in their midst. Even more damaging is the failure of boards to devote enough attention to succession. Search committee members often have no experience hiring CEOs; lacking guidance, they supply either the narrowest or the most general of requirements and then fail to vet either the candidates or the recruiters. The result is that too often new CEOs are plucked from the well-worn Rolodexes of a remarkably small number of recruiters. These candidates may be strong in charisma but may lack critical skills or otherwise be a bad fit with the company. The resulting high turnover is particularly damaging, since outside CEOs often bring in their own teams, can cause the company to lose focus, and are especially costly to be rid of. Drawing on over 35 years of experience with CEO succession, the author explains how companies can create a deep pool of internal candidates, how boards can consistently align strategy and leadership development, and how directors can get their money's worth from recruiters. Choosing a CEO should be not one decision but an amalgam of thousands of decisions made by many people every day over years.

  8. Anatomy of a Rescue: What Makes Hostage Rescue Operations Successful?

    National Research Council Canada - National Science Library

    Perez, Carlos

    2004-01-01

    ...: surprise, intelligence, operator's skill, and deception. These principles are derived from planning models used in special operations, personal experience, and an analysis of six historical case studies...

  9. Drought, pollen and nectar availability, and pollination success.

    Science.gov (United States)

    Waser, Nickolas M; Price, Mary V

    2016-06-01

    Pollination success of animal-pollinated flowers depends on rate of pollinator visits and on pollen deposition per visit, both of which should vary with the pollen and nectar "neighborhoods" of a plant, i.e., with pollen and nectar availability in nearby plants. One determinant of these neighborhoods is per-flower production of pollen and nectar, which is likely to respond to environmental influences. In this study, we explored environmental effects on pollen and nectar production and on pollination success in order to follow up a surprising result from a previous study: flowers of Ipomopsis aggregata received less pollen in years of high visitation by their hummingbird pollinators. A new analysis of the earlier data indicated that high bird visitation corresponded to drought years. We hypothesized that drought might contribute to the enigmatic prior result if it decreases both nectar and pollen production: in dry years, low nectar availability could cause hummingbirds to visit flowers at a higher rate, and low pollen availability could cause them to deposit less pollen per visit. A greenhouse experiment demonstrated that drought does reduce both pollen and nectar production by I. aggregata flowers. This result was corroborated across 6 yr of variable precipitation and soil moisture in four unmanipulated field populations. In addition, experimental removal of pollen from flowers reduced the pollen received by nearby flowers. We conclude that there is much to learn about how abiotic and biotic environmental drivers jointly affect pollen and nectar production and availability, and how this contributes to pollen and nectar neighborhoods and thus influences pollination success.

  10. Key to Language Learning Success

    Directory of Open Access Journals (Sweden)

    Oktavian Mantiri

    2015-01-01

    Full Text Available This paper looks at the important elements of language learning and teaching, i.e. the role of teachers as well as the attitude and motivation of learners. Teachers undoubtedly play crucial roles in students' language learning outcomes, which can ignite or diminish students' motivation. Positive attitudes and motivation – instrumental or integrative and intrinsic or extrinsic – are key to successful learning. It is therefore paramount for language teachers as well as learners to understand these roles and nurture the best possible conditions in which language teaching and learning can thrive. This paper also suggests that both stakeholders should be open to a holistic approach to language learning and that other factors, such as the environment, could play an important part in language teaching and learning success.

  11. When is cartilage repair successful?

    International Nuclear Information System (INIS)

    Raudner, M.; Roehrich, S.; Zalaudek, M.; Trattnig, S.; Schreiner, M.M.

    2017-01-01

    Focal cartilage lesions are a cause of long-term disability and morbidity. After cartilage repair, it is crucial to evaluate long-term progression or failure in a reproducible, standardized manner. This article provides an overview of the different cartilage repair procedures and important characteristics to look for in cartilage repair imaging. Specifics and pitfalls are pointed out alongside general aspects. After successful cartilage repair, a complete, but not hypertrophic, filling of the defect is the primary criterion of treatment success. The repair tissue should also be completely integrated with the surrounding native cartilage. After some months, the transplant's signal should be isointense compared to native cartilage. Complications like osteophytes, subchondral defects, cysts, adhesions and chronic bone marrow edema or joint effusion are common and have to be observed via follow-up. Radiological evaluation and interpretation of postoperative changes should always take the repair method into account. (orig.) [de]

  12. Writing a successful business plan.

    Science.gov (United States)

    Haag, A B

    1997-01-01

    1. In creating and building a business, the entrepreneur assumes all the responsibilities for its development and management, as well as the risks and rewards. Many businesses do not survive because business owners fail to develop an effective plan. 2. The business plan focuses on major areas of concern and their contribution to the success of a new business. The finished product communicates the product/service to others and provides the basis for the financial proposal. 3. Planning helps identify customers, market area, pricing strategy, and competitive conditions. It aids in decision making and is an essential guide for operating a business successfully and measuring progress. 4. The business plan not only serves as a mechanism for obtaining any needed financial resources, but also indicates the future direction of the company.

  13. The sudden success of prose

    DEFF Research Database (Denmark)

    Mortensen, Lars Boje

    2017-01-01

    The article presents a new model for understanding the sudden success of prose in four literatures: Greek, Latin, French and Old Norse. Through comparison and quantitative observations, and by focusing on the success of prose rather than its invention, it is shown that in all four cases two...... reading aloud) has been underplayed in previous scholarship mostly focused on authorial choices and invention. For two of the literatures (Greek, French) the fast dynamics of the rise of prose has already been identified and discussed, but for the two others (Latin, Old Norse), the observation is new....... It is also suggested that the exactly contemporary rise of French and Old Norse prose (c. 1200-1230) most probably is connected. The four literatures are each shown in chronological charts so as to visualize the timeline and the relation between poetic and prosaic works. The article furthermore reflects...

  14. Communication: essential strategies for success.

    Science.gov (United States)

    O'Connor, Mary

    2013-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches that help nurse leaders advance organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author discusses strategies for communicating change processes, whether large or small. Intentional planning and development of a communication strategy alongside change initiatives, not as an afterthought, are essential.

  15. The California cogeneration success story

    International Nuclear Information System (INIS)

    Neiggemann, M.F.

    1992-01-01

    This chapter describes the involvement of Southern California Gas Company (SoCalGas) in the promotion and demonstration of the benefits of cogeneration in California. The topics covered in this chapter are market strategy, cogeneration program objectives, cogeneration program, incentive cofunding, special gas rate, special service priority, special gas pressure and main options, advertising, promotional brochures and handbooks, technical support, program accomplishments, cogeneration outlook, and reasons for success of the program.

  16. Successes against insects and parasites

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1967-10-15

    With more and more answers being found to intricate problems which have entailed years of research in many parts of the world, some successes can now be claimed in the fight to control insect threats to crops, animals and human beings. Nuclear techniques are playing an important part in world efforts, and recent reports show that they have been effective in pioneer work against crop pests as well as in finding an answer to some diseases caused by parasites

  17. Factors determining UK album success

    OpenAIRE

    Elliott, Caroline; Simmons, Robert

    2011-01-01

    This article uses a recently compiled dataset on the UK album sales to determine which factors contribute to best-selling album sales success. We control for factors including length of time since release, nationality of artist, artist type and album type, testing the increasing returns to information hypothesis. Information on general public online review scores for the albums in the dataset allows for a strong test of the accuracy of online reviews in predicting music sales, as online revie...

  18. [Organising a successful return home].

    Science.gov (United States)

    Mézière, Anthony

    Discharge from hospital is a major component of the quality and efficiency of the healthcare system. The failures of the return home of elderly people testify to the difficulties of applying guidelines in the area of hospital discharge. The action plan decided in the hospital for a successful return home can be jeopardised for personal, relational, functional and structural reasons originating from the different players involved in the hospital discharge. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  19. Another successful Doctoral Student Assembly

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    On Wednesday 2 April, CERN hosted its third Doctoral Student Assembly in the Council Chamber.   CERN PhD students show off their posters in CERN's Main Building. Speaking to a packed house, Director-General Rolf Heuer gave the assembly's opening speech and introduced the poster session that followed. Seventeen CERN PhD students presented posters on their work, and were greeted by their CERN and University supervisors. It was a very successful event!

  20. Success of opposition against Temelin

    International Nuclear Information System (INIS)

    Tollmann, A.

    1989-01-01

    A first success for the anti-nuclear movement emerges: the Czechoslovakian government renounces blocks 3 and 4, 1000 MW each, in Temelin. Although lack of money is admitted as a partial cause, the main cause is the broad opposition of the population, especially in Austria, says the author. Therefore the author appeals to the coworkers to double their efforts in the signature collection against Temelin. The slogan is: we shall make Temelin too. (qui)

  1. Fumigation success for California facility.

    Science.gov (United States)

    Hacker, Robert

    2010-02-01

    As Robert Hacker, at the time director of facilities management at the St John's Regional Medical Center in Oxnard, California, explains, the hospital, one of the area's largest, recently successfully utilised a new technology to eliminate mould, selecting a cost and time-saving fumigation process in place of the traditional "rip and tear" method. Although hospital managers knew the technology had been used extremely effectively in other US buildings, this was reportedly among the first ever healthcare applications.

  2. Success in Transdisciplinary Sustainability Research

    Directory of Open Access Journals (Sweden)

    Tobias Luthe

    2017-01-01

    Full Text Available The complexity of sustainable development and societal transitions requires both analytical understandings of how coupled human-environment systems function and transdisciplinary science-to-practice approaches. The academic discourse has advanced in developing a framework for defining success in transdisciplinary research (TDR). Further empirical evidence is needed to validate the proposed concepts with TDR case studies. This paper applies a widely used TDR framework to test and critically evaluate its design principles and criteria of success with five TDR case studies the author is intimately familiar with. Overall, the design principles of the framework are validated for the five cases. Additional design principles are derived from the case analysis and proposed to complement the applied framework: (1) A project origin from society as opposed to with and for society; (2) Quickly available initiation funding; (3) Flexibility in time, objectives and methods throughout the research process; (4) Acceptance of process vs. project results; (5) Inclusion of public science communication; and (6) A demand-driven transition to a prolonged or new project partnership. The complementing principles are proposed for integration in the applied framework and are subject to further empirical testing. The reflexive empirical approach I have taken in this paper offers a key step towards removing institutional barriers for successful TDR, demonstrating how conceptual frameworks can be applied.

  3. Succession planning for technical experts

    International Nuclear Information System (INIS)

    Kirk, Bernadette Lugue; Cain, Ronald A.; Dewji, Shaheen A.; Agreda, Carla L.

    2017-01-01

    This report describes a methodology for identifying, evaluating, and mitigating the loss of key technical skills at nuclear operations facilities. The methodology can be adapted for application within regulatory authorities and research and development organizations, and can be directly applied by international engagement partners of the Department of Energy’s National Nuclear Security Administration (NNSA). The resultant product will be of direct benefit to two types of NNSA missions: (1) domestic human capital development programs tasked to provide focused technical expertise to succeed an aging nuclear operations workforce, and (2) international safeguards programs charged with maintaining operational safeguards for developing/existing nuclear power program in nations where minimal available resources must be used effectively. This report considers succession planning and the critical skills necessary to meet an institution’s goals and mission. Closely tied to succession planning are knowledge management and mentorship. In considering succession planning, critical skill sets are identified and are greatly dependent on the subject matter expert in question. This report also provides examples of critical skills that are job specific.

  4. Succession planning for technical experts

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cain, Ronald A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dewji, Shaheen A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Agreda, Carla L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-07-01

    This report describes a methodology for identifying, evaluating, and mitigating the loss of key technical skills at nuclear operations facilities. The methodology can be adapted for application within regulatory authorities and research and development organizations, and can be directly applied by international engagement partners of the Department of Energy’s National Nuclear Security Administration (NNSA). The resultant product will be of direct benefit to two types of NNSA missions: (1) domestic human capital development programs tasked to provide focused technical expertise to succeed an aging nuclear operations workforce, and (2) international safeguards programs charged with maintaining operational safeguards for developing/existing nuclear power program in nations where minimal available resources must be used effectively. This report considers succession planning and the critical skills necessary to meet an institution’s goals and mission. Closely tied to succession planning are knowledge management and mentorship. In considering succession planning, critical skill sets are identified and are greatly dependent on the subject matter expert in question. This report also provides examples of critical skills that are job specific.

  5. ORCID @ CMU: Successes and Failures

    Directory of Open Access Journals (Sweden)

    Denise Troll Covey

    2016-02-01

    Full Text Available Setting and Objectives: Carnegie Mellon University (CMU) recently planned and implemented a project to help CMU researchers get an Open Researcher and Contributor Identifier (ORCID) and to enable administrators to integrate the ORCIDs into university systems. This article describes and assesses the planning, performance, and outcome of this initiative, branded ORCID @ CMU. Design and Methods: The article chronicles why and how ORCID was integrated at CMU, including the rationale for changes in strategic plans. It assesses researcher participation in the project using transaction log and content analyses, and the performance of the ORCID project team using recommendations in the Jisc ORCID project report, frankly reporting the team's successes and failures. The article concludes with lessons learned that should inform ORCID integration projects and expectations at other institutions. Results: The ORCID @ CMU web application was a great success. However, the project team did not allow enough time to prepare or devote enough attention to advocacy. The marketing message was not sufficiently persuasive and the marketing channels were not particularly effective. The overall participation rate in ORCID @ CMU was far below the target of 40%, though participation in many demographics exceeded the goal. Conclusions: Strategic planning does not guarantee success. Secure more than lip service from senior administrators. Recruit champions from across the institution. Develop a message that resonates with researchers. Allow sufficient time to prepare. Empower the project manager. Start with the low hanging fruit. Develop special outreach to doctoral students and postdocs.

  6. Successful Climate Science Communication Strategies

    Science.gov (United States)

    Sinclair, P.

    2016-12-01

    In the past decade, efforts to communicate the facts of global change have not successfully moved political leaders and the general public to action. In response, a number of collaborative efforts between scientists and professional communicators, writers, journalists, bloggers, filmmakers, artists and others have arisen seeking to bridge that gap. As a result, a new cadre of science-literate communicators and media-savvy scientists have made themselves visible across diverse mainstream, traditional, and social media outlets. Because of these collaborations, in recent years, misinformation and disinformation have been successfully met with accurate and credible rebuttals within a single news cycle. Examples of these efforts are the Dark Snow Project, a science/communication collaboration focusing initially on accelerated arctic melt and sea level rise, and the Climate Science Rapid Response Team, which matches professional journalists with appropriate science experts in order to respond within a single news cycle to misinformation or misunderstandings about climate science. The session will discuss successful examples and suggest creative approaches for the future.

  7. Surprisingly different star-spot distributions on the near equal-mass equal-rotation-rate stars in the M dwarf binary GJ 65 AB

    Science.gov (United States)

    Barnes, J. R.; Jeffers, S. V.; Haswell, C. A.; Jones, H. R. A.; Shulyak, D.; Pavlenko, Ya. V.; Jenkins, J. S.

    2017-10-01

    We aim to understand how stellar parameters such as mass and rotation impact the distribution of star-spots on the stellar surface. To this purpose, we have used Doppler imaging to reconstruct the surface brightness distributions of three fully convective M dwarfs with similar rotation rates. We secured high cadence spectral time series observations of the 5.5 au separation binary GJ 65, comprising GJ 65A (M5.5V, Prot = 0.24 d) and GJ 65B (M6V, Prot = 0.23 d). We also present new observations of GJ 791.2A (M4.5V, Prot = 0.31 d). Observations of each star were made on two nights with UVES, covering a wavelength range from 0.64 - 1.03μm. The time series spectra reveal multiple line distortions that we interpret as cool star-spots and which are persistent on both nights suggesting stability on the time-scale of 3 d. Spots are recovered with resolutions down to 8.3° at the equator. The global spot distributions for GJ 791.2A are similar to observations made a year earlier. Similar high latitude and circumpolar spot structure is seen on GJ 791.2A and GJ 65A. However, they are surprisingly absent on GJ 65B, which instead reveals more extensive, larger, spots concentrated at intermediate latitudes. All three stars show small amplitude latitude-dependent rotation that is consistent with solid body rotation. We compare our measurements of differential rotation with previous Doppler imaging studies and discuss the results in the wider context of other observational estimates and recent theoretical predictions.

  8. ‘Surprise’: Outbreak of Campylobacter infection associated with chicken liver pâté at a surprise birthday party, Adelaide, Australia, 2012

    Directory of Open Access Journals (Sweden)

    Emma Denehy

    2012-11-01

    Full Text Available Objective: In July 2012, an outbreak of Campylobacter infection was investigated by the South Australian Communicable Disease Control Branch and Food Policy and Programs Branch. The initial notification identified illness at a surprise birthday party held at a restaurant on 14 July 2012. The objective of the investigation was to identify the potential source of infection and institute appropriate intervention strategies to prevent further illness. Methods: A guest list was obtained and a retrospective cohort study undertaken. A combination of paper-based and telephone questionnaires was used to collect exposure and outcome information. An environmental investigation was conducted by Food Policy and Programs Branch at the implicated premises. Results: All 57 guests completed the questionnaire (100% response rate), and 15 met the case definition. Analysis showed a significant association between illness and consumption of chicken liver pâté (relative risk: 16.7, 95% confidence interval: 2.4–118.6). No other food or beverage served at the party was associated with illness. Three guests submitted stool samples; all were positive for Campylobacter. The environmental investigation identified that the cooking process used in the preparation of chicken liver pâté may have been inconsistent, resulting in some portions not cooked adequately to inactivate potential Campylobacter contamination. Discussion: Chicken liver products are a known source of Campylobacter infection; therefore, education of food handlers remains a high priority. To better identify outbreaks among the large number of Campylobacter notifications, routine typing of Campylobacter isolates is recommended.

  9. Supporting Success for All Students

    Science.gov (United States)

    Manduca, C. A.; Macdonald, H.; McDaris, J. R.; Weissmann, G. S.

    2015-12-01

    The geoscience student population in the United States today does not reflect the diversity of the US population. Not only does this challenge our ability to educate sufficient numbers of students in the geosciences, it also challenges our ability to address issues of environmental justice, to bring geoscience expertise to diverse communities, and to pursue a research agenda reflecting the needs and interests of our nation as a whole. Programs that are successful in supporting students from underrepresented groups attend to the whole student (Jolly et al, 2004) as they develop not only knowledge and skills, but a sense of belonging and a drive to succeed in geoscience. The whole student approach provides a framework for supporting the success of all students, be they members of underrepresented groups or not. Important aspects of support include mentoring and advising, academic support, an inclusive learning community, and opportunities to learn about the profession and to develop geoscience and professional skills. To successfully provide support for the full range of students, it is critical to consider not only what opportunities are available but the barriers different types of students face in accessing these opportunities. Barriers may arise from gaps in academic experiences, crossing into a new and unfamiliar culture, lack of confidence, stereotype threat, implicit bias and other sources. Isolation of geoscience learning from its application and social context may preferentially discourage some groups. Action can be taken to increase support for all students within an individual course, a department or an institution. The InTeGrate STEP Center for the Geosciences, the Supporting and Advancing Geoscience Education at Two-Year Colleges program and the On the Cutting Edge Professional Development for Geoscience Faculty program all provide resources for individuals and departments including on line information, program descriptions, and workshop opportunities.

  10. Investor Reaction to Market Surprises on the Istanbul Stock Exchange = İstanbul Menkul Kıymetler Borsasında Piyasa Sürprizlerine Yatırımcı Tepkisi

    Directory of Open Access Journals (Sweden)

    Yaman Ömer ERZURUMLU

    2011-08-01

    Full Text Available This paper examines the reaction of investors to the arrival of unexpected information on the Istanbul Stock Exchange. The empirical results suggest that the investor reaction following unexpected news on the ISE100 is consistent with the Overreaction Hypothesis, especially after unfavorable market surprises. Interestingly, such a pattern does not exist for the ISE30 index, which includes more liquid and informationally efficient securities. A possible implication of this study for investors is that employing a semi-contrarian investment strategy of buying losers in the ISE100 may generate superior returns. Moreover, the results are supportive of the latest regulation change of the Capital Market Board of Turkey, which mandates more disclosure regarding the trading of less liquid stocks with lower market capitalization.

  11. A Risk Communication Success Story

    Science.gov (United States)

    Peecook, Keith

    2010-01-01

    A key success of the decommissioning effort at the National Aeronautics and Space Administration's (NASA's) Plum Brook Reactor Facility (PBRF) has been the public outreach program. The approach has been based on risk communications rather than a public relations approach. As a result it has kept the public feeling more involved in the process. It ensures they have the information needed to understand the project and its goals, and to make recommendations. All this is done so that NASA can better plan and execute the necessary work without delays or surprises.

  12. Ikea success in chinese furniture

    OpenAIRE

    Yihong, Li

    2007-01-01

    This thesis focuses on the market exploitation and development of IKEA in China, analysing the characteristics of the Chinese market and the supply and demand for IKEA products in China. It also analyses the behaviour of the main Chinese consumers and evaluates the furniture retail market in China. IKEA is a successful case of opening up the Chinese market in recent years. The main goal is to acquire this information in order to provide overseas retailers with a good starting point for creating an effective bu...

  13. CERN LHC dipole prototype success

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    In a crash programme, the first prototype superconducting dipole magnet for CERN's LHC proton-proton collider was successfully powered for the first time at CERN on 14 April, eventually sailing to 9 T, above the 8.65 T nominal LHC field, before quenching for the third time. The next stage is to install the delicate measuring system for making comprehensive magnetic field maps in the 10 m long, 50 mm diameter twin apertures of the magnet. These measurements will check that the required LHC field quality has been achieved at both the nominal and injection fields.

  14. Double success for neutrino lab

    CERN Multimedia

    2010-01-01

    "The Gran Sasso National Laboratory in Italy is celebrating two key developments in the field of neutrino physics. Number one is the first ever detection, by the OPERA experiement, of possible tau neutrino that has switched its identity from a muon neutrino as it travelled form its origins at CERN in Switzerland to the Italian lab. Number two is the successful start-up of the ICARUS detector, which, like OPERA, is designed to study neutrinos that "oscillate" between types" (0.5 pages)

  15. Political Economy: Success or Failure?

    Directory of Open Access Journals (Sweden)

    Bruno S. Frey

    2012-09-01

    Full Text Available The Political Economy and Public Choice approaches have promoted the study of interactions between the economy and the polity for over 60 years now. The present paper endeavours to provide a critical discussion of this literature and its achievements. In particular, it begins with the different approaches based on empirically tested or politometric models and it then proceeds to discuss different studies of the effects that particular rules of the game have on politico-economic outcomes. The third section of the paper will address studies that take institutions to be endogenous and aims to explain why particular institutions emerge. Finally, the question of whether Political Economy has been a success or a failure will be tackled. While the success in terms of the position it has gained in economic research and teaching is undeniable, a look at one of the most thriving recent areas of economics, happiness research, will reveal that some of its fundamental lessons are all too often disregarded.

  16. Towards Successful Cloud Ordering Service

    Directory of Open Access Journals (Sweden)

    Chen Yan-Kwang

    2015-03-01

    Full Text Available Background: The rise of cloud services has led to a drastic growth of e-commerce and a greater investment in development of new cloud services systems by related industries. For SaaS developers, it is important to understand customer needs and make use of available resources as early as the system design and development stage. Objectives: This study integrates the E-commerce Systems (ECS) Success model and Importance-Performance Analysis (IPA) into empirical research of the critical factors for cloud ordering system success. Methods/Approach: Survey research is conducted to collect data on customer perceptions of the importance and performance of each attribute of the particular cloud ordering service. The sample is further divided according to the degree of use of online shopping into high-usage users and low-usage users in order to explore their views regarding the system and generate adequate coping strategies. Results: Developers of online ordering systems can refer to the important factors obtained in this study when planning strategies of product/service improvement. Conclusions: The approach proposed in this study can also be applied to evaluation of other kinds of cloud services systems.

  17. Neste Corporation - a successful year

    International Nuclear Information System (INIS)

    Ihamuotila, J.

    1991-01-01

    The past year proved a successful one for Neste Corporation. Profitability was good and operations were consistently developed. Neste is committed to giving high priority to productivity and know-how to ensure that this success continues into the future. Important developments affecting the structure of Neste Corporation during 1990 included the amalgamation of Neste's oil-related activities into a single division, the increasing concentration of Neste Chemicals' activities in Central and Southern Europe and a major strengthening of oil exploration and production operations. Neste Oil turned in a good result during 1990. Neste imported a total of 8.9 million tonnes of crude oil during 1990. Imports from the Soviet Union, at 5.2 million tonnes, were over 2 million tonnes less than planned. Some 2.5 million tonnes were imported from the North Sea, and 1.2 million tonnes from the Middle East. The year was one of expansion, diversification, and solid profit for Neste Chemicals. Net sales grew by 18% compared to 1989 and the division recorded a satisfactory performance. Petrochemicals and polyolefins production increased substantially as a result of plants completed, acquired, or leased during 1989. The gas division's net sales during 1990 were 46% higher than during 1989. This growth largely resulted from an increase in the consumption of natural gas and an expansion in the volume of international IPG business. The division's profitability remained satisfactory.

  18. LS1 Report: Successful tests

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    At the PS Booster, the new beam dump and the associated shielding blocks surrounding it have been successfully installed and the installation of the beam transfer lines are now under way. The BI.SMH septum magnet has been successfully repaired following a confirmed vacuum leak.   At the PS, the consolidation of the seven main PS magnets has started, and the replacement of the old cooling and ventilation system continues to progress well. At the SPS, the replacement of the irradiated cables in Long Straight Section 1 (LSS1) of the SPS is now well under way and proceeding well. At the LHC, the Superconducting Magnets and Circuits Consolidation (SMACC) project remains ongoing. The closure of internal sleeves has begun in sector 7-8, and the shunt installations, a major consolidation activity, are progressing well in sector 8-1. The equivalent of more than one sector's outer sleeves (W) have been closed, and leak tests are in progress in several sub-sector...

  19. Successful Social Enterprises in Africa

    DEFF Research Database (Denmark)

    Thisted, Karen Panum; Hansen, Michael W.

    as each constitutes solid evidence of social routes to success at the BOP, they also reveal important dilemmas facing managers who each day are forced to make difficult decisions in order to strike the right balance between achieving both commercial and social goals. Thereby the paper also adds......As part of the greater focus on the role of firms and entrepreneurship in development, spotlight has recently fallen upon so-called ‘social enterprises’. Social enterprises are organizations that operate in the borderland between the for-profit and non-for-profit spheres. The inherent purpose...... of the poverty related development challenges endemic to Sub-Saharan Africa. Hence, this paper presents six tales of social enterprises from the Kenyan BOP, who all have managed to pursue a social agenda while at the same time achieving commercial viability. While the cases contribute to the BOP literature...

  20. Success factors of agricultural company

    Directory of Open Access Journals (Sweden)

    Helena Chládková

    2013-01-01

    Full Text Available This paper focuses on developing a proposal to eliminate weaknesses in a medium-sized farm and thus improve its market position. To determine the position of the company's products relative to the competition, a BCG Matrix (Boston Consulting Group Matrix) was used. Financial Ratio Analysis was used to identify the company's financial situation. With the help of a situational analysis of the company's internal environment, success factors, strengths and weaknesses were defined. Further, proposals were designed to remove selected weaknesses. The following strengths were identified in the farm: quality of managers, long-term and stable customer-supplier relationships, selling commodities at optimum moisture content and purity, the use of subsidies and high milk yield cows. The main weaknesses were pig breeding, the missing website, company profitability and obsolete buildings. Among other things, it was suggested to create a website for the farm.

  1. Success stories in nuclear science

    International Nuclear Information System (INIS)

    Fox, M.R.

    1990-01-01

    The low level of public understanding of energy in general, and nuclear energy in particular, in the United States is well known, especially by the world's scientific community. A technologically leading nation such as the United States will not remain so for long if fear, anxiety, worry, anger, and technological misinformation continue to influence, if not drive, science and energy policy. Our society, our freedom, and even our national security are at risk when sound science and energy policies are inhibited or prevented. As a scientific organization, the American Nuclear Society believes that it is our responsibility, not merely an obligation, to get involved with the educational processes of our nation. Through the Public Information Committee of the ANS, a variety of educational activities have been undertaken, with remarkable success. This presentation describes some of these and some of the many lessons learned from these activities and about ourselves.

  2. Minimal Marking: A Success Story

    Directory of Open Access Journals (Sweden)

    Anne McNeilly

    2014-11-01

    Full Text Available The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983, which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional approach of “teacher-corrects-all.” Results suggest it would be effective, not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.

  3. A Canadian isotope success story

    International Nuclear Information System (INIS)

    Malkoske, G.

    1997-01-01

    This paper provides some historical background on the commercial production of radioisotopes in Canada, and the evolution of the present vendor, MDS Nordion. The chief isotopes are molybdenum-99, iodine-131, and cobalt-60. Cobalt-60 for medical sterilization and irradiation is considered to be a significant growing market. Food irradiation is believed to be a big marketing opportunity, although attempts to popularize it have so far met with limited success. CANDU reactors supply the bulk of the world's cobalt-60 supply. Eighty percent of the world's molybdenum-99 supply for medical imaging comes from Canada, and is at present produced in the NRU reactor, which is to be replaced by two Maple reactors coming into production in 1999 and 2000.

  4. CERN vector boson hunt successful

    International Nuclear Information System (INIS)

    Robinson, A.L.

    1983-01-01

    UA-1 and UA-2 are code names for two groups of physicists at the European Laboratory for Particle Physics (CERN), together comprising almost 200 researchers. From data collected in two 3-month-long runs last fall and spring, the groups have collected 100 intermediate vector bosons (90 W's and 10 Z0's) whose properties so far fit the predictions of the unified quantum field theory of the electromagnetic and weak forces. Although the number of events is short of staggering, the discovery is immensely important. Physicists have been looking for the W for about 50 years. The Z0 is crucial to the success of the method by which the two forces were melded into one - the electro-weak force.

  5. Proactive Assessment for Collaboration Success

    Directory of Open Access Journals (Sweden)

    Teresa L. Ju

    2014-07-01

    Full Text Available This study describes a government–academia–industry joint training project that produces Vietnamese midlevel technical managers. To ensure collaboration success, a proactive assessment methodology was developed as a supplement to the conventional project management practices. In the postproject feedback, the funding agencies acknowledged that the project fulfilled its contractual obligations and achieved its objectives. The implementing university was pleased as it broke ground in this type of collaboration in Taiwan. The industrial partners, however, were not so sure about the effectiveness of this collaborative training endeavor because there were many skirmishes between company supervisors and Vietnamese interns caused by the interns’ self-interested perception and expectation. Consequently, a theoretical framework for predicting internship acceptance and preventing unfavorable perceptions was proposed to strengthen the proactive assessment methodology. Collaboration research, funding agencies, academia, and industry could all benefit from this study.

  6. High-Impact Succession Management. Executive Summary

    Science.gov (United States)

    Lamoureux, Kim; Campbell, Michael; Smith, Roland

    2009-01-01

    Most companies have an opportunity to improve their succession management programs. The number one challenge for succession management (as identified by both HR leaders and executives) is developing a succession planning strategy. This comprehensive industry study sets out to determine how succession management (when done well) helps improve…

  7. Defining Success in Open Science.

    Science.gov (United States)

    Ali-Khan, Sarah E; Jean, Antoine; MacDonald, Emily; Gold, E Richard

    2018-01-01

    Mounting evidence indicates that, worldwide, innovation systems are increasingly unsustainable. Equally, concerns about inequities in the science and innovation process, and in access to its benefits, continue. Against a backdrop of growing health, economic and scientific challenges, global stakeholders are urgently seeking to spur innovation and maximize the just distribution of benefits for all. Open Science collaboration (OS) - comprising a variety of approaches to increase open, public, and rapid mobilization of scientific knowledge - is seen to be one of the most promising ways forward. Yet, many decision-makers hesitate to construct policy to support the adoption and implementation of OS without access to substantive, clear and reliable evidence. In October 2017, international thought-leaders gathered at an Open Science Leadership Forum in the Washington DC offices of the Bill and Melinda Gates Foundation to share their views on what successful Open Science looks like. Delegates from developed and developing nations, national governments, science agencies and funding bodies, philanthropy, researchers, patient organizations and the biotechnology, pharma and artificial intelligence (AI) industries discussed the outcomes that would rally them to invest in OS, as well as wider issues of policy and implementation. This first of two reports summarizes delegates' views on what they believe OS will deliver in terms of research, innovation and social impact in the life sciences. Through an open and collaborative process over the next months, we will translate these success outcomes into a toolkit of quantitative and qualitative indicators to assess when, where and how open science collaborations best advance research, innovation and social benefit. Ultimately, this work aims to develop and openly share tools to allow stakeholders to evaluate and re-invent their innovation ecosystems, to maximize value for the global public and patients, and address long-standing questions

  8. Evolution, Appearance, and Occupational Success

    Directory of Open Access Journals (Sweden)

    Anthony C. Little

    2012-12-01

    Full Text Available Visual characteristics, including facial appearance, are thought to play an important role in a variety of judgments and decisions that have real occupational outcomes in many settings. Indeed, there is growing evidence suggesting that appearance influences hiring decisions and even election results. For example, attractive individuals are more likely to be hired, taller men earn more, and the facial appearance of candidates has been linked to real election outcomes. In this article, we review evidence linking physical appearance to occupational success and evaluate the hypothesis that appearance based biases are consistent with predictions based on evolutionary theories of coalition formation and leadership choice. We discuss why appearance based effects are so pervasive, addressing ideas about a “kernel of truth” in attributions and about coalitional psychology. We additionally highlight that appearance may be differently related to success at work according to the types of job or task involved. For example, leaders may be chosen because the characteristics they possess are seen as best suited to lead in particular situations. During a time of war, a dominant-appearing leader may inspire confidence and intimidate enemies while during peace-time, when negotiation and diplomacy are needed, interpersonal skills may outweigh the value of a dominant leader. In line with these ideas, masculine-faced leaders are favored in war-time scenarios while feminine-faced leaders are favored in peace-time scenarios. We suggest that such environment or task specific competencies may be prevalent during selection processes, whereby individuals whose appearance best matches perceived task competences are most likely selected, and propose the general term “task-congruent selection” to describe these effects. Overall, our review highlights how potentially adaptive biases could influence choices in the work place. With respect to certain biases

  9. Obesity management: what brings success?

    Science.gov (United States)

    Lagerros, Ylva Trolle; Rössner, Stephan

    2013-01-01

    The upward trend in obesity prevalence across regions and continents is a worldwide concern. Today a majority of the world's population live in a country where being overweight or obese causes more deaths than being underweight. Only a portion of those qualifying for treatment will get the health care they need. Still, a minor weight loss of 5-10% seems to be sufficient to provide a clinically significant health benefit in terms of risk factors for cardiovascular disease and diabetes. Diet, exercise and behavior modifications remain the current cornerstones of obesity treatment. Weight-loss drugs play a minor role. Drugs which were available and reasonably effective have been withdrawn because of side effects. The fact that the 'old' well known, but pretty unexciting tools remain the basic armamentarium causes understandable concern and disappointment among both patients and therapists. Hence, bariatric surgery has increasingly been recognized and developed, as it offers substantial weight loss and prolonged weight control. The present review highlights the conventional tools to counter obesity, lifestyle modification, pharmacotherapy and bariatric surgery, including some of the barriers to successful weight loss: (1) unrealistic expectations of success; (2) high attrition rates; (3) cultural norms of self-acceptance in terms of weight and beliefs of fat being healthy; (4) neighborhood attributes such as a lack of well-stocked supermarkets and rather the presence of convenience stores with low-quality foods; and (5) the perception of the neighborhood as less safe and with low walkability. Prevention is the obvious key. Cost-effective societal interventions such as a tax on unhealthy food and beverages, front-of-pack traffic light nutrition labeling and prohibition of advertising of junk food and beverages to children are also discussed.

  10. Critical success factors for achieving superior m-health success.

    Science.gov (United States)

    Dwivedi, A; Wickramasinghe, N; Bali, R K; Naguib, R N G

    2007-01-01

    Recent healthcare trends clearly show significant investment by healthcare institutions into various types of wired and wireless technologies to facilitate and support superior healthcare delivery. This trend has been spurred by the shift in the concept and growing importance of the role of health information and the influence of fields such as bio-informatics, biomedical and genetic engineering. The demand is currently for integrated healthcare information systems; however for such initiatives to be successful it is necessary to adopt a macro model and appropriate methodology with respect to wireless initiatives. The key contribution of this paper is the presentation of one such integrative model for mobile health (m-health) known as the Wi-INET Business Model, along with a detailed Adaptive Mapping to Realisation (AMR) methodology. The AMR methodology details how the Wi-INET Business Model can be implemented. Further validation on the concepts detailed in the Wi-INET Business Model and the AMR methodology is offered via a short vignette on a toolkit based on a leading UK-based healthcare information technology solution.

  11. The Success of Podcasting as a Success for Science Outreach

    Science.gov (United States)

    Haupt, R. J.; Wheatley, P.; Padilla, A. J.; Barnhart, C. J.

    2015-12-01

    Podcasts are downloadable web-hosted audio programs (radio on demand). The medium's popularity has grown immensely since its beginning 10+ years ago. "Science and Medicine" remains a prominent category in iTunes (the most popular podcast marketplace), but is unfortunately inundated with non-scientific and dubious content (e.g. the paranormal, health fads, etc.). It seems unlikely that legitimate science content would thrive in such an environment. However, our experience as an independent science podcast shows it is possible to successfully present authentic science to a general audience and maintain popularity. Our show, Science… sort of, began in the fall of 2009, and we have since produced episodes regularly. As of July 31, 2015, our feed hosts 235 episodes, with an average of ~6,700 downloads per episode, and over 1.6 million total downloads originating from all across the globe. Thanks to listener involvement and contribution, the show is financially self-sustaining. While production requires a significant time input, no external financial support from the creators or other granting agencies is needed. Traditional media outlets rely on advertisers, thus pressuring shows to produce "popular" content featuring science celebrities. In contrast, independent podcasts can interview big name science communicators, such as Dr. Neil DeGrasse Tyson, while also exploring the research of graduate students and early career scientists. This level playing field provides an unprecedented opportunity for studies to reach a global audience and share research that previously may have been seen only by those attending a specialized conference or subscribing to niche journals. Further, direct public engagement helps the audience personally connect to the research and researcher. In combination with other social media platforms, podcasting is a powerful tool in the outreach arsenal, enabling one to share science directly with the world in a way that both educates and excites listeners.

  12. BUSINESS SUCCESS IN TODAY'S ROMANIA: OPINIONS EXPRESSED BY STUDENTS AND ENTREPRENEURS

    Directory of Open Access Journals (Sweden)

    Elena NEDELCU

    2016-06-01

    Full Text Available We consider that a study which contributes to further knowledge of the entrepreneurial spirit of Romanian students (to what extent and in what manner this spirit manifests itself, the students' and entrepreneurs' relation to the business environment, and the current challenges of the workforce) is both necessary and useful. Moreover, the present study aims at identifying possible differences between the way in which students evaluate and the way in which entrepreneurs assess certain elements that make up the Romanian business environment and that might contribute to their business success. What are "the keys to success" in business according to students? What about the entrepreneurs? What would be more useful for business success: knowledge of success patterns, training and qualification, access to information or to financial resources, competence (knowing what to do), or a friendly business environment? The research method we used is a social inquiry based on surveys. The survey was administered to 1,500 students from universities in Bucharest. The analysis of the data was surprising, because "coping personal abilities" turned out to be "the keys to success" in business in Romania according to both students (67%) and entrepreneurs (86%). The significant differences between the students' and entrepreneurs' answers concerned the "professional competence" criterion and the "rules observance" criterion; in comparison with entrepreneurs, students rate these criteria as more important.

  13. The Origin of Carbon-bearing Volatiles in Surprise Valley Hot Springs in the Great Basin: Carbon Isotope and Water Chemistry Characterizations

    Science.gov (United States)

    Fu, Qi; Socki, Richard A.; Niles, Paul B.; Romanek, Christopher; Datta, Saugata; Darnell, Mike; Bissada, Adry K.

    2013-01-01

    There are numerous hydrothermal fields within the Great Basin of North America, some of which have been exploited for geothermal resources. Methane and other carbon-bearing compounds have been observed, in some cases at high concentrations; however, their origins and formation conditions remain unknown. Studying hydrothermal springs in this area therefore provides an opportunity to expand our knowledge of the subsurface (bio)chemical processes that generate organic compounds in hydrothermal systems, and to aid in the future development and exploration of potential energy resources as well. While isotope measurements have long been used to recognize the origins of such compounds, several secondary processes may generate variations in isotopic compositions: oxidation, re-equilibration of methane and other alkanes with CO2, mixing with compounds from other sources, etc. Therefore, in addition to isotopic analysis, other evidence, including water chemistry and rock compositions, is necessary to identify volatile compounds of different sources. Surprise Valley Hot Springs (SVHS, 41 deg 32'N, 120 deg 5'W), located in a typical basin and range province valley in northeastern California, is a terrestrial hydrothermal spring system of the Great Basin. Previous geophysical studies indicated the presence of clay-rich volcanic and sedimentary rocks of Tertiary age beneath lava flows of late Tertiary and Quaternary age. Water and gas samples were collected for a variety of chemical and isotope composition analyses, including in-situ pH, alkalinity, conductivity, oxidation reduction potential (ORP), major and trace elements, and C and H isotope measurements. Fluids issuing from SVHS can be classified as Na-(Cl)-SO4 type, with the major cation and anion being Na+ and SO4(2-), respectively. Thermodynamic calculations using ORP and major element data indicated that sulfate is the dominant sulfur species, which is consistent with the anion analysis results. Aquifer temperatures at depth

  14. The Origin of Carbon-bearing Volatiles in Surprise Valley Hot Springs in the Great Basin: Carbon Isotope and Water Chemistry Characterizations

    Science.gov (United States)

    Fu, Qi; Socki, Richard A.; Niles, Paul B.; Romanek, Christopher; Datta, Saugata; Darnell, Mike; Bissada, Adry K.

    2013-01-01

    There are numerous hydrothermal fields within the Great Basin of North America, some of which have been exploited for geothermal resources. Methane and other carbon-bearing compounds have been observed, in some cases at high concentrations; however, their origins and formation conditions remain unknown. Studying hydrothermal springs in this area therefore provides an opportunity to expand our knowledge of the subsurface (bio)chemical processes that generate organic compounds in hydrothermal systems, and to aid in the future development and exploration of potential energy resources as well. While isotope measurements have long been used to recognize the origins of such compounds, several secondary processes may generate variations in isotopic compositions: oxidation, re-equilibration of methane and other alkanes with CO2, mixing with compounds from other sources, etc. Therefore, in addition to isotopic analysis, other evidence, including water chemistry and rock compositions, is necessary to identify volatile compounds of different sources. Surprise Valley Hot Springs (SVHS, 41º32'N, 120º5'W), located in a typical basin and range province valley in northeastern California, is a terrestrial hydrothermal spring system of the Great Basin. Previous geophysical studies indicated the presence of clay-rich volcanic and sedimentary rocks of Tertiary age beneath lava flows of late Tertiary and Quaternary age. Water and gas samples were collected for a variety of chemical and isotope composition analyses, including in-situ pH, alkalinity, conductivity, oxidation reduction potential (ORP), major and trace elements, and C and H isotope measurements. Fluids issuing from SVHS can be classified as Na-(Cl)-SO4 type, with the major cation and anion being Na+ and SO4 2-, respectively. Thermodynamic calculations using ORP and major element data indicated that sulfate is the dominant sulfur species, which is consistent with the anion analysis results. Aquifer temperatures at depth estimated

  15. Success in transmitting hazard science

    Science.gov (United States)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in

  16. Global sea turtle conservation successes.

    Science.gov (United States)

    Mazaris, Antonios D; Schofield, Gail; Gkazinou, Chrysoula; Almpanidou, Vasiliki; Hays, Graeme C

    2017-09-01

    We document a tendency for published estimates of population size in sea turtles to be increasing rather than decreasing across the globe. To examine the population status of the seven species of sea turtle globally, we obtained 299 time series of annual nesting abundance with a total of 4417 annual estimates. The time series ranged in length from 6 to 47 years (mean, 16.2 years). When levels of abundance were summed within regional management units (RMUs) for each species, there were upward trends in 12 RMUs versus downward trends in 5 RMUs. This prevalence of more upward than downward trends was also evident in the individual time series, where we found 95 significant increases in abundance and 35 significant decreases. Adding to this encouraging news for sea turtle conservation, we show that even small sea turtle populations have the capacity to recover, that is, Allee effects appear unimportant. Positive trends in abundance are likely linked to the effective protection of eggs and nesting females, as well as reduced bycatch. However, conservation concerns remain, such as the decline in leatherback turtles in the Eastern and Western Pacific. Furthermore, we also show that, often, time series are too short to identify trends in abundance. Our findings highlight the importance of continued conservation and monitoring efforts that underpin this global conservation success story.
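
    As a rough illustration of the kind of trend test summarized above (not the study's actual data or code), the sketch below regresses log-transformed annual nesting counts on year for a single simulated time series and reports the estimated growth rate and its p-value; the series, seed and significance threshold are all hypothetical.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(7)
years = np.arange(1995, 2015)                       # a hypothetical 20-year series
# Simulated nesting counts growing ~4% per year with lognormal observation noise.
counts = 200.0 * np.exp(0.04 * (years - years[0])) * rng.lognormal(0.0, 0.15, years.size)

# Trend test: regress log counts on year; the slope approximates the annual growth rate.
fit = linregress(years, np.log(counts))
trend = "upward" if fit.slope > 0 else "downward"
significant = "significant" if fit.pvalue < 0.05 else "not significant"
print(f"estimated growth rate {fit.slope:+.3f} per year "
      f"({trend}, {significant}, p = {fit.pvalue:.4f})")
```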

  17. Developing a successful robotics program.

    Science.gov (United States)

    Luthringer, Tyler; Aleksic, Ilija; Caire, Arthur; Albala, David M

    2012-01-01

    Advancements in robotic surgical technology have revolutionized the standard of care for many surgical procedures. The purpose of this review is to evaluate the important considerations in developing a new robotics program at a given healthcare institution. Patients' interest in robotic-assisted surgery has grown, and continues to grow, because of improved outcomes and decreased periods of hospitalization. Resulting market forces have created a solid foundation for the implementation of robotic surgery into surgical practice. Given proper surgeon experience and an efficient system, robotic-assisted procedures have been cost comparable to open surgical alternatives. Surgeon training and experience is closely linked to the efficiency of a new robotics program. Formally trained robotic surgeons have better patient outcomes and shorter operative times. Training in robotics has shown no negative impact on patient outcomes or mentor learning curves. Individual economic factors of local healthcare settings must be evaluated when planning for a new robotics program. The high cost of the robotic surgical platform is best offset with a large surgical volume. A mature, experienced surgeon is integral to the success of a new robotics program.

  18. Is OSCE successful in pediatrics?

    Directory of Open Access Journals (Sweden)

    M Imani

    2009-02-01

    Full Text Available Background: The Faculty of Medical Sciences, University of Zahedan, implemented the Objective Structured Clinical Examination (OSCE) in the final examination during the 2003–2004 academic year. Simultaneously, the pediatric department initiated faculty and student training and instituted the OSCE as an assessment instrument during the pediatric clerkship in year 5. The study set out to explore student acceptance of the OSCE as part of an evaluation of the pediatric clerkship. Purpose: This study was implemented to evaluate a new method of assessment in medical education in pediatrics. Methods: A self-administered questionnaire was completed by successive groups of students immediately after the OSCE at the end of each clerkship rotation. Main outcome measures were student perceptions of examination attributes, which included the quality of instructions and organization, the quality of performance, the authenticity and transparency of the process, and the usefulness of the OSCE as an assessment instrument compared to other methods. Results: There was overwhelming acceptance of the OSCE in pediatrics with respect to comprehensiveness (90%), transparency (87%), fairness (57%) and authenticity of the required tasks (58–78%). However, students felt that it was a strong anxiety-producing experience, and concerns were expressed regarding the ambiguity of some questions and the inadequacy of time for expected tasks. Conclusion: Student feedback was invaluable in influencing faculty teaching, curriculum direction and appreciation of student opinion. Further psychometric evaluation will strengthen the development of the OSCE. Key words: OSCE, competence assessment

  19. Uniquely positioned for future success

    International Nuclear Information System (INIS)

    Thorn, D.

    2001-01-01

    The author began this presentation by giving a corporate profile of PanCanadian, discussing its evolution over time up to the spin-off of PanCanadian Petroleum Limited from Canadian Pacific to form PanCanadian Energy Corporation in 2001. The company's worldwide activities were mentioned. PanCanadian Petroleum Limited is one of the country's largest producers, is experiencing emerging international exploration success, and markets energy throughout North America. It is also a company that is financially strong. PanCanadian is involved in natural gas, crude oil, natural gas liquids, and power. An outline of PanCanadian Energy Services was provided, along with its strategic positioning. The next section of the presentation focused on Nova Scotia natural gas, beginning with a map displaying Eastern Canada development. The focus then shifted to the Deep Panuke field, where exploration began in early 1999. PanCanadian holds 20 per cent of the Scotian Shelf. Industry activity offshore Nova Scotia began with the first well in 1967, and to date 178 wells have been drilled, of which 106 were exploration wells. There have been 21 significant discoveries. Production at Deep Panuke is expected to begin during the first quarter of 2005. The various approval agencies were listed. With large interests held offshore Nova Scotia and extensive exploration plans, PanCanadian's opportunity in the Eastern Canada, New England and Mid-Atlantic markets has greatly increased. refs., tabs., figs

  20. The success story of crystallography.

    Science.gov (United States)

    Schwarzenbach, Dieter

    2012-01-01

    Diffractionists usually place the birth of crystallography in 1912 with the first X-ray diffraction experiment of Friedrich, Knipping and Laue. This discovery propelled the mathematical branch of mineralogy to global importance and enabled crystal structure determination. Knowledge of the geometrical structure of matter at atomic resolution had revolutionary consequences for all branches of the natural sciences: physics, chemistry, biology, earth sciences and material science. It is scarcely possible for a single person in a single article to trace and appropriately value all of these developments. This article presents the limited, subjective view of its author and a limited selection of references. The bulk of the article covers the history of X-ray structure determination from the NaCl structure to aperiodic structures and macromolecular structures. The theoretical foundations were available by 1920. The subsequent success of crystallography was then due to the development of diffraction equipment, the theory of the solution of the phase problem, symmetry theory and computers. The many structures becoming known called for the development of crystal chemistry and of data banks. Diffuse scattering from disordered structures without and with partial long-range order allows determination of short-range order. Neutron and electron scattering and diffraction are also mentioned.

  1. Nurse manager succession planning: a concept analysis.

    Science.gov (United States)

    Titzer, Jennifer L; Shirey, Maria R

    2013-01-01

    The current nursing leadership pipeline is inadequate and demands strategic succession planning methods. This article provides concept clarification regarding nurse manager succession planning. Attributes common to succession planning include organizational commitment and resource allocation, proactive and visionary leadership approach, and a mentoring and coaching environment. Strategic planning, current and future leadership analysis, high-potential identification, and leadership development are succession planning antecedents. Consequences of succession planning are improved leadership and organizational culture continuity, and increased leadership bench strength. Health care has failed to strategically plan for future leadership. Developing a strong nursing leadership pipeline requires deliberate and strategic succession planning. © 2013 Wiley Periodicals, Inc.

  2. Status and Mating Success Amongst Visual Artists

    Science.gov (United States)

    Clegg, Helen; Nettle, Daniel; Miell, Dorothy

    2011-01-01

    Geoffrey Miller has hypothesized that producing artwork functions as a mating display. Here we investigate the relationship between mating success and artistic success in a sample of 236 visual artists. Initially, we derived a measure of artistic success that covered a broad range of artistic behaviors and beliefs. As predicted by Miller’s evolutionary theory, more successful male artists had more sexual partners than less successful artists but this did not hold for female artists. Also, male artists with greater artistic success had a mating strategy based on longer term relationships. Overall the results provide partial support for the sexual selection hypothesis for the function of visual art. PMID:22059085

  3. The seven S's for successful management.

    Science.gov (United States)

    Davidhizar, R

    1995-03-01

    Becoming a successful manager in a health care agency is, for most new managers, an awesome goal. Successful management is more than knowledge of leadership roles and management functions that can be learned in school or educational workshops. Successful management involves effective use of both the manager's affective and cognitive domains. Mentoring and apprenticeship with a successful nurse leader is, for many novice managers, a highly valuable way to learn management skills, since it allows the techniques of a successful nurse manager to be observed and then modeled. "Seven S's" that provide a framework for managerial success are discussed.

  4. They teach it more successfully

    Science.gov (United States)

    Reyes Ruiz-Gallardo, Jose; Valdés, Arturo; Castaño, Santiago

    2010-05-01

    Science education has been in crisis because of the way in which teachers teach future teachers (McDermott, 1990; Bernal, 2005). For generations, students have been learning science as something already done, based on memorizing a number of contents or formulas that always give a correct answer (CUSE, 1997). Thus, Lederman and Abd-El-Khalick (1998) considered it difficult for future teachers to experience science as something tempting and based on empiricism if they only learn content. To learn science it is necessary to think, to do and to talk (Pujol, 2003). This study presents an experience in which students act as teachers. A total of 160 students from the Faculty of Education participated. Working in cooperative groups of four, they had to design several activities to eliminate typical science misconceptions in children (such as believing that minerals and rocks are the same thing, or misjudging the proportion of the Earth that is flattened at the poles). Some peer groups had to work through the activities as if they were children, question the "teachers", and identify each activity's strengths and weaknesses from a child's point of view. A condition of these activities is that they are not mere teacher demonstrations: the children have to discover the conceptual mistake by themselves through the proposed activity. Afterwards, the teaching groups switch into the children's role, and vice versa, with new activities addressing other misconceptions. The experience was evaluated from two different points of view: a) the students' perception of the experience, and b) final exam outcomes. Results show that 95% of the students prefer concepts to be explained by their peers rather than by the lecturer. In terms of outcomes, 94% of the students who worked through these activities and explanations with their peers answered the exam questions successfully, whereas in previous years, when the lecturer explained the same concepts, this value was only 64%. These results coincide with other experiences concluding that students have more success than the teacher in making concepts understood by their

  5. Prison hospice: an unlikely success.

    Science.gov (United States)

    Craig, E L; Craig, R E

    1999-01-01

    Efforts to introduce hospice and palliative care into American prisons have become fairly widespread, in response to the sharp increase in inmate deaths. The primary impetus originally came from the alarming number of AIDS deaths among prisoners. The new combination therapies have proved very successful in treating AIDS, but are very costly, and many problems must be overcome to ensure their effectiveness in correctional settings. Although the AIDS epidemic seems to be in decline, prisons are experiencing a rise in the number of deaths due to "natural causes." In this article we present a review of the prison hospice scene--the response to this crisis in correctional health care. First, we discuss the challenges facing the introduction of hospice into the correctional setting. Then, we present a brief overview of recent developments and a discussion of some ways hospice components have been adapted for life behind bars. Finally, we indicate some of the prospects for the future. Hospice professionals, armed with thorough professional training and years of experience, often fear that correctional health care providers will only parody superficial aspects of the hospice approach. Continual nudging and nurturing by local and state hospice professionals is required in order to bring about this change in the first place and to sustain it through time. Prison hospice workers need not only initial training, but also ongoing education and personal contact with experienced hospice professionals. While the interest of the big national organizations is necessary, the real action happens when local hospices work with nearby prisons to attend to the needs of dying inmates.

  6. The scientific success of JET

    International Nuclear Information System (INIS)

    Keilhacker, M.; Gibson, A.; Gormezano, C.; Rebut, P.H.

    2001-01-01

    The paper highlights the JET work in physics and technology during the period of the JET Joint Undertaking (1978-1999), with special emphasis on what has been learned for extrapolation to a NEXT STEP device. - Global confinement scaling has been extended to high currents and heating powers. Dimensionless scaling experiments of ELMy H mode plasmas suggest that bulk plasma transport is gyro-Bohm and predict ignition for a device with ITER-FDR parameters. Experiments in which the plasma elongation and triangularity were varied independently show a strong increase of confinement time with elongation (τ_E ∼ κ_α^(0.8±0.3)), thus supporting a basic design principle of ITER-FEAT. With the Pellet Enhanced Performance (PEP) mode, JET discovered the beneficial effect of reversed magnetic shear on confinement, opening the possibility of advanced tokamak scenarios. - With a three stage programme of progressively more closed divertors, JET has demonstrated the benefits of divertor closure, in particular of high divertor neutral pressure, which facilitates helium removal. It has also shown that in detached (or semi-detached) radiative divertor plasmas the average power load on the target plates of a NEXT STEP device should be tolerable but, in addition, that the transient power loads during ELMs could cause problems. - In 1991 JET demonstrated the first ever controlled production of a megawatt of fusion power. More extensive D-T experiments in 1997 (DTE1) established new records in fusion performance: 16 MW of transient fusion power with Q_in = 0.62 (i.e. close to breakeven, Q_in = 1) and 4 MW of steady state fusion power with Q_in = 0.18 for 4 s. DTE1 also allowed a successful test of various reactor ICRF heating schemes and a clear demonstration of alpha particle heating, consistent with classical expectations. - JET has developed and tested some of the most important technologies for a NEXT STEP and a reactor, in particular the safe handling of tritium and the

  7. CSM a success in Bangladesh.

    Science.gov (United States)

    1983-01-01

    The Bangladesh Social Marketing Project (SMP), providing contraceptives at an annual rate of 931,000 couple years of protection (CYP) as of June 1983, is a success. This figure has grown markedly since the start of the program in late 1975, when the SMP provided 80,000 CYPs, or 8% of the nonclinical protection provided. The SMP has contributed to the steadily increasing national nonclinical contraceptive distribution. Currently, SMP distribution accounts for as much as the government and nongovernment programs combined. When clinical methods (including sterilizations) are added to national distribution, the SMP share represents about 28% of total contraceptive use. The SMP does not provide clinical methods, but the entire increase in nonclinical protection provided by the national program since 1975 has been the result of SMP product sales. The SMP utilizes the available mass media for promotion, including print, radio and television, as well as outdoor media and point-of-purchase materials. Mobile Film Units (MFUs) are an innovative promotional method employed by the SMP. Approximately 80 night-time outdoor showings are organized each month in rural areas by SMP promoters. Typically, several short films, usually a popular story with a family planning theme, are run. Between the films the SMP products are advertised. Products are often sold during and after the films. Retail outlets for SMP products include general stores, pharmacies, and other small shops. When the products were introduced in 1975, retail outlets totaled 7,500. By August 1983 the number of countrywide retailers carrying SMP products had grown to nearly 100,000. In 1982 a marketing strategy emphasizing the role of doctors and rural medical practitioners (RMPs) was introduced. There are between 70,000 and 100,000 RMPs in Bangladesh. They are well known and respected "doctors" in their villages and add an extensive family planning outreach to the SMP system. The most important advantage of using the RMPs is their

  8. Key performance indicators for successful simulation projects

    OpenAIRE

    Jahangirian, M; Taylor, SJE; Young, T; Robinson, S

    2016-01-01

    There are many factors that may contribute to the successful delivery of a simulation project. To provide a structured approach to assessing the impact various factors have on project success, we propose a top-down framework whereby 15 Key Performance Indicators (KPI) are developed that represent the level of successfulness of simulation projects from various perspectives. They are linked to a set of Critical Success Factors (CSF) as reported in the simulation literature. A single measure cal...

  9. Project Success in Agile Development Software Projects

    Science.gov (United States)

    Farlik, John T.

    2016-01-01

    Project success has multiple definitions in the scholarly literature. Research has shown that some scholars and practitioners define project success as the completion of a project within schedule and within budget. Others consider a successful project as one in which the customer is satisfied with the product. This quantitative study was conducted…

  10. Crop succession requirements in agricultural production planning

    NARCIS (Netherlands)

    Klein Haneveld, W.K.; Stegeman, A.

    2005-01-01

    A method is proposed to write crop succession requirements as linear constraints in an LP-based model for agricultural production planning. Crop succession information is given in the form of a set of inadmissible successions of crops. The decision variables represent the areas where a certain
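
    A minimal sketch of how such succession requirements can be expressed as linear constraints, assuming a simplified single-field formulation that is not necessarily the authors' exact model: the area planted with crop c1 in period t plus the area planted with crop c2 in period t+1 is bounded by the field area, so an inadmissible succession c1 -> c2 cannot occur across the whole field. The crops, margins and rotation rules below are hypothetical.

```python
from itertools import product

import numpy as np
from scipy.optimize import linprog

crops = ["wheat", "potato", "beet"]                          # hypothetical crops
periods = [0, 1]                                             # two consecutive growing periods
AREA = 100.0                                                 # field size (ha)
margin = {"wheat": 1.0, "potato": 1.8, "beet": 1.4}          # hypothetical gross margins
inadmissible = [("potato", "potato"), ("beet", "potato")]    # forbidden successions c1 -> c2

# x[c, t] = area of crop c grown in period t; flatten to a vector for linprog.
idx = {(c, t): i for i, (c, t) in enumerate(product(crops, periods))}
n = len(idx)

# Maximize total margin, i.e. minimize its negative.
c_obj = np.zeros(n)
for (c, t), i in idx.items():
    c_obj[i] = -margin[c]

A_ub, b_ub = [], []
# Land availability: total planted area per period cannot exceed the field.
for t in periods:
    row = np.zeros(n)
    for c in crops:
        row[idx[(c, t)]] = 1.0
    A_ub.append(row)
    b_ub.append(AREA)
# Succession constraints: for an inadmissible pair (c1, c2), the area of c1 in
# period t plus the area of c2 in period t+1 may not exceed the field area.
for (c1, c2), t in product(inadmissible, periods[:-1]):
    row = np.zeros(n)
    row[idx[(c1, t)]] = 1.0
    row[idx[(c2, t + 1)]] = 1.0
    A_ub.append(row)
    b_ub.append(AREA)

res = linprog(c_obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0.0, AREA)] * n, method="highs")
for (c, t), i in sorted(idx.items(), key=lambda kv: (kv[0][1], kv[0][0])):
    print(f"period {t}: {c:6s} {res.x[i]:6.1f} ha")
```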

  11. Critical success factors for managing purchasing groups

    NARCIS (Netherlands)

    Schotanus, Fredo; Telgen, Jan; de Boer, L.

    2010-01-01

    In this article, we identify critical success factors for managing small and intensive purchasing groups by comparing successful and unsuccessful purchasing groups in a large-scale survey. The analysis of our data set suggests the following success factors: no enforced participation, sufficient

  12. Endoscopic Third Ventriculostomy: Success and Failure.

    Science.gov (United States)

    Deopujari, Chandrashekhar E; Karmarkar, Vikram S; Shaikh, Salman T

    2017-05-01

    Endoscopic third ventriculostomy (ETV) has now become an accepted mode of hydrocephalus treatment in children. Varying degrees of success for the procedure have been reported depending on the type and etiology of hydrocephalus, age of the patient and certain technical parameters. Review of these factors for predictability of success, complications and validation of success score is presented.

  13. A meta-model perspective on business models

    NARCIS (Netherlands)

    Alberts, Berend Thomas; Meertens, Lucas Onno; Iacob, Maria Eugenia; Nieuwenhuis, Lambertus Johannes Maria; Shishkov, Boris

    2013-01-01

    The business model field of research is a young and emerging discipline that finds itself confronted with the need for a common language, lack of conceptual consolidation, and without adequate theoretical development. This not only slows down research, but also undermines business model’s usefulness

  14. A Metamodel for Crustal Magmatism: Phase Equilibria of Giant Ignimbrites

    OpenAIRE

    Fowler, Sarah J.; Spera, Frank J.

    2017-01-01

    Diverse explanations exist for the large-volume catastrophic eruptions that formed the Bishop Tuff of Long Valley in eastern California, the Bandelier Tuff in New Mexico, and the tuffs of Yellowstone, Montana, USA. These eruptions are among the largest on Earth within the last 2 Myr. A common factor in recently proposed petrogenetic scenarios for each system is multistage processing, in which a crystal mush forms by crystal fractionation and is then remobilized to liberate high-silica liquids...

  15. A Java Bytecode Metamodel for Composable Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Rensink, Arend; Aksit, Mehmet; Seidl, Martina; Zschaler, Steffen

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java; and access the program to be analyzed through libraries for manipulating intermediate code, such

  16. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval
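
    The sketch below is a heavily simplified, hypothetical variant of the idea (not the paper's algorithm): it searches over subsets of a fixed basis of terms by simulated annealing, estimates the constants of each candidate by linear least squares, and trades off fit error against expression complexity through a scalarized penalty rather than a true Pareto archive of non-dominated solutions.

```python
import math
import random

import numpy as np

random.seed(0)
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=200)
y = 1.5 * np.sin(X) + 0.5 * X**2 + rng.normal(0, 0.05, size=200)  # toy target data

BASIS = [np.sin, np.cos, np.tanh, lambda x: x, lambda x: x**2, lambda x: x**3]
NAMES = ["sin(x)", "cos(x)", "tanh(x)", "x", "x^2", "x^3"]

def fit(terms):
    """Estimate the constants by linear least squares for the chosen basis terms."""
    A = np.column_stack([BASIS[i](X) for i in terms] + [np.ones_like(X)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    rmse = float(np.sqrt(np.mean((A @ coef - y) ** 2)))
    return coef, rmse

def mutate(terms):
    """Randomly add or drop one basis term, keeping the model non-empty."""
    t = set(terms)
    if t and random.random() < 0.5:
        t.discard(random.choice(sorted(t)))
    else:
        t.add(random.randrange(len(BASIS)))
    return sorted(t) or [random.randrange(len(BASIS))]

def energy(rmse, n_terms, w=0.02):
    # Scalarized stand-in for the error/complexity trade-off; a real Pareto
    # Simulated Annealing would keep an archive of non-dominated expressions.
    return rmse + w * n_terms

current = [0]
_, cur_rmse = fit(current)
T = 1.0
for _ in range(2000):
    cand = mutate(current)
    _, cand_rmse = fit(cand)
    d = energy(cand_rmse, len(cand)) - energy(cur_rmse, len(current))
    if d < 0 or random.random() < math.exp(-d / T):
        current, cur_rmse = cand, cand_rmse
    T *= 0.998  # geometric cooling schedule

coef, rmse = fit(current)
expr = " + ".join(f"{c:.2f}*{NAMES[i]}" for c, i in zip(coef, current))
print(f"best expression: {expr} + {coef[-1]:.2f}   (RMSE = {rmse:.3f})")
```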

  17. Ontology Driven Meta-Modeling of Service Oriented Architecture

    African Journals Online (AJOL)

    pc

    2018-03-05


  18. Metamodel-based robust simulation-optimization : An overview

    NARCIS (Netherlands)

    Dellino, G.; Meloni, C.; Kleijnen, J.P.C.; Dellino, Gabriella; Meloni, Carlo

    2015-01-01

    Optimization of simulated systems is the goal of many methods, but most methods assume known environments. We, however, develop a "robust" methodology that accounts for uncertain environments. Our methodology uses Taguchi's view of the uncertain world but replaces his statistical techniques by
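
    As a rough, assumption-laden illustration of this Taguchi-style view (not the authors' methodology), the sketch below crosses a set of candidate decision values with sampled environmental scenarios, fits simple quadratic regression metamodels for the mean and the dispersion of the simulation output, and then picks a "robust" decision that penalizes variability instead of minimizing the predicted mean alone. The toy simulation model, penalty weight and sample sizes are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(d: float, e: float) -> float:
    """Hypothetical noisy simulation: cost depends on the design d and the environment e."""
    return (d - 2.0) ** 2 + 0.8 * d * e + rng.normal(0.0, 0.1)

designs = np.linspace(0.0, 4.0, 15)           # candidate decision values
scenarios = rng.normal(0.0, 1.0, size=30)     # sampled environmental ("noise") factors

# Cross every design with every scenario and record mean and dispersion of the output.
means, stds = [], []
for d in designs:
    out = np.array([simulate(d, e) for e in scenarios])
    means.append(out.mean())
    stds.append(out.std(ddof=1))

# Quadratic regression metamodels for the mean and the dispersion as functions of d.
Phi = np.column_stack([np.ones_like(designs), designs, designs ** 2])
beta_mean, *_ = np.linalg.lstsq(Phi, np.array(means), rcond=None)
beta_std, *_ = np.linalg.lstsq(Phi, np.array(stds), rcond=None)

grid = np.linspace(0.0, 4.0, 401)
G = np.column_stack([np.ones_like(grid), grid, grid ** 2])
pred_mean, pred_std = G @ beta_mean, G @ beta_std

penalty = 2.0                                  # assumed risk-aversion weight
nominal_d = grid[np.argmin(pred_mean)]
robust_d = grid[np.argmin(pred_mean + penalty * pred_std)]
print(f"nominal optimum d = {nominal_d:.2f}, robust optimum d = {robust_d:.2f}")
```

    In this toy model the nominal optimum sits near d = 2 while the robust choice shifts toward a smaller d, because the dispersion term grows with d; that shift is the point of accounting for uncertain environments.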

  19. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others in order to ensure reliability in the software development. But it is also clear that it is very difficult to achieve such a control through a ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  20. Medical abortion. defining success and categorizing failures

    DEFF Research Database (Denmark)

    Rørbye, Christina; Nørgaard, Mogens; Vestermark, Vibeke

    2003-01-01

    The difference in short- and long-term success rates increased with increasing gestational age. The majority of failures (76%) were diagnosed more than 2 weeks after initiation of the abortion. At a 2-week follow-up visit, the women who turned out to be failures had a larger endometrial width, higher beta-hCG values and smaller reductions of beta-hCG than those treated successfully. To optimize comparison of success rates after different medical abortion regimens, we suggest that the criteria for success are stated clearly, that the success rates are stratified according to gestational age...

  1. Critical success factors in ERP implementation

    Directory of Open Access Journals (Sweden)

    Blerta Abazi Chaushi

    2016-11-01

    Full Text Available This study conducts a state-of-the-art literature review of critical success factors for enterprise resource planning (ERP) systems implementation success. Since research on critical success factors for ERP implementation success is rare and fragmented, this study provides a more comprehensive list of ten factors that companies that have adopted an ERP system and struggle with its implementation, as well as companies that are considering an implementation, can easily adopt and follow. The main contribution of this paper is that these ten critical success factors are identified through a thorough analysis of 22 selected research papers, and the resulting list is more comprehensive and straightforwardly employable.

  2. A kinematic comparison of successful and unsuccessful tennis serves across the elite development pathway.

    Science.gov (United States)

    Whiteside, David; Elliott, Bruce; Lay, Brendan; Reid, Machar

    2013-08-01

    While velocity generation is an obvious prerequisite to proficient tennis serve performance, it is also the only stroke where players are obliged to negotiate a unique target constraint. Therefore, the dearth of research attending to the accuracy component of the serve is surprising. This study compared the body, racquet and ball kinematics characterising successful serves and service faults, missed into the net, in two groups of elite junior female players and one professional female tennis player. Three-dimensional body, racquet and ball kinematics were recorded using a 22-camera VICON motion analysis system. There were no differences in body kinematics between successful serves and service faults, suggesting that service faults cannot be attributed to a single source of biomechanical error. However, service faults missing into the net are characterized by projection angles significantly further below the horizontal, implying that consistency in this end-point parameter is critical to successful performance. Regulation of this parameter appears dependent on compensatory adjustments in the distal elbow and wrist joints immediately prior to impact and also perceptual feedback. Accordingly, coordination of the distal degrees of freedom and a refined perception-action coupling appear more important to success than any isolated mechanical component of the service action. Copyright © 2013 Elsevier B.V. All rights reserved.
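
    To make the projection-angle argument concrete, the sketch below uses a point-mass trajectory (ignoring drag and spin, which matter in real serves) to check whether the ball is still above the net when it reaches it, for a few hypothetical projection angles below the horizontal; the racquet-head speed and impact height are illustrative values, not measurements from this study.

```python
import math

G = 9.81            # m/s^2
NET_HEIGHT = 0.914  # m, net height near the centre
NET_DIST = 11.89    # m, baseline to net

def clears_net(speed_ms: float, impact_height_m: float, angle_below_horizontal_deg: float) -> bool:
    """Is a drag-free, spin-free ball still above the net when it reaches it?"""
    theta = math.radians(angle_below_horizontal_deg)
    t = NET_DIST / (speed_ms * math.cos(theta))                  # time to reach the net
    y = impact_height_m - speed_ms * math.sin(theta) * t - 0.5 * G * t * t
    return y > NET_HEIGHT

# Hypothetical serve: 42 m/s (~151 km/h) struck 2.7 m above the court surface.
for angle in (4.0, 5.0, 6.0, 7.0):
    ok = clears_net(speed_ms=42.0, impact_height_m=2.7, angle_below_horizontal_deg=angle)
    print(f"{angle:.0f} deg below horizontal -> {'clears the net' if ok else 'net fault'}")
```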

  3. Building-Up of a DNA Barcode Library for True Bugs (Insecta: Hemiptera: Heteroptera) of Germany Reveals Taxonomic Uncertainties and Surprises

    Science.gov (United States)

    Raupach, Michael J.; Hendrich, Lars; Küchler, Stefan M.; Deister, Fabian; Morinière, Jérome; Gossner, Martin M.

    2014-01-01

    During the last few years, DNA barcoding has become an efficient method for the identification of species. In the case of insects, most published DNA barcoding studies focus on species of the Ephemeroptera, Trichoptera, Hymenoptera and especially Lepidoptera. In this study we test the efficiency of DNA barcoding for true bugs (Hemiptera: Heteroptera), an ecologically and economically highly important as well as morphologically diverse insect taxon. As part of our study we analyzed DNA barcodes for 1742 specimens of 457 species, comprising 39 families of the Heteroptera. We found low nucleotide distances in a number of cases: minimum pairwise K2P distances below 2.2% were detected for 16 traditionally recognized and valid species. With a successful identification rate of 91.5% (418 species), our study emphasizes the use of DNA barcodes for the identification of true bugs and represents an important step in building up a comprehensive barcode library for true bugs in Germany and Central Europe as well. Our study also highlights the urgent necessity of taxonomic revisions for various taxa of the Heteroptera, with a special focus on various species of the Miridae. In this context we found evidence for on-going hybridization events within various taxonomically challenging genera (e.g. Nabis Latreille, 1802 (Nabidae), Lygus Hahn, 1833 (Miridae), Phytocoris Fallén, 1814 (Miridae)) as well as the putative existence of cryptic species (e.g. Aneurus avenius (Duffour, 1833) (Aradidae) or Orius niger (Wolff, 1811) (Anthocoridae)). PMID:25203616
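
    For readers unfamiliar with the distance measure behind thresholds such as the 2.2% mentioned above, the sketch below computes the Kimura 2-parameter (K2P) distance between two aligned sequences from their transition and transversion proportions; the sequences are short hypothetical fragments, not barcodes from this study.

```python
from math import log, sqrt

PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def k2p_distance(seq1: str, seq2: str) -> float:
    """K2P distance: d = -0.5 * ln((1 - 2P - Q) * sqrt(1 - 2Q)),
    where P is the transition and Q the transversion proportion."""
    pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
             if a in "ACGT" and b in "ACGT"]            # ignore gaps and ambiguities
    n = len(pairs)
    transitions = sum(1 for a, b in pairs if a != b and
                      ({a, b} <= PURINES or {a, b} <= PYRIMIDINES))
    transversions = sum(1 for a, b in pairs if a != b) - transitions
    P, Q = transitions / n, transversions / n
    return -0.5 * log((1 - 2 * P - Q) * sqrt(1 - 2 * Q))

# Toy example: two hypothetical fragments differing by one transition and one transversion.
s1 = "ATGGCATTAGCCGGAATAGT"
s2 = "ATGGCGTTAGCCGGAATAGA"
print(f"K2P distance: {k2p_distance(s1, s2):.4f}")
```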

  4. Motivational and adaptational factors of successful women engineers

    Science.gov (United States)

    Bornsen, Susan Edith

    It is no surprise that there is a shortage of women engineers. The reasons for the shortage have been researched and discussed in myriad papers, and suggestions for improvement continue to evolve. However, there are few studies that have specifically identified the positive aspects that attract women to engineering and keep them actively engaged in the field. This paper examines how women engineers view their education, their work, and their motivation to remain in the field. A qualitative research design was used to understand the motivation and adaptability factors women use to support their decision to major in engineering and stay in the engineering profession. Women engineers were interviewed using broad questions about motivation and adaptability. Interviews were transcribed and coded, looking for common threads of factors that suggest not only why women engineers persist in the field, but also how they thrive. Findings focus on the experiences, insights, and meaning of the women interviewed. A grounded theory approach was used to describe the success factors found in practicing women engineers. The study found categories of attraction to the field, learning environment, motivation and adaptability. Sub-categories of motivation are intrinsic motivational factors such as the desire to make a difference, as well as extrinsic factors such as having an income that allows the kind of lifestyle that supports the family. Women engineers are comfortable with and enjoy working with male peers, and when barriers arise, they learn to adapt in the male-dominated field. Adaptability was indicated in areas of gender, culture, and communication. Women found strength in the ability to 'read' their clients, and provide insight to their teams. Sufficient knowledge from the field advances theory and offers strategies to administrators and faculty of schools of engineering, as well as engineering firms, that have an interest in the recruitment and retention of female students

  5. Healthcare succession planning: an integrative review.

    Science.gov (United States)

    Carriere, Brian K; Muise, Melanie; Cummings, Greta; Newburn-Cook, Chris

    2009-12-01

    Succession planning is a business strategy that has recently gained attention in the healthcare literature, primarily because of nursing shortage concerns and the demand for retaining knowledgeable personnel to meet organizational needs. Little research has been conducted in healthcare settings that clearly defines best practices for succession planning frameworks. To effectively carry out such organizational strategies during these challenging times, an integrative review of succession planning in healthcare was performed to identify consistencies in theoretical approaches and strategies for chief nursing officers and healthcare managers to initiate. Selected articles were compared with business succession planning to determine whether healthcare strategies were similar to best practices already established in business contexts. The results of this integrative review will aid leaders and managers to use succession planning as a tool in their recruitment, retention, mentoring, and administration activities and also provide insights for future development of healthcare succession planning frameworks.

  6. Successful Enterprise System Re-Implementation

    DEFF Research Database (Denmark)

    Svejvig, Per

    2017-01-01

    Achieving success in enterprise systems (ES) implementations is challenging. The success rate is not high in view of the sums invested by many organizations in these companywide systems. The literature is charged with reasons for unsuccessful implementations, such as a lack of top management support and insufficient change management. Contrary to this research, empirical data from an ES re-implementation in a Scandinavian high-tech company shows successful implementation despite many problematic shifts in outsourcing partners. Therefore, it is natural to ask: why was the re-implementation of the ES at SCANDI successful despite the major troubles encountered during the project? Building an analysis based on ten Critical Success Factors (CSFs) combined with an investigation into the institutional structures at play, we present several reasons for the successful implementation. The CSF analysis...

  7. Project Success in IT Project Management

    OpenAIRE

    Siddiqui, Farhan Ahmed

    2010-01-01

    The rate of failed and challenged Information Technology (IT) projects is too high according to the CHAOS Studies by the Standish Group and the literature on project management (Standish Group, 2008). The CHAOS Studies define project success as meeting the triple constraints of scope, time, and cost. The criteria for project success need to be agreed by all parties before the start of the project and constantly reviewed as the project progresses. Assessing critical success factors is another ...

  8. Success of Chemotherapy in Soft Matter

    OpenAIRE

    Trifonova, I.; Kurteva, G.; Stefanov, S. Z.

    2014-01-01

    The success of chemotherapy in soft matter, expressed as a survival, is found in the paper. To this end, the analogue of the tumor stretching force in soft matter is found; ultrasonography is performed for this tumor; the restoration of soft matter with such a tumor is found; a Bayes estimate of the probability of chemotherapy success is derived from the transferred chemical energy and from the soft matter entropy; and the survival probability is juxtaposed with this probability of success.

  9. ORGANIZATIONAL CULTURE IN THE SUCCESSFUL GLOBAL BUSINESS

    OpenAIRE

    Eliyana, Anis

    2009-01-01

    A successful business becomes the goal of each individual who is involved in an organizational business. The capability of a business organization to enter and compete in the global market is one of those ways often used by businessmen to both maintain and achieve a successful organization. With respect to the said issue, the writer would like to answer the following subject matter: How to understand the organizational culture in order to be successful in global business? And does the organizational cu...

  10. Strategic Intent, Confucian Harmony and Firm Success

    OpenAIRE

    Edward J. Romar

    2009-01-01

    This paper argues that by using the Confucian concept of harmony as its strategic intent a firm can be both ethical and successful. Using the work of Peter F. Drucker, Michael Porter, and Gary Hamel and C. K. Prahalad the paper discusses the role of strategic intent as a contributing factor to firm success. The paper then discusses how the Confucian concept of harmony fulfills the concept of strategic intent and how harmony can contribute to the development of a successful and ethical organization...

  11. MODEL OF TRAINING OF SUCCESS IN LIFE

    Directory of Open Access Journals (Sweden)

    Екатерина Александровна Лежнева

    2014-04-01

    Full Text Available The article explains the importance of developing the motive to succeed in adolescence. The value of the achievement motive for the further development of the teenager is established: an effective motive to achieve success mobilizes internal forces for successful activity and ensures the active involvement of teenagers in social and interpersonal relationships. Training is considered the primary means of developing the achievement motive. The author provides a definition of "training for success in life," creates a model of such training, describes its units (targeted, informative, technological, productive), and reveals the life-strategy technology used during the training (self-presentation, targets, incentives, subject-orientation). The author also draws attention to the need for a future psychologist to develop teenagers' motive to achieve success by mastering the competence to construct a model of training for success in life and to implement it in professional activities. Additional educational programs and a psychological section are identified as the main means of preparing psychology students to use training for success in life. DOI: http://dx.doi.org/10.12731/2218-7405-2013-9-77

  12. Black Artists' Music Videos: Three Successful Strategies

    Science.gov (United States)

    Peterson-Lewis, Sonja; Chennault, Shirley A.

    1986-01-01

    Identifies three successful self-presentational patterns used by black artists to penetrate the music television market. Discusses the historical relationship between minorities and the mass media. (MS)

  13. Federal Technology Transfer Act Success Stories

    Science.gov (United States)

    Successful Federal Technology Transfer Act (FTTA) partnerships demonstrate the many advantages of technology transfer and collaboration. EPA and partner organizations create valuable and applicable technologies for the marketplace.

  14. Critical success factors for renewable energy projects

    International Nuclear Information System (INIS)

    1995-01-01

    This project highlighted best practice in the planning and assessment of proposals with the aim of: encouraging more successful renewable energy projects and proposals; lowering financial and other barriers; and stimulating a climate for success. Based on the analysis of a number of case studies, data was collected through a series of extensive interviews to identify why certain schemes were considered successful, what might have been done differently and which factors were considered important when entering a market. The Critical Success Factors can be broken down into five groups: Universal CSFs; CSFs for funding bodies; CSFs for managing agencies; CSFs for niche markets; CSFs for individual technologies. (author)

  15. 5 CFR 412.201 - Management succession.

    Science.gov (United States)

    2010-01-01

    ... programs must be supported by employee training and development programs. The focus of the program should... learning experiences throughout an employee's career, such as details, mentoring, coaching, learning groups..., MANAGEMENT, AND EXECUTIVE DEVELOPMENT Succession Planning § 412.201 Management succession. The head of each...

  16. Build an Early Foundation for Algebra Success

    Science.gov (United States)

    Knuth, Eric; Stephens, Ana; Blanton, Maria; Gardiner, Angela

    2016-01-01

    Research tells us that success in algebra is a factor in many other important student outcomes. Emerging research also suggests that students who are started on an algebra curriculum in the earlier grades may have greater success in the subject in secondary school. What's needed is a consistent, algebra-infused mathematics curriculum all…

  17. Succession Planning Demystified. IES Report 372.

    Science.gov (United States)

    Hirsh, W.

    This book, which is designed for human resource (HR) practitioners, details the principles and applications of succession planning, shows how succession planning is conducted, and explains its place in relation to other HR processes and business priorities. The introduction describes the book's intended audience and provides a brief overview of…

  18. Pathways to Success for Michigan's Opportunity Youth

    Science.gov (United States)

    American Youth Policy Forum, 2015

    2015-01-01

    Each young person must navigate his/her own pathway into and through postsecondary education and the workforce to long-term success personalized to his/her own unique needs and desires. The pathway to long-term success is often articulated as a straight road through K-12 education into postsecondary education (either academic or technical…

  19. A framework for successful hotel developments

    Directory of Open Access Journals (Sweden)

    Chris E Cloete

    2013-04-01

    Various critical success factors for hotel development are identified in this article, and incorporated into a hotel property development framework, establishing a practical ‘road map’ for successful hotel developments. The validity of the proposed hotel property development framework has been assessed by intensive direct interviews with hotel development professionals.

  20. Determinants of distribution, abundance and reproductive success ...

    African Journals Online (AJOL)

    ... while local vegetation structure determines the abundance of locally established populations. The abundance of trees affects nest site availability and breeding success, based on observations at two oases. Blackbird nests were usually situated on pomegranate trees and olive trees. The Common Blackbird is a successful ...

  1. Success and Motivation among College Students

    Science.gov (United States)

    Schweinle, Amy; Helming, Luralyn M.

    2011-01-01

    The present research explores college students' explanations of their success and failure in challenging activities and how it relates to students' efficacy, value, and engagement. The results suggest most students hold one primary reason for success during the challenging activity, including grade/extrinsic, mastery/intrinsic,…

  2. Establishing a Successful Smart Card Program.

    Science.gov (United States)

    Wiens, Janet

    2001-01-01

    Discusses how to run a successful smart card program through a comprehensive approach that includes a detailed plan for the present and future, high level support from school administration, and extensive user input. Florida State University is used to illustrate a successfully implemented smart card program. (GR)

  3. Modeling Student Success in Engineering Education

    Science.gov (United States)

    Jin, Qu

    2013-01-01

    In order for the United States to maintain its global competitiveness, the long-term success of our engineering students in specific courses, programs, and colleges is now, more than ever, an extremely high priority. Numerous studies have focused on factors that impact student success, namely academic performance, retention, and/or graduation.…

  4. Investigating critical success factors in tile industry

    Directory of Open Access Journals (Sweden)

    Davood Salmani

    2014-04-01

    Full Text Available This paper presents an empirical investigation to determine the critical success factors influencing the success of the tile industry in Iran. The study designed a Likert-scale questionnaire and distributed it among experts in the tile industry. Using the Pearson correlation test, the study detected a positive and meaningful relationship between marketing planning and the success of the tile industry (r = 0.312, Sig. = 0.001). However, there was no meaningful relationship between low-cost production and the success of the tile industry (r = 0.13, Sig. = 0.12), while there was a positive and meaningful relationship between organizational capabilities and the success of the tile industry (r = 0.635, Sig. = 0.000). Finally, our investigation indicates that technology and distribution systems also positively influence the success of the tile industry. The study also used five regression analyses in which the success of the tile industry was the dependent variable and marketing planning, low-cost production and organizational capabilities were the independent variables; the results confirmed positive and meaningful relationships between the success of the tile industry and all independent variables.

  5. A Partial Theory of Executive Succession.

    Science.gov (United States)

    Thiemann, Francis C.

    This study has two purposes: (1) to construct a partial theory of succession, and (2) to utilize a method of theory construction that combines some of the concepts of Hans Zetterberg with the principles of formal symbolic logic. A bibliography on succession in complex organizations with entries on descriptive and empirical studies from various…

  6. The Entrepreneurial Subjectivity of Successful Researchers

    Science.gov (United States)

    Sinclair, Jennifer; Cuthbert, Denise; Barnacle, Robyn

    2014-01-01

    This article begins the work of examining what kind of doctoral experiences positively influence researcher development, and what other attributes may contribute to a successful research career. It reports preliminary findings from the analysis of survey responses by a sample of successful mid-career researchers. Positive doctoral experiences and…

  7. The Professional Success of Higher Education Graduates

    Science.gov (United States)

    Schomburg, Harald

    2007-01-01

    Measures of professional success provided by surveys on higher education graduates can be divided into objective (e.g. income or professional position) and subjective (e.g. job satisfaction, reported use of knowledge and skills, work autonomy) indicators. In this article a broad range of measures of professional success is used to describe aspects…

  8. Telling Successes of Japanese Foreign Aid

    DEFF Research Database (Denmark)

    Hansen, Annette Skovsted

    Stakeholders of two success-story events negotiated an idea of development as individual entrepreneurship. The sixty-five-year history of Japanese foreign aid includes stories of successes told by professionals from developing countries throughout the world. Their stories reflect the cultural...... sector training programs partly financed by Japanese Official Development Assistance (ODA)....

  9. Critical Success Factors in Online Language Learning

    Science.gov (United States)

    Alberth

    2011-01-01

    With the proliferation of online courses nowadays, it is necessary to ask what defines the success of teaching and learning in these new learning environments exactly. This paper identifies and critically discusses a number of factors for successful implementation of online delivery, particularly as far as online language learning is concerned.…

  10. Physical attractiveness and reproductive success in humans: Evidence from the late 20th century United States.

    Science.gov (United States)

    Jokela, Markus

    2009-09-01

    Physical attractiveness has been associated with mating behavior, but its role in reproductive success of contemporary humans has received surprisingly little attention. In the Wisconsin Longitudinal Study (1244 women, 997 men born between 1937 and 1940) we examined whether attractiveness assessed from photographs taken at age ~18 predicted the number of biological children at age 53-56. In women, attractiveness predicted higher reproductive success in a nonlinear fashion, so that attractive (second highest quartile) women had 16% and very attractive (highest quartile) women 6% more children than their less attractive counterparts. In men, there was a threshold effect so that men in the lowest attractiveness quartile had 13% fewer children than others who did not differ from each other in the average number of children. These associations were partly but not completely accounted for by attractive participants' increased marriage probability. A linear regression analysis indicated relatively weak directional selection gradient for attractiveness (β=0.06 in women, β=0.07 in men). These findings indicate that physical attractiveness may be associated with reproductive success in humans living in industrialized settings.
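
    The reported selection gradient (β) comes from a linear regression of reproductive success on attractiveness. The sketch below shows how a standardized directional selection gradient is conventionally estimated (relative fitness regressed on the z-scored trait); the data and effect size are synthetic assumptions, not the Wisconsin Longitudinal Study sample:

```python
# Sketch of a directional selection gradient estimate: regress relative
# fitness (offspring count / mean offspring count) on the standardized trait.
# Data here are synthetic, not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 1244  # e.g. the number of women in the reported sample

attractiveness = rng.normal(0, 1, n)                               # photo ratings
children = rng.poisson(lam=np.exp(0.75 + 0.05 * attractiveness))   # offspring counts

z_trait = (attractiveness - attractiveness.mean()) / attractiveness.std()
rel_fitness = children / children.mean()

fit = sm.OLS(rel_fitness, sm.add_constant(z_trait)).fit()
print(f"directional selection gradient beta = {fit.params[1]:.3f}")
```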

  11. An evolutionary concept of polycystic ovarian disease: does evolution favour reproductive success over survival?

    Science.gov (United States)

    Gleicher, Norbert; Barad, David

    2006-05-01

    Polycystic ovarian disease (PCOD) is currently considered as possibly the most frequent cause of female infertility. It is also closely associated with syndrome XX, which, in turn, is closely linked with premature and excessive mortality. Considering these adverse effects on reproductive success and human survival, the evolutionary survival of PCOD, itself considered by many to be a genetically transmitted condition, would, at first glance, appear surprising, since evolution usually discriminates against both of these traits. However, an analysis of some recently reported characteristics of the condition calls for the reconsideration of PCOD as a condition which, from an evolutionary viewpoint, favours decreased reproductive success. Indeed, the reported observations that patients with PCOD will resume spontaneous ovulation with even relatively minor weight loss, and experience later menopause than controls, suggest exactly the opposite. Under an evolutionary concept, PCOD can thus be seen as a 'fertility storage condition' which in fact favours human reproductive success and allows the human species to maintain fertility even during adverse environmental circumstances, such as famines.

  12. Reframing Success and Failure of Information Systems

    DEFF Research Database (Denmark)

    Cecez-Kecmanovic, Dubravka; Kautz, Karlheinz; Abrahall, Rebecca

    2014-01-01

    The paper questions common assumptions in the dominant representational framings of information systems success and failure and proposes a performative perspective that conceives IS success and failure as relational effects performed by sociomaterial practices of IS project actor-networks of developers, managers, technologies, project documents, methodologies, and other actors. Drawing from a controversial case of a highly innovative information system in an insurance company, considered a success and a failure at the same time, the paper reveals the inherent indeterminacy of IS success and failure that performed both different IS realities and competing IS assessments. The analysis shows that the IS project and the implemented system as objects of assessment are not given and fixed, but are performed by the agencies of assessment together with the assessment outcomes of success and failure.

  13. Biofilm community succession: a neutral perspective.

    Science.gov (United States)

    Woodcock, Stephen; Sloan, William T

    2017-05-22

    Although biofilms represent one of the dominant forms of life in aqueous environments, our understanding of the assembly and development of their microbial communities remains relatively poor. In recent years, several studies have addressed this and have extended the concepts of succession theory in classical ecology into microbial systems. From these datasets, niche-based conceptual models have been developed explaining observed biodiversity patterns and their dynamics. These models have not, however, been formulated mathematically and so remain untested. Here, we further develop spatially resolved neutral community models and demonstrate that these can also explain these patterns and offer alternative explanations of microbial succession. The success of neutral models suggests that stochastic effects alone may have a much greater influence on microbial community succession than previously acknowledged. Furthermore, such models are much more readily parameterised and can be used as the foundation of more complex and realistic models of microbial community succession.
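
    A minimal, non-spatial sketch of the class of neutral community model the abstract invokes is given below: all taxa are demographically equivalent, and composition changes only through random death, local replacement, and immigration from a metacommunity. The parameters and the simple immigration rule are generic assumptions, not the authors' spatially resolved formulation:

```python
# Minimal (non-spatial) neutral community sketch: community composition
# changes only by random death, local replacement, and immigration from a
# fixed metacommunity pool. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

J = 500          # local community size (assumed)
S_meta = 50      # number of taxa in the metacommunity (assumed)
m = 0.05         # immigration probability per replacement (assumed)
steps = 20000    # death/replacement events to simulate

# Metacommunity relative abundances (a simple random pool).
meta = rng.dirichlet(np.ones(S_meta))

# Start the "biofilm" as a single colonising taxon and let succession proceed.
community = np.zeros(J, dtype=int)

for _ in range(steps):
    dead = rng.integers(J)                               # a random individual dies
    if rng.random() < m:
        community[dead] = rng.choice(S_meta, p=meta)     # an immigrant settles
    else:
        community[dead] = community[rng.integers(J)]     # local offspring settles

richness = len(np.unique(community))
print(f"taxa present after {steps} events: {richness}")
```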

  14. Success tree method of resources evaluation

    International Nuclear Information System (INIS)

    Chen Qinglan; Sun Wenpeng

    1994-01-01

    By applying reliability theory from systems engineering, the success tree method transfers experts' recognition of metallogenetic regularities into the form of a success tree. Resources evaluation is then achieved by calculating the metallogenetic probability, or favorability, of the top event of the success tree. This article introduces in detail the source and principle of the success tree method and three kinds of calculation methods, and explains concretely how to establish a success tree of comprehensive uranium metallogenesis as well as the procedure by which the resources evaluation is performed. Because this method places no restrictions on the number of known deposits or on the size of the calculated area, it is applicable to resources evaluation for different mineral species, types and scales, and has good prospects for development.
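
    The top-event calculation referred to above follows standard fault-tree algebra applied in a 'success' orientation. The gate structure, event names, and probabilities in the sketch below are invented for illustration only and do not reproduce the paper's tree or its three calculation methods:

```python
# Toy success-tree evaluation: the top event occurs according to AND/OR logic
# over independent basic events, mirroring fault-tree algebra. The tree
# structure and probabilities are illustrative assumptions only.

def p_and(probs):
    """AND gate: all child events must occur (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(probs):
    """OR gate: at least one child event occurs (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

# Hypothetical basic-event favorabilities for a uranium-metallogenesis tree.
source_rock    = 0.7   # favourable uranium source
transport      = 0.6   # favourable migration pathways
trap_structure = 0.5   # favourable structural trap
trap_lithology = 0.4   # favourable host lithology

# Top event = source AND transport AND (structural OR lithological trap).
top = p_and([source_rock, transport, p_or([trap_structure, trap_lithology])])
print(f"metallogenetic favorability of the top event: {top:.3f}")
```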

  15. The Extended Family and Children's Educational Success

    DEFF Research Database (Denmark)

    Jæger, Mads Meier

    2012-01-01

    Research on family background and educational success focuses almost exclusively on two generations and on parents and children. This paper argues that the extended family makes up a significant part of the total effect of family background on educational success. Empirical results based on the Wisconsin Longitudinal Study show that, net of family factors shared by siblings from the same immediate family, factors shared by first cousins from the same extended family account for a nontrivial part of the total variance in children's educational success. Results also show that while socioeconomic characteristics of grandparents and aunts and uncles have few direct effects on educational success, resources in the extended family compensate for lacking resources in low-SES families, which in turn promote children's educational success. The main conclusion is that the total effect of family background...

  16. Critical success factors of Indian Entrepreneurs

    Directory of Open Access Journals (Sweden)

    Alex Antonites

    2013-12-01

    Full Text Available This research seeks to explore the critical success factors that influence the success of Indian small business owners in the largest metropolitan area in South Africa. To achieve this, the objective of the study was to confirm whether there are significant differences between a successful and a less successful group of business owners in terms of general management skills, personal characteristics, entrepreneurial orientation and financing of the business. Through analysing secondary evidence and empirical results it was possible to facilitate a better understanding of how Indian entrepreneurs operating in small and medium enterprises sustain success, thus contributing to the body of knowledge on entrepreneurship development. From the literature it became clear that cultural dimensions have an impact on the entrepreneurial process. The arrival of Indians in South Africa has contributed to a unique Indian culture. The characteristics that describe ethnic entrepreneurs and the factors attributed to their success are described. Small and medium enterprises (SMEs) are crucial for the development of any country as they offer the benefits of economic growth and employment generation. The success factors to sustain SMEs are also described. The findings of the study indicate that there are no significant differences between the comparable groups in relation to management skills and finance factors. There are, however, significant differences relating to personal factors, such as the level of education, family support and experience. Finally, an important finding is that the Indian entrepreneurs in this study are similar to the ethnic entrepreneurs reviewed in the literature. The study was conducted in Tshwane, the largest metropolitan area in South Africa, and amongst the largest in the world. Keywords: Culture, ethnic entrepreneurship, Indian entrepreneurship, critical success factors, small and medium enterprises

  17. Determining Success Criteria and Success Factors for International Construction Projects for Malaysian Contractors

    Directory of Open Access Journals (Sweden)

    Ali Mohammed Alashwal

    2017-06-01

    Full Text Available The success of international construction projects is fraught with challenges such as competitiveness, lack of resources, a volatile global economy, and the specific conditions of the host country. Malaysian contractors have been venturing into the global construction market since the early 1980s, but their ventures have not always been successful, and the number of international projects awarded to Malaysian contractors has dropped drastically during the past decade. Drawing on this experience, this paper aims to identify the success criteria and success factors of international construction projects. Data were collected from 120 respondents using a questionnaire survey and analysed using principal component analysis and regression analysis. The results revealed three principal criteria of project success, namely Management Success, Functional Success, and Organisation Success. The main components of the success factors include Team Power and Skills, Resource Availability, External Environment, Organisation Capability, Project Support, and Project Organisation. Further analysis emphasised the importance of strong financing capacity of contractors, the project's social environment, and the competence of the project manager in achieving project success. The results of this paper can serve as a guideline for contractors and project managers seeking success in this context. Future studies may provide in-depth analysis of success criteria and success factors specific to construction project type and host-country location.
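
    The analysis pipeline described above, principal component analysis of survey items followed by regression, can be sketched as follows. The item names, synthetic responses, and outcome variable are assumptions for illustration, not the study's data:

```python
# Sketch of the described pipeline: reduce survey items on project success to
# a few principal components, then regress an overall success rating on the
# component scores. Item names and data are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 120  # respondents, matching the sample size reported in the abstract

# Hypothetical Likert-scale responses to nine success-criterion items.
items = pd.DataFrame(rng.integers(1, 6, size=(n, 9)),
                     columns=[f"criterion_{i+1}" for i in range(9)])

# Keep three components, echoing the abstract's three success criteria
# (the grouping labels themselves are not reproduced here).
pca = PCA(n_components=3)
scores = pca.fit_transform(items)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))

# Regress a hypothetical overall-success rating on the component scores.
overall = items.mean(axis=1) + rng.normal(0, 0.3, n)   # synthetic outcome
model = LinearRegression().fit(scores, overall)
print("component coefficients:", model.coef_.round(3))
```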

  18. The Future of Basic Science in Academic Surgery: Identifying Barriers to Success for Surgeon-scientists.

    Science.gov (United States)

    Keswani, Sundeep G; Moles, Chad M; Morowitz, Michael; Zeh, Herbert; Kuo, John S; Levine, Matthew H; Cheng, Lily S; Hackam, David J; Ahuja, Nita; Goldstein, Allan M

    2017-06-01

    The aim of this study was to examine the challenges confronting surgeons performing basic science research in today's academic surgery environment. Multiple studies have identified challenges confronting surgeon-scientists and impacting their ability to be successful. Although these threats have been known for decades, the downward trend in the number of successful surgeon-scientists continues. Clinical demands, funding challenges, and other factors play important roles, but a rigorous analysis of academic surgeons and their experiences regarding these issues has not previously been performed. An online survey was distributed to 2504 members of the Association for Academic Surgery and Society of University Surgeons to determine factors impacting success. Survey results were subjected to statistical analyses. We also reviewed publicly available data regarding funding from the National Institutes of Health (NIH). NIH data revealed a 27% decline in the proportion of NIH funding to surgical departments relative to total NIH funding from 2007 to 2014. A total of 1033 (41%) members responded to our survey, making this the largest survey of academic surgeons to date. Surgeons most often cited the following factors as major impediments to pursuing basic investigation: pressure to be clinically productive, excessive administrative responsibilities, difficulty obtaining extramural funding, and desire for work-life balance. Surprisingly, a majority of respondents (68%), including departmental leadership, did not believe surgeons can be successful basic scientists in today's environment. We have identified important barriers that confront academic surgeons pursuing basic research and a perception that success in basic science may no longer be achievable. These barriers need to be addressed to ensure the continued development of future surgeon-scientists.

  19. Clonal structure and variable fertilization success in Florida Keys broadcast-spawning corals

    Science.gov (United States)

    Miller, M. W.; Baums, I. B.; Pausch, R. E.; Bright, A. J.; Cameron, C. M.; Williams, D. E.; Moffitt, Z. J.; Woodley, C. M.

    2018-03-01

    Keystone reef-building corals in the Caribbean are predominantly self-incompatible broadcast spawners, and a majority are threatened due to both acute adult mortality and poor recruitment. As population densities decline, concerns about fertilization limitation and effective population size in these species increase and would be further exacerbated by either high clonality or gametic incompatibility of parental genotypes. This study begins to address these concerns for two Caribbean broadcasting species by characterizing clonal structure and quantifying experimental pairwise fertilization success. Orbicella faveolata showed surprisingly high and contrasting levels of clonality between two sampled sites; Acropora palmata was previously known to be highly clonal. Individual pairwise crosses of synchronously spawning genotypes of each species were conducted by combining aliquots of gamete bundles immediately after spawning, and showed high and significant variability in fertilization success. Over half of the individual crosses of O. faveolata and about one-third of A. palmata crosses yielded ≤ 40% fertilization. Total sperm concentration was quantified in only a subset of O. faveolata crosses (range of 1–6 × 10^7 mL^-1), but showed no correlation with fertilization success. We interpret that both parental incompatibility and individual genotypes with low-quality gametes are likely to have contributed to the variable fertilization observed, with important implications for conservation. Differential fertilization success implies that effective population size may be considerably smaller than hoped, and that population enhancement efforts need to incorporate many more parental genotypes at the patch scale than indicated by estimates based simply on preserving levels of standing genetic diversity in order to ensure successful larval production.

  20. Social-philosophical practices of success

    Directory of Open Access Journals (Sweden)

    S. R. Karpenko

    2017-01-01

    Social-philosophical practices of success comprise a complicated system of worldview, speech, and mental factors and events in the lives of various professional, age, and subcultural groups, which produce assessments from different angles and from the positions of various social attitudes and identities; it is in these assessments that the social philosophy of success is expressed. In the course of forming social practices (both in everyday and in institutional discourse), theoretical ideas of success also take shape: instrumental, social-philosophical, social-psychological, worldview-based, historical-cultural, and so on, thereby characterising different systems of social discourse. Examination of social-philosophical practices of success shows the real complexity and ambiguity of this phenomenon. Beyond the presented typology, constructed as a highly approximate abstract scheme, typological models can in each separate case be built on an ad hoc basis. This appears quite justified, given that representations of success and of the successful person are constantly transformed and acquire new characteristics. The effectiveness of further studies of the discourse and practices of success will depend on adopting new heuristic approaches capable of accounting for the multidimensionality and ambiguity of this phenomenon.