WorldWideScience

Sample records for surprisingly successful metamodels

  1. Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries

    Science.gov (United States)

    Reeves, H. W.; Fienen, M. N.; Feinstein, D.

    2015-12-01

Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding numerical models. They can potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may fulfill a role often served by analytical solutions. The major challenge in transferring a metamodel to a non-modeled area is quantifying the spatial data in the new area of interest in a way that is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that the spatial data must be appropriately scaled to adequately represent the different settings. Careful GIS analysis of the numerical model, the metamodel, and the new area of interest is required for successful transfer of results.

  2. METAMODELS OF INFORMATION TECHNOLOGY BEST PRACTICES FRAMEWORKS

    Directory of Open Access Journals (Sweden)

    Arthur Nunes Ferreira Neto

    2011-12-01

Full Text Available This article deals with the generation and application of ontological metamodels of frameworks of best practices in IT. The ontological metamodels represent the logical structures and fundamental semantics of framework models and constitute adequate tools for the analysis, adaptation, comparison and integration of frameworks of best practices in IT. The MetaFrame methodology for the construction of the metamodels, founded on the discipline of conceptual metamodelling and on the extended Entity/Relationship methodology, is described herein, as are the metamodels of the best practices for the outsourcing of IT: the eSCM-SP v2.01 (eSourcing Capability Model for Service Providers) and the eSCM-CL v1.1 (eSourcing Capability Model for Client Organizations), both constructed according to the MetaFrame methodology.

  3. University Students' Meta-Modelling Knowledge

    Science.gov (United States)

    Krell, Moritz; Krüger, Dirk

    2017-01-01

    Background: As one part of scientific meta-knowledge, students' meta-modelling knowledge should be promoted on different educational levels such as primary school, secondary school and university. This study focuses on the assessment of university students' meta-modelling knowledge using a paper-pencil questionnaire. Purpose: The general purpose…

  4. Metamodel of the IT Governance Framework COBIT

    Directory of Open Access Journals (Sweden)

    João Souza Neto

    2013-10-01

Full Text Available This paper addresses the generation and analysis of the COBIT 4.1 ontological metamodel of the IT Governance framework. Ontological metamodels represent the logical structures and fundamental semantics of framework models and constitute adequate tools for the analysis, adaptation, comparison and integration of IT best practice frameworks. The MetaFrame methodology used for the construction of the COBIT metamodel is based on the discipline of conceptual metamodeling and on the extended Entity/Relationship methodology. It has an iterative process of constructing the metamodel's components, using techniques of modeling and documentation of information systems. In the COBIT 4.1 metamodel, the central entity type is the IT Process. The entity type IT Domain represents the four domains that group one or more IT processes of COBIT 4.1. In turn, these processes are divided into one or more Activities that are carried out by one or more Roles, which are consulted, informed, accountable or responsible for each Activity. The COBIT 4.1 metamodel may suggest the adaptation or implementation of a new process within the framework, or even contribute to the integration of frameworks when, after the processes of analysis and comparison, there are connection points between the components and the logical structures of their relationships.

  5. Introducing Meta-models for a More Efficient Hazard Mitigation Strategy with Rockfall Protection Barriers

    Science.gov (United States)

    Toe, David; Mentani, Alessio; Govoni, Laura; Bourrier, Franck; Gottardi, Guido; Lambert, Stéphane

    2018-04-01

The paper presents a new approach to assess the effectiveness of rockfall protection barriers, accounting for the wide variety of impact conditions observed on natural sites. This approach makes use of meta-models, considers a widely used rockfall barrier type, and was developed from FE simulation results. Six input parameters relevant to the block impact conditions have been considered. Two meta-models were developed concerning the barrier's capability either to stop the block or to reduce its kinetic energy. The influence of the parameter ranges on the meta-model accuracy has also been investigated. The results of the study reveal that the meta-models accurately reproduce the response of the barrier to any impact conditions, providing a formidable tool to support the design of these structures. Furthermore, because the approach accommodates the effects of the impact conditions on the prediction of the block-barrier interaction, it can be successfully used in combination with rockfall trajectory simulation tools to improve rockfall quantitative hazard assessment and optimise rockfall mitigation strategies.

  6. Towards the formalisation of the TOGAF Content Metamodel using ontologies

    CSIR Research Space (South Africa)

    Gerber, A

    2010-06-01

    Full Text Available Metamodels are abstractions that are used to specify characteristics of models. Such metamodels are generally included in specifications or framework descriptions. A metamodel is for instance used to inform the generation of enterprise architecture...

  7. Mean-Variance-Validation Technique for Sequential Kriging Metamodels

    International Nuclear Information System (INIS)

    Lee, Tae Hee; Kim, Ho Sung

    2010-01-01

The rigorous validation of the accuracy of metamodels is an important topic in research on metamodel techniques. The leave-k-out cross-validation technique involves a considerably high computational cost, yet it cannot reliably measure the fidelity of metamodels. Recently, the mean-0 validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, the use of the mean-0 validation criterion may lead to premature termination of the sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique because, instead of performing numerical integration, the kriging model is explicitly integrated to accurately evaluate the mean and variance of the response. The error in the proposed validation technique resembles a root mean squared error, so it can be used to determine a stopping criterion for the sequential sampling of metamodels.
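To make the role of the predictive mean and variance concrete, here is a minimal zero-mean Kriging (Gaussian-process) predictor in NumPy. The squared-exponential covariance, length scale, noise level, and test function are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def kriging_predict(X, y, Xs, length=0.2, noise=1e-6):
    """Zero-mean Kriging (Gaussian-process) prediction: returns the
    predictive mean and variance at test points Xs."""
    def k(A, B):  # squared-exponential covariance
        return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / length**2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(X, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    cov = k(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.clip(np.diag(cov), 0.0, None)

X = np.linspace(0.0, 1.0, 8)            # sampled design points
y = np.sin(2.0 * np.pi * X)             # expensive-model responses
mean, var = kriging_predict(X, y, np.array([0.5, 3.0]))
# Variance is small near the data and grows far from it, which is what
# sequential sampling criteria such as maximum entropy sampling exploit.
```
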

  8. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    Science.gov (United States)

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, extrema points of the metamodels and minimum points of the density function. More accurate metamodels are then constructed by repeating this procedure. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
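As a sketch of the kind of radial-basis-function metamodel such a sampling scheme builds on (the Gaussian basis, shape parameter, and test function below are illustrative choices, not the paper's):

```python
import numpy as np

def rbf_fit(X, y, eps=2.0):
    """Fit a Gaussian radial-basis-function interpolant to samples (X, y)."""
    r = np.abs(X[:, None] - X[None, :])
    w = np.linalg.solve(np.exp(-(eps * r) ** 2), y)  # interpolation weights
    def predict(x):
        rx = np.abs(np.atleast_1d(x)[:, None] - X[None, :])
        return np.exp(-(eps * rx) ** 2) @ w
    return predict

# In a sequential scheme, each new sampling point would be appended to X
# and the metamodel refitted.
X = np.linspace(-1.0, 1.0, 9)
f = lambda x: x**3 - x
model = rbf_fit(X, f(X))
```
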

  9. A meta-model for computer executable dynamic clinical safety checklists.

    Science.gov (United States)

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

A safety checklist is a cognitive tool that reinforces the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversights and omissions. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to the patient context are increasingly developed. However, the current hard-coded approach to implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort of informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We proposed a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model was validated by implementing a use case in the system.
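The flavour of such a meta-model can be sketched with a toy instantiation: checklist items carry an applicability condition that is evaluated against the patient context at render time. All names, items, and the condition below are hypothetical illustrations, not the paper's actual meta-model.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ChecklistItem:
    text: str
    # Applicability condition evaluated against the patient context;
    # defaults to "always applies".
    applies: Callable[[dict], bool] = lambda ctx: True

@dataclass
class DynamicChecklist:
    name: str
    items: list = field(default_factory=list)

    def render(self, context: dict) -> list:
        """Instantiate the checklist for one patient context."""
        return [item.text for item in self.items if item.applies(context)]

# Hypothetical CABG pre-operative checklist with one context-dependent item.
cabg = DynamicChecklist("CABG pre-op", [
    ChecklistItem("Verify patient identity"),
    ChecklistItem("Confirm insulin protocol",
                  applies=lambda ctx: ctx.get("diabetic", False)),
])
steps = cabg.render({"diabetic": False})   # -> ["Verify patient identity"]
```
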

  10. Metamodel-based inverse method for parameter identification: elastic-plastic damage model

    Science.gov (United States)

    Huang, Changwu; El Hami, Abdelkhalak; Radi, Bouchaïb

    2017-04-01

This article proposes a metamodel-based inverse method for material parameter identification and applies it to elastic-plastic damage model parameter identification. An elastic-plastic damage model is presented and implemented in numerical simulation. The metamodel-based inverse method is proposed in order to overcome the computational cost of the conventional inverse method. In the metamodel-based inverse method, a Kriging metamodel is constructed based on an experimental design in order to model the relationship between the material parameters and the objective function values of the inverse problem, and the optimization procedure is then executed on the metamodel. The application of the presented material model and the proposed parameter identification method to the standard A 2017-T4 tensile test shows that the presented elastic-plastic damage model adequately describes the material's mechanical behaviour, and that the proposed metamodel-based inverse method not only enhances the efficiency of parameter identification but also gives reliable results.
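The core loop of a metamodel-based inverse method can be illustrated in a few lines: evaluate the expensive misfit at a small design of experiments, fit a cheap surrogate, and optimise the surrogate instead of the expensive model. The one-parameter objective and the polynomial surrogate below are stand-ins (the paper uses Kriging and an FE model).

```python
import numpy as np

# Stand-in for the expensive FE misfit: squared difference between the
# simulated response sin(p) + p and a "measured" value (all illustrative).
measured = 2.0
expensive_objective = lambda p: (np.sin(p) + p - measured) ** 2

# 1) Design of experiments: a few runs of the expensive model.
P = np.linspace(0.0, 3.0, 7)
J = expensive_objective(P)

# 2) Cheap surrogate of the objective (a quartic polynomial here; the paper
#    uses Kriging), then optimise on the surrogate, not the FE model.
coef = np.polyfit(P, J, 4)
grid = np.linspace(0.0, 3.0, 3001)
p_star = grid[np.argmin(np.polyval(coef, grid))]   # identified parameter
```
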

  11. Information-theoretic metamodel of organizational evolution

    Science.gov (United States)

    Sepulveda, Alfredo

    2011-12-01

Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  12. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Directory of Open Access Journals (Sweden)

    Yaping Ju

    2016-05-01

    Full Text Available The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels, that is, artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.
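The surrogate-plus-Monte-Carlo pattern described here can be sketched as follows; the quadratic "model", least-squares surrogate (standing in for the GA-tuned support vector regression), and uniform roughness distribution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the expensive solver: performance penalty vs. roughness height
# (the function, basis, and distribution here are invented for illustration).
true_model = lambda h: 1.0 - 0.8 * h + 0.3 * h**2

# 1) Fit a cheap surrogate on a handful of solver runs (least squares on a
#    quadratic basis, standing in for support vector regression).
H = np.linspace(0.0, 1.0, 12)
coef, *_ = np.linalg.lstsq(np.vander(H, 3), true_model(H), rcond=None)
surrogate = lambda h: np.vander(np.atleast_1d(h), 3) @ coef

# 2) Monte Carlo propagation: sample the uncertain input, evaluate only the
#    surrogate, and read off the output statistics.
h_samples = rng.uniform(0.0, 1.0, 100_000)
out = surrogate(h_samples)
mean, std = out.mean(), out.std()   # analytic mean is 0.7 for this model
```
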

  13. Laser Welding Process Parameters Optimization Using Variable-Fidelity Metamodel and NSGA-II

    Directory of Open Access Journals (Sweden)

    Wang Chaochao

    2017-01-01

Full Text Available An optimization methodology based on variable-fidelity (VF) metamodels and the nondominated sorting genetic algorithm II (NSGA-II) for laser bead-on-plate welding of stainless steel 316L is presented. The relationships between the input process parameters (laser power, welding speed and laser focal position) and the output responses (weld width and weld depth) are constructed by VF metamodels. In the VF metamodels, information from models of two fidelity levels is integrated: the low-fidelity (LF) model is a finite element simulation model that is used to capture the general trend of the metamodel, and the high-fidelity (HF) model, which comes from physical experiments, is used to ensure the accuracy of the metamodel. The accuracy of the VF metamodel is verified by actual experiments. To solve the optimization problem, NSGA-II is used to search for multi-objective Pareto optimal solutions. The results of verification experiments show that the obtained optimal parameters are effective and reliable.
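A minimal version of the variable-fidelity idea: keep the low-fidelity trend and fit a low-order correction to the LF/HF discrepancy at the few points where high-fidelity data exist. The functions below are illustrative stand-ins for the FE model and the experiments.

```python
import numpy as np

# Illustrative stand-ins: hf() plays the physical experiments (scarce, exact),
# lf() the cheap finite-element trend (biased).
hf = lambda p: 0.5 * p + 0.08 * np.sin(p)   # "weld width" vs. laser power
lf = lambda p: 0.45 * p

# Fit a low-order correction to the LF/HF discrepancy at the few HF points,
# then define the variable-fidelity metamodel as LF trend + correction.
P_hf = np.linspace(0.5, 3.0, 6)
corr = np.polyfit(P_hf, hf(P_hf) - lf(P_hf), 2)
vf = lambda p: lf(p) + np.polyval(corr, p)
```

At a point not in the HF design, the VF metamodel should track the "experiments" far better than the raw LF trend.
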

  14. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institue, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

The kernel density was determined based on sampling points obtained from a Markov chain simulation and was used as the importance sampling function. A Kriging metamodel was constructed in greater detail in the vicinity of the limit state. The failure probability was then calculated by importance sampling performed on the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for the kernel density in the vicinity of the limit state. A stable numerical method was proposed to find a parameter of the kernel density. To assess the adequacy of the Kriging metamodel, the possible change in the calculated failure probability due to the uncertainty of the Kriging metamodel was also calculated.
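The importance-sampling step itself (independent of the Kriging part) can be sketched for a one-dimensional linear limit state, where the exact failure probability is known; the limit state and proposal density below are textbook illustrations, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Limit state g(x) < 0 denotes failure, with x ~ N(0, 1);
# for g(x) = beta - x the exact failure probability is Phi(-beta).
beta = 3.0
g = lambda x: beta - x

# Importance-sampling density h = N(beta, 1), centered at the design point,
# so failure samples are no longer rare.
n = 20_000
xs = rng.normal(beta, 1.0, n)
log_w = -0.5 * xs**2 + 0.5 * (xs - beta) ** 2   # log(phi(x) / h(x))
pf = np.mean((g(xs) < 0) * np.exp(log_w))        # approx 1.35e-3
```

Crude Monte Carlo would need millions of samples to resolve a probability of this order; the shifted density concentrates samples near the limit state and the weights correct for the shift.
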

  15. Effectiveness of meta-models for multi-objective optimization of centrifugal impeller

    Energy Technology Data Exchange (ETDEWEB)

    Bellary, Sayed Ahmed Imran; Samad, Abdus [Indian Institute of Technology Madras, Chennai (India); Husain, Afzal [Sultan Qaboos University, Al-Khoudh (Oman)

    2014-12-15

The success of multiple-fidelity-based analysis and optimization of fluid machinery systems depends upon the proper construction of the low-fidelity model, or meta-model. A low-fidelity model uses responses obtained from a high-fidelity model, and the meta-model is then used to produce the population of solutions required by the evolutionary algorithm for multi-objective optimization. The Pareto-optimal front, which shows the functional relationships among the multiple objectives, can produce erroneous results if the low-fidelity models are not well constructed. In the present research, response surface approximation and Kriging meta-models were evaluated for their effectiveness in turbomachinery design and optimization. A high-fidelity model (a CFD technique), along with the metamodels, was used to obtain the Pareto-optimal front via a multi-objective genetic algorithm. A centrifugal impeller was considered as a case study to find the relationship between two conflicting objectives, viz., hydraulic efficiency and head. Design variables were chosen from the impeller geometry, and the responses of the objective functions were evaluated through CFD analysis. The fidelity of each metamodel is discussed in the context of its predictions over the entire design space in general and near the optimal region in particular. Exploiting multiple meta-models enhances the quality of multi-objective optimization and provides information on the fidelity of the optimization model. It was observed that the Kriging meta-model was better suited for this type of problem, as it involved less approximation error in the Pareto-optimal front.

  16. Effectiveness of meta-models for multi-objective optimization of centrifugal impeller

    International Nuclear Information System (INIS)

    Bellary, Sayed Ahmed Imran; Samad, Abdus; Husain, Afzal

    2014-01-01

The success of multiple-fidelity-based analysis and optimization of fluid machinery systems depends upon the proper construction of the low-fidelity model, or meta-model. A low-fidelity model uses responses obtained from a high-fidelity model, and the meta-model is then used to produce the population of solutions required by the evolutionary algorithm for multi-objective optimization. The Pareto-optimal front, which shows the functional relationships among the multiple objectives, can produce erroneous results if the low-fidelity models are not well constructed. In the present research, response surface approximation and Kriging meta-models were evaluated for their effectiveness in turbomachinery design and optimization. A high-fidelity model (a CFD technique), along with the metamodels, was used to obtain the Pareto-optimal front via a multi-objective genetic algorithm. A centrifugal impeller was considered as a case study to find the relationship between two conflicting objectives, viz., hydraulic efficiency and head. Design variables were chosen from the impeller geometry, and the responses of the objective functions were evaluated through CFD analysis. The fidelity of each metamodel is discussed in the context of its predictions over the entire design space in general and near the optimal region in particular. Exploiting multiple meta-models enhances the quality of multi-objective optimization and provides information on the fidelity of the optimization model. It was observed that the Kriging meta-model was better suited for this type of problem, as it involved less approximation error in the Pareto-optimal front.
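The notion of a Pareto-optimal front used here can be made concrete with a small non-dominance filter (pure NumPy; the objective values are made up, with both objectives to be minimised):

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated rows of `points`, all objectives minimised."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some point is no worse everywhere, better somewhere
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return pts[keep]

# Made-up objective pairs, e.g. (1 - efficiency, -head): [3, 3] is dominated
# by [2, 2] and drops out of the front.
objs = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
front = pareto_front(objs)
```
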

  17. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus only on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research on BP meta-models is needed to reflect the evolution/change of BPs. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper presents a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  18. Unifying approach for model transformations in the MOF metamodeling architecture

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas

    2004-01-01

    In the Meta Object Facility (MOF) metamodeling architecture a number of model transformation scenarios can be identified. It could be expected that a metamodeling architecture will be accompanied by a transformation technology supporting the model transformation scenarios in a uniform way. Despite

  19. Optimization Using Metamodeling in the Context of Integrated Computational Materials Engineering (ICME)

    Energy Technology Data Exchange (ETDEWEB)

    Hammi, Youssef; Horstemeyer, Mark F; Wang, Paul; David, Francis; Carino, Ricolindo

    2013-11-18

Predictive Design Technologies, LLC (PDT) proposed to employ Integrated Computational Materials Engineering (ICME) tools to help the manufacturing industry in the United States regain the competitive advantage in the global economy. ICME uses computational materials science tools within a holistic system in order to accelerate materials development, improve design optimization, and unify design and manufacturing. With the advent of accurate modeling and simulation along with significant increases in high performance computing (HPC) power, virtual design and manufacturing using ICME tools provide the means to reduce product development time and cost by alleviating costly trial-and-error physical design iterations while improving overall quality and manufacturing efficiency. To reduce the computational cost necessary for the large-scale HPC simulations and to make the methodology accessible for small and medium-sized manufacturers (SMMs), metamodels are employed. Metamodels are approximate models (functional relationships between input and output variables) that can reduce the simulation times by one to two orders of magnitude. In Phase I, PDT, partnered with Mississippi State University (MSU), demonstrated the feasibility of the proposed methodology by employing MSU's internal state variable (ISV) plasticity-damage model with the help of metamodels to optimize the microstructure-process-property-cost for tube manufacturing processes used by Plymouth Tube Company (PTC), which involves complicated temperature and mechanical loading histories. PDT quantified the microstructure-property relationships for PTC's SAE J525 electric resistance-welded cold drawn low carbon hydraulic 1010 steel tube manufacturing processes at seven different material states and calibrated the ISV plasticity material parameters to fit experimental tensile stress-strain curves.
PDT successfully performed large scale finite element (FE) simulations in an HPC environment using the ISV plasticity

  20. A MOF Metamodel for the Development of Context-Aware Mobile Applications

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Leite, M.M.; Calvi, C.Z.; Mantovaneli Pessoa, Rodrigo; Pereira Filho, J.G.; Pereira Filho, J.

    Context-aware mobile applications are increasingly attracting interest of the research community. To facilitate the development of this class of applications, it is necessary that both applications and support platforms share a common context metamodel. This paper presents a metamodel defined using

  1. idSpace D2.3 – Semantic meta-model integration and transformations v2

    DEFF Research Database (Denmark)

    Dolog, Peter; Grube, Pascal; Schmid, Klaus

    2009-01-01

This deliverable discusses an extended set of requirements for transformations and a metamodel for creativity techniques. Based on the requirements, the deliverable provides a refined meta-model. The metamodel allows for more advanced transformation concepts besides the previously delivered graph tr...... oriented implementation with portlets and widgets in the Liferay portal.

  2. Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA

    Science.gov (United States)

    Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.

    2018-04-01

    Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3-) input functions by characterizing unsaturated zone NO3- transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous "vertical flux method" (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3- source concentration factor (which determines the local NO3- input concentration); unsaturated zone travel time; NO3- concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3- "extinction depth", the eventual steady state depth of the NO3- front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52 - 0.86 and 0.22 - 0.38, respectively, and predictions were compiled as maps of the above response variables. Testing
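A boosted-regression-tree metamodel can be illustrated in miniature with depth-one trees (stumps) and a single predictor; the test function, learning rate, and tree count below are arbitrary, and real BRTs (as used in this study) handle many predictors and deeper trees.

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split regression stump (depth-1 tree) for residuals r."""
    best = (np.inf, 0.0, 0.0, 0.0)
    for s in np.unique(x)[:-1]:
        left, right = r[x <= s], r[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, s, left.mean(), right.mean())
    return best[1:]

def boost(x, y, n_trees=300, lr=0.1):
    """Gradient boosting with squared loss: each stump fits the residual."""
    pred = np.full_like(y, y.mean())
    stumps = []
    for _ in range(n_trees):
        s, lv, rv = fit_stump(x, y - pred)
        pred += lr * np.where(x <= s, lv, rv)
        stumps.append((s, lv, rv))
    return y.mean(), stumps, lr

def predict(model, x):
    base, stumps, lr = model
    out = np.full_like(x, base, dtype=float)
    for s, lv, rv in stumps:
        out += lr * np.where(x <= s, lv, rv)
    return out

x = np.linspace(0.0, 1.0, 64)
y = np.sin(2.0 * np.pi * x)        # stand-in response surface
model = boost(x, y)
err = np.max(np.abs(predict(model, x) - y))
```
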

  3. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    Energy Technology Data Exchange (ETDEWEB)

    Horsey, Henry; Fleming, Katherine; Ball, Brian; Long, Nicholas

    2016-08-26

    Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path for reducing the time and difficulty. Many use cases remain, however, where information regarding whole-building energy usage is valuable, but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option to accurately estimate building energy consumption in real time is to pre-compute large datasets of potential building energy models, and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating the successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but also can be combined with public data to estimate the aggregate DR potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of topdown actuarial evaluations. This paper ends by exploring the benefits of using metamodels and then examines the cost-effectiveness of this approach.
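The pre-computation idea reduces, in its simplest form, to tabulating simulations on a grid of inputs once and interpolating at query time. The two inputs, their ranges, and the "simulation" below are invented for illustration:

```python
import numpy as np

# Pretend "simulation": annual energy use as a function of window-to-wall
# ratio and insulation R-value (names, ranges, and formula are invented).
simulate = lambda wwr, rvalue: 120.0 - 3.0 * rvalue + 40.0 * wwr

# Pre-compute the simulations once on a coarse grid...
wwrs = np.linspace(0.1, 0.9, 9)
rvals = np.linspace(1.0, 6.0, 11)
table = np.array([[simulate(w, r) for r in rvals] for w in wwrs])

# ...then answer queries by bilinear interpolation instead of a fresh run.
def lookup(wwr, rvalue):
    i = int(np.clip(np.searchsorted(wwrs, wwr) - 1, 0, len(wwrs) - 2))
    j = int(np.clip(np.searchsorted(rvals, rvalue) - 1, 0, len(rvals) - 2))
    tw = (wwr - wwrs[i]) / (wwrs[i + 1] - wwrs[i])
    tr = (rvalue - rvals[j]) / (rvals[j + 1] - rvals[j])
    return ((1 - tw) * (1 - tr) * table[i, j] + tw * (1 - tr) * table[i + 1, j]
            + (1 - tw) * tr * table[i, j + 1] + tw * tr * table[i + 1, j + 1])
```
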

  4. Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA

    Science.gov (United States)

    Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.

    2018-01-01

Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3−) input functions by characterizing unsaturated zone NO3− transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous “vertical flux method” (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3− source concentration factor (which determines the local NO3− input concentration); unsaturated zone travel time; NO3− concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3− “extinction depth”, the eventual steady state depth of the NO3− front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52 – 0.86 and 0.22 – 0.38, respectively, and predictions were compiled as maps of the above

  5. MISTRAL : A Language for Model Transformations in the MOF Meta-modeling Architecture

    NARCIS (Netherlands)

    Kurtev, Ivan; van den Berg, Klaas; Aßmann, Uwe; Aksit, Mehmet; Rensink, Arend

    2005-01-01

In the Meta Object Facility (MOF) meta-modeling architecture a number of model transformation scenarios can be identified. It could be expected that a meta-modeling architecture will be accompanied by a transformation technology supporting the model transformation scenarios in a uniform way. Despite

  6. Certified metamodels for sensitivity indices estimation

    Directory of Open Access Journals (Sweden)

    Prieur Clémentine

    2012-04-01

Full Text Available Global sensitivity analysis of a numerical code, more specifically estimation of the Sobol indices associated with input variables, generally requires a large number of model runs. When those demand too much computation time, it is necessary to perform the sensitivity analysis on a reduced model (metamodel), whose outputs are numerically close to those of the original model while being much faster to run. In this case, the estimated indices are subject to two kinds of error: sampling error, caused by the computation of the integrals appearing in the definition of the Sobol indices by a Monte-Carlo method, and metamodel error, caused by the replacement of the original model by the metamodel. In cases where we have certified bounds for the metamodel error, we propose a method to quantify both types of error, and we compute confidence intervals for first-order Sobol indices.
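The sampling-error component can be made concrete with a pick-freeze Monte-Carlo estimator of a first-order Sobol index, here applied to a toy linear function with a known index (an illustration of the estimator only, not the paper's certified-metamodel setup):

```python
import numpy as np

rng = np.random.default_rng(42)

# Cheap stand-in "metamodel": a linear function with known Sobol indices.
# For f(x) = 3*x1 + x2 with independent standard-normal inputs,
# the first-order index of x1 is 9 / (9 + 1) = 0.9.
def f(x):
    return 3.0 * x[:, 0] + x[:, 1]

N = 200_000
A = rng.standard_normal((N, 2))
B = rng.standard_normal((N, 2))

# Pick-freeze estimate of S_1: keep x1 from sample A, redraw everything else.
AB1 = B.copy()
AB1[:, 0] = A[:, 0]

fA, fAB1 = f(A), f(AB1)
S1 = np.mean(fA * fAB1 - fA.mean() * fAB1.mean()) / np.var(fA)
print(round(S1, 2))
```

The gap between this estimate and the exact 0.9 is pure sampling error; the paper's contribution is to account additionally for the metamodel error when f itself is a surrogate with certified bounds.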

  7. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples for complex models are the computational fluid dynamics...... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented at ISTSS 2016. In general, a metamodel quickly interpolates the consequences of a few scenarios simulated with the complex models to a large number of arbitrary scenarios...... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...

  8. Schizoanalysis as Metamodeling

    Directory of Open Access Journals (Sweden)

    Janell Watson

    2008-01-01

Full Text Available Félix Guattari, writing both on his own and with philosopher Gilles Deleuze, developed the notion of schizoanalysis out of his frustration with what he saw as the shortcomings of Freudian and Lacanian psychoanalysis, namely the orientation toward neurosis, emphasis on language, and lack of socio-political engagement. Guattari was analyzed by Lacan, attended the seminars from the beginning, and remained a member of Lacan's school until his death in 1992. His unorthodox Lacanism grew out of his clinical work with schizophrenics and involvement in militant politics. Paradoxically, even as he rebelled theoretically and practically against Lacan's 'mathemes of the unconscious' and topology of knots, Guattari ceaselessly drew diagrams and models. Deleuze once said of him that 'His ideas are drawings, or even diagrams.' Guattari's single-authored books are filled with strange figures, which borrow from fields as diverse as linguistics, cultural anthropology, chaos theory, energetics, and non-equilibrium thermodynamics. Guattari himself declared schizoanalysis a 'metamodeling,' but at the same time insisted that his models were constructed aesthetically, not scientifically, despite his liberal borrowing of scientific terminology. The practice of schizoanalytic metamodeling is complicated by his and Deleuze's concept of the diagram, which they define as a way of thinking that bypasses language, as for example in musical notation or mathematical formulas. This article will explore Guattari's models, in relation to Freud, Lacan, C.S. Peirce, Louis Hjelmslev, Noam Chomsky, and Ilya Prigogine. I will also situate his drawings in relation to his work as a practicing clinician, political activist, and co-author of Anti-Oedipus and A Thousand Plateaus.

  9. Metamodeling of Semantic Web Enabled Multiagent Systems

    NARCIS (Netherlands)

    Kardas, G.; Göknil, Arda; Dikenelli, O.; Topaloglu, N.Y.; Weyns, D.; Holvoet, T.

    2006-01-01

Several agent researchers are currently studying agent modeling and they propose different architectural metamodels for developing Multiagent Systems (MAS) according to specific agent development methodologies. When support for Semantic Web technology and its related constructs are considered, agent

  10. Mapping ground water vulnerability to pesticide leaching with a process-based metamodel of EuroPEARL.

    Science.gov (United States)

    Tiktak, A; Boesten, J J T I; van der Linden, A M A; Vanclooster, M

    2006-01-01

    To support EU policy, indicators of pesticide leaching at the European level are required. For this reason, a metamodel of the spatially distributed European pesticide leaching model EuroPEARL was developed. EuroPEARL considers transient flow and solute transport and assumes Freundlich adsorption, first-order degradation and passive plant uptake of pesticides. Physical parameters are depth dependent while (bio)-chemical parameters are depth, temperature, and moisture dependent. The metamodel is based on an analytical expression that describes the mass fraction of pesticide leached. The metamodel ignores vertical parameter variations and assumes steady flow. The calibration dataset was generated with EuroPEARL and consisted of approximately 60,000 simulations done for 56 pesticides with different half-lives and partitioning coefficients. The target variable was the 80th percentile of the annual average leaching concentration at 1-m depth from a time series of 20 yr. The metamodel explains over 90% of the variation of the original model with only four independent spatial attributes. These parameters are available in European soil and climate databases, so that the calibrated metamodel could be applied to generate maps of the predicted leaching concentration in the European Union. Maps generated with the metamodel showed a good similarity with the maps obtained with EuroPEARL, which was confirmed by means of quantitative performance indicators.
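One plausible shape for such a closed-form leaching metamodel, under the abstract's steady-flow, linearized-sorption, first-order-degradation assumptions (the functional form, symbols, and numbers below are illustrative, not EuroPEARL's actual expression):

```python
import math

def fraction_leached(q, L, theta, rho_b, K, k):
    """Mass fraction of pesticide leaching past depth L under steady flow.

    A hedged sketch of the kind of analytical expression a leaching
    metamodel can be built on; all parameter names are illustrative.
    q      steady water flux (m/yr)
    L      travel depth (m)
    theta  volumetric water content (-)
    rho_b  dry bulk density (kg/m3)
    K      linearized sorption coefficient (m3/kg)
    k      first-order degradation rate (1/yr)
    """
    R = 1.0 + rho_b * K / theta    # retardation factor from sorption
    tau = L * theta * R / q        # solute travel time to depth L (yr)
    return math.exp(-k * tau)      # first-order decay over the travel time

weak = fraction_leached(q=0.3, L=1.0, theta=0.3, rho_b=1400, K=1e-4, k=0.7)
strong = fraction_leached(q=0.3, L=1.0, theta=0.3, rho_b=1400, K=1e-3, k=0.7)
print(weak > strong)  # stronger sorption -> longer travel time -> less leaching
```

Calibrating such an expression against ~60,000 EuroPEARL runs, as the abstract describes, amounts to fitting its effective parameters from mappable soil and climate attributes.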

  11. Application of Metamodels to Identification of Metallic Materials Models

    Directory of Open Access Journals (Sweden)

    Maciej Pietrzyk

    2016-01-01

Full Text Available Improving the efficiency of the inverse analysis (IA) of various material tests was the objective of this paper. Flow stress models and microstructure evolution models of varying mathematical complexity were considered. Different types of experiments were performed, and the results were used for model identification. Sensitivity analysis was performed for all the models, and the importance of the parameters in these models was evaluated. Metamodels based on artificial neural networks were proposed to simulate the experiments in the inverse solution. The analysis showed that a significant decrease in computing time can be achieved when metamodels substitute for the finite element model in the inverse analysis, which is the case in the identification of flow stress models. Application of metamodels gave good results for flow stress models based on closed-form equations accounting for the influence of temperature, strain, and strain rate (4 coefficients), and additionally for softening due to recrystallization (5 coefficients) and for softening and saturation (7 coefficients). Good accuracy and high efficiency of the IA were confirmed. On the contrary, identification of microstructure evolution models, including phase transformation models, did not give a noticeable reduction in computing time.

  12. Global sensitivity analysis using a Gaussian Radial Basis Function metamodel

    International Nuclear Information System (INIS)

    Wu, Zeping; Wang, Donghui; Okolo N, Patrick; Hu, Fan; Zhang, Weihua

    2016-01-01

Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on response variables. Amongst the wide range of documented studies on sensitivity measures and analysis, Sobol' indices have received the greatest portion of attention due to the fact that they can provide accurate information for most models. In this paper, a novel analytical expression to compute the Sobol' indices is derived by introducing a method which uses the Gaussian Radial Basis Function to build metamodels of computationally expensive computer codes. Performance of the proposed method is validated against various analytical functions and also a structural simulation scenario. Results demonstrate that the proposed method is an efficient approach, requiring one to two orders of magnitude less computational cost than the traditional Quasi Monte Carlo-based evaluation of Sobol' indices. - Highlights: • An RBF-based sensitivity analysis method is proposed. • The Sobol' decomposition of a Gaussian RBF metamodel is obtained. • Sobol' indices of the Gaussian RBF metamodel are derived from the decomposition. • The efficiency of the proposed method is validated by numerical examples.

  13. A kriging metamodel-assisted robust optimization method based on a reverse model

    Science.gov (United States)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimum and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures where a large amount of computational effort is required because the robustness of each candidate solution delivered from the outer level should be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced into a single-loop optimization structure to ease the computational burden. Ignoring the interpolation uncertainties from kriging, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed because of the interpolation uncertainties from the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.
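The outer-inner structure that kriging assistance collapses can be sketched on a toy problem: fit a Gaussian-process (kriging) surrogate once, then run the worst-case inner loop on the surrogate instead of the true model (the objective function and all numbers below are invented for illustration):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Toy objective with a design variable x and an uncertain factor u.
def f(x, u):
    return (x - 0.5) ** 2 + 0.3 * np.sin(5 * x) * u

# Train a kriging (GP) surrogate on scattered (x, u) samples.
Xu = rng.uniform(0, 1, size=(150, 2))
y = f(Xu[:, 0], Xu[:, 1])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
gp.fit(Xu, y)

# Robust evaluation: worst case over u, approximated on the surrogate.
xs = np.linspace(0, 1, 101)
us = np.linspace(0, 1, 21)
grid = np.array([[x, u] for x in xs for u in us])
pred = gp.predict(grid).reshape(len(xs), len(us))
robust_obj = pred.max(axis=1)          # inner (worst-case) loop on the surrogate
x_robust = xs[np.argmin(robust_obj)]   # outer loop: best worst-case design
print(round(float(x_robust), 2))
```

The paper's refinement (IK-RMRO) goes further by also tracking the surrogate's own interpolation uncertainty when deciding whether the cheap inner loop is trustworthy.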

  14. SWAT meta-modeling as support of the management scenario analysis in large watersheds.

    Science.gov (United States)

    Azzellino, A; Çevirgen, S; Giupponi, C; Parati, P; Ragusa, F; Salvetti, R

    2015-01-01

In the last two decades, numerous models and modeling techniques have been developed to simulate nonpoint source pollution effects. Most models simulate the hydrological, chemical, and physical processes involved in the entrainment and transport of sediment, nutrients, and pesticides. Very often these models require a distributed modeling approach and are limited in scope by the requirement of homogeneity and by the need to manipulate extensive data sets. Physically based models are extensively used in this field as decision support for managing nonpoint source emissions. A common characteristic of this type of model is its demanding input of many state variables, which makes calibration more difficult and raises the effort and cost of implementing any simulation scenario. In this study the USDA Soil and Water Assessment Tool (SWAT) was used to model the Venice Lagoon Watershed (VLW), Northern Italy. A Multi-Layer Perceptron (MLP) network was trained on SWAT simulations and used as a meta-model for scenario analysis. The MLP meta-model was successfully trained and showed an overall accuracy higher than 70% on both the training and the evaluation sets, allowing a significant simplification in conducting scenario analysis.
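A minimal stand-in for this meta-modeling step: train an MLP on synthetic "simulator" runs and check classification accuracy against the paper's 70% bar (the data-generating rule, variable names, and network size are hypothetical):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic stand-in for SWAT runs: scenario inputs -> "meets target" label.
n = 1000
X = rng.uniform(0, 1, size=(n, 4))             # e.g. fertilizer rate, land-use shares
load = 2 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2]   # hypothetical nutrient load
yc = (load < 1.2).astype(int)                  # does the scenario meet the target?

X_tr, X_te, y_tr, y_te = train_test_split(X, yc, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
mlp.fit(X_tr, y_tr)
acc = mlp.score(X_te, y_te)
print(acc > 0.7)  # comparable bar to the reported >70% accuracy
```

Once trained, the MLP answers "what if" scenario questions in milliseconds, whereas each SWAT run can take minutes to hours.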

  15. A precategorical spatial-data metamodel

    OpenAIRE

    Steven A Roberts; G Brent Hall; Paul H Calamai

    2006-01-01

    Increasing recognition of the extent and speed of habitat fragmentation and loss, particularly in the urban fringe, is driving the need to analyze qualitatively and quantitatively regional landscape structure for decision support in land-use planning and environmental-policy implementation. The spatial analysis required in this area is not well served by existing spatial-data models. In this paper a new theoretical spatial-data metamodel is introduced as a tool for addressing such needs and a...

  16. Metamodel comparison and model comparison for safety assurance

    NARCIS (Netherlands)

    Luo, Y.; Engelen, L.J.P.; Brand, van den M.G.J.; Bondavelli, A.; Ceccarelli, A.; Ortmeier, F.

    2014-01-01

    In safety-critical domains, conceptual models are created in the form of metamodels using different concepts from possibly overlapping domains. Comparison between those conceptual models can facilitate the reuse of models from one domain to another. This paper describes the mappings detected when

  17. An organizational metamodel for hospital emergency departments.

    Science.gov (United States)

    Kaptan, Kubilay

    2014-10-01

    I introduce an organizational model describing the response of the hospital emergency department. The hybrid simulation/analytical model (called a "metamodel") can estimate a hospital's capacity and dynamic response in real time and incorporate the influence of damage to structural and nonstructural components on the organizational ones. The waiting time is the main parameter of response and is used to evaluate the disaster resilience of health care facilities. Waiting time behavior is described by using a double exponential function and its parameters are calibrated based on simulated data. The metamodel covers a large range of hospital configurations and takes into account hospital resources in terms of staff and infrastructures, operational efficiency, and the possible existence of an emergency plan; maximum capacity; and behavior both in saturated and overcapacitated conditions. The sensitivity of the model to different arrival rates, hospital configurations, and capacities and the technical and organizational policies applied during and before a disaster were investigated. This model becomes an important tool in the decision process either for the engineering profession or for policy makers.

  18. Property preservation and quality measures in meta-models

    NARCIS (Netherlands)

    Siem, A.Y.D.

    2008-01-01

    This thesis consists of three parts. Each part considers different sorts of meta-models. In the first part so-called Sandwich models are considered. In the second part Kriging models are considered. Finally, in the third part, (trigonometric) Polynomials and Rational models are studied.

  19. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms

    Science.gov (United States)

    Oltean, Gabriel; Ivanciu, Laura-Nicoleta

    2016-01-01

The design and verification of complex electronic systems, especially analog and mixed-signal ones, prove to be extremely time-consuming tasks if only circuit-level simulations are involved. A significant amount of time can be saved if a cost-effective solution is used for the extensive analysis of the system under all conceivable conditions. This paper proposes a data-driven method to build fast-to-evaluate but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameter combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1×10−5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general-purpose computer), and simplicity (less than 1 second to run the metamodel; the user only provides the parameter combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space.
A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each parameter on the
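A minimal sketch of the waveform-characterization step above, using one Haar wavelet level in plain NumPy (the waveform, the 50% coefficient cut, and all names are illustrative; the paper's actual transform is selected by a genetic algorithm):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# A toy "simulated waveform": a damped oscillation.
t = np.linspace(0, 1, 256)
w = np.exp(-5 * t) * np.sin(20 * np.pi * t)

a, d = haar_dwt(w)
# Compress: keep only the largest-magnitude detail coefficients,
# mimicking the "most relevant decomposition coefficients" selection.
d_kept = np.where(np.abs(d) >= np.quantile(np.abs(d), 0.5), d, 0.0)
w_rec = haar_idwt(a, d_kept)

print(np.allclose(haar_idwt(a, d), w))  # exact reconstruction with all coefficients
```

In the paper's scheme, a neural network predicts the retained coefficients from the system parameters, and the inverse transform then synthesizes the not-yet-simulated waveform.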

  20. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms.

    Directory of Open Access Journals (Sweden)

    Gabriel Oltean

Full Text Available The design and verification of complex electronic systems, especially analog and mixed-signal ones, prove to be extremely time-consuming tasks if only circuit-level simulations are involved. A significant amount of time can be saved if a cost-effective solution is used for the extensive analysis of the system under all conceivable conditions. This paper proposes a data-driven method to build fast-to-evaluate but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameter combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1×10−5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general-purpose computer), and simplicity (less than 1 second to run the metamodel; the user only provides the parameter combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space.
A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each

  2. Meta-modeling soil organic carbon sequestration potential and its application at regional scale.

    Science.gov (United States)

    Luo, Zhongkui; Wang, Enli; Bryan, Brett A; King, Darran; Zhao, Gang; Pan, Xubin; Bende-Michl, Ulrike

    2013-03-01

    Upscaling the results from process-based soil-plant models to assess regional soil organic carbon (SOC) change and sequestration potential is a great challenge due to the lack of detailed spatial information, particularly soil properties. Meta-modeling can be used to simplify and summarize process-based models and significantly reduce the demand for input data and thus could be easily applied on regional scales. We used the pre-validated Agricultural Production Systems sIMulator (APSIM) to simulate the impact of climate, soil, and management on SOC at 613 reference sites across Australia's cereal-growing regions under a continuous wheat system. We then developed a simple meta-model to link the APSIM-modeled SOC change to primary drivers, i.e., the amount of recalcitrant SOC, plant available water capacity of soil, soil pH, and solar radiation, temperature, and rainfall in the growing season. Based on high-resolution soil texture data and 8165 climate data points across the study area, we used the meta-model to assess SOC sequestration potential and the uncertainty associated with the variability of soil characteristics. The meta-model explained 74% of the variation of final SOC content as simulated by APSIM. Applying the meta-model to Australia's cereal-growing regions reveals regional patterns in SOC, with higher SOC stock in cool, wet regions. Overall, the potential SOC stock ranged from 21.14 to 152.71 Mg/ha with a mean of 52.18 Mg/ha. Variation of soil properties induced uncertainty ranging from 12% to 117% with higher uncertainty in warm, wet regions. In general, soils in Australia's cereal-growing regions under continuous wheat production were simulated as a sink of atmospheric carbon dioxide with a mean sequestration potential of 8.17 Mg/ha.
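A hedged sketch of how a calibrated meta-model can propagate soil-property variability into an SOC uncertainty range, as the abstract describes at regional scale (the functional form, coefficients, and distributions below are invented for illustration, not the fitted APSIM meta-model):

```python
import numpy as np

rng = np.random.default_rng(4)

# A hypothetical calibrated meta-model: SOC stock (Mg/ha) as a simple
# function of plant-available water capacity (PAWC, mm), pH, and growing
# season rainfall (mm). Form and coefficients are illustrative only.
def soc_metamodel(pawc, ph, rain):
    return 20 + 0.08 * pawc - 2.0 * (ph - 7.0) + 0.05 * rain

# Regional application: propagate the variability of soil properties
# at one climate cell to get an uncertainty range for potential SOC.
pawc = rng.normal(120, 25, 10_000)   # uncertain soil property
ph = rng.normal(6.5, 0.4, 10_000)    # uncertain soil property
rain = 350.0                         # fixed climate driver for this cell

soc = soc_metamodel(pawc, ph, rain)
lo, hi = np.percentile(soc, [5, 95])
print(f"90% band width: {hi - lo:.1f} Mg/ha")
```

Because the meta-model is a cheap closed form, this Monte-Carlo propagation can be repeated for every grid cell, which is infeasible with the full process-based simulator.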

  3. Meta-modelling, visualization and emulation of multi-dimensional data for virtual production intelligence

    Science.gov (United States)

    Schulz, Wolfgang; Hermanns, Torsten; Al Khawli, Toufik

    2017-07-01

Decision making for competitive production in high-wage countries is a daily challenge where rational and irrational methods are used. The design of decision-making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin's rule for decision making, formulated in London in 1772 and called "Prudential Algebra" in the sense of weighing prudential reasons, one of the major ingredients of Meta-Modelling can be identified: a single algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes advances in Meta-Modelling techniques applied to multi-dimensional and multi-criterial optimization by identifying the persistence level of the corresponding Morse-Smale Complex. Implementations for laser cutting and laser drilling are presented, including the generation of fast and frugal Meta-Models with controlled error based on mathematical model reduction. Reduced Models are derived to avoid any unnecessary complexity. Both model reduction and analysis of the multi-dimensional parameter space are used to enable interactive communication between Discovery Finders and Invention Makers. Emulators and visualizations of a metamodel are introduced as components of Virtual Production Intelligence, making the methods of Scientific Design Thinking applicable and improving the skills of both developers and operators.

  4. Developing and applying metamodels of high resolution process-based simulations for high throughput exposure assessment of organic chemicals in riverine ecosystems.

    Science.gov (United States)

    Craig Barber, M; Isaacs, Kristin K; Tebes-Stevens, Caroline

    2017-12-15

As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), "(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels." The goals of metamodeling include, but are not limited to (1) developing functional or statistical relationships between a model's input and output variables for model analysis, interpretation, or information consumption by users' clients; (2) quantifying a model's sensitivity to alternative or uncertain forcing functions, initial conditions, or parameters; and (3) characterizing the model's response or state space. Using five models developed by the US Environmental Protection Agency, we generate a metamodeling database of the expected environmental and biological concentrations of 644 organic chemicals released into nine US rivers from wastewater treatment works (WTWs) assuming multiple loading rates and sizes of populations serviced. The chemicals of interest have log n-octanol/water partition coefficients (log KOW) ranging from 3 to 14, and the rivers of concern have mean annual discharges ranging from 1.09 to 3240 m3/s. Log-linear regression models are derived to predict mean annual dissolved and total water concentrations and total sediment concentrations of chemicals of concern based on their log KOW, Henry's Law Constant, and WTW loading rate and on the mean annual discharges of the receiving rivers. Metamodels are also derived to predict mean annual chemical concentrations in fish, invertebrates, and periphyton. We corroborate a subset of these metamodels using field studies focused on brominated flame retardants and discuss their application for high-throughput screening of exposures to human and ecological populations and for analysis and interpretation of field data. Published by Elsevier B.V.
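The log-linear regression step can be illustrated with synthetic data (the "true" coefficients, units, and ranges below are hypothetical, chosen only to show the fitting recipe, not the published metamodels):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic chemicals and rivers, spanning ranges like those in the abstract.
n = 400
logKow = rng.uniform(3, 14, n)        # log Kow in [3, 14]
log_load = rng.uniform(-2, 2, n)      # log10 WTW loading rate (illustrative units)
log_Q = rng.uniform(0, 3.5, n)        # log10 mean annual discharge (m3/s)

# Hypothetical "true" log-linear relation for total water concentration,
# standing in for the process-model output being metamodeled.
logC = (-1.0 + 0.9 * log_load - 0.95 * log_Q - 0.05 * logKow
        + rng.normal(0, 0.1, n))

# Fit the log-linear regression metamodel by ordinary least squares.
A = np.column_stack([np.ones(n), log_load, log_Q, logKow])
beta, *_ = np.linalg.lstsq(A, logC, rcond=None)
print(np.round(beta, 2))  # intercept and slopes on the three predictors
```

The fitted slopes recover the generating coefficients, which is exactly the property that makes such metamodels usable for high-throughput screening over many chemicals and rivers.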

  5. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  6. Evaluation of convergence behavior of metamodeling techniques for bridging scales in multi-scale multimaterial simulation

    International Nuclear Information System (INIS)

    Sen, Oishik; Davis, Sean; Jacobs, Gustaaf; Udaykumar, H.S.

    2015-01-01

The effectiveness of several metamodeling techniques, viz. the Polynomial Stochastic Collocation method, the Adaptive Stochastic Collocation method, a Radial Basis Function Neural Network, a Kriging method and a Dynamic Kriging (DKG) method, is evaluated. This is done with the express purpose of using metamodels to bridge scales between micro- and macro-scale models in a multi-scale multimaterial simulation. The rate of convergence of the error when used to reconstruct hypersurfaces of known functions is studied. For a sufficiently large number of training points, Stochastic Collocation methods generally converge faster than the other metamodeling techniques, while the DKG method converges faster when the number of input points is less than 100 in a two-dimensional parameter space. Because the input points correspond to computationally expensive micro/meso-scale computations, the DKG is favored for bridging scales in a multi-scale solver
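A toy version of such a convergence study, using 1-D piecewise-linear interpolation as a stand-in metamodel (the target function and point counts are illustrative, not the paper's hypersurfaces or methods):

```python
import numpy as np

# Reconstruct a known function from n training points and measure the
# maximum reconstruction error on a dense test grid; repeating this for
# growing n traces out the metamodel's convergence behavior.
def recon_error(n_train):
    x_train = np.linspace(0, 1, n_train)
    y_train = np.sin(2 * np.pi * x_train)
    x_test = np.linspace(0, 1, 1001)
    y_hat = np.interp(x_test, x_train, y_train)  # stand-in metamodel
    return np.max(np.abs(y_hat - np.sin(2 * np.pi * x_test)))

errs = [recon_error(n) for n in (5, 10, 20, 40, 80)]
print(all(e1 > e2 for e1, e2 in zip(errs, errs[1:])))  # error decreases with n
```

In the paper, the same kind of error-versus-training-points curve is what separates the collocation methods (better at large n) from dynamic kriging (better below ~100 points in 2-D).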

  7. Constraints on the nuclear equation of state from nuclear masses and radii in a Thomas-Fermi meta-modeling approach

    Science.gov (United States)

    Chatterjee, D.; Gulminelli, F.; Raduta, Ad. R.; Margueron, J.

    2017-12-01

    The question of correlations among empirical equation of state (EoS) parameters constrained by nuclear observables is addressed in a Thomas-Fermi meta-modeling approach. A recently proposed meta-modeling for the nuclear EoS in nuclear matter is augmented with a single finite size term to produce a minimal unified EoS functional able to describe the smooth part of the nuclear ground state properties. This meta-model can reproduce the predictions of a large variety of models, and interpolate continuously between them. An analytical approximation to the full Thomas-Fermi integrals is further proposed giving a fully analytical meta-model for nuclear masses. The parameter space is sampled and filtered through the constraint of nuclear mass reproduction with Bayesian statistical tools. We show that this simple analytical meta-modeling has a predictive power on masses, radii, and skins comparable to full Hartree-Fock or extended Thomas-Fermi calculations with realistic energy functionals. The covariance analysis on the posterior distribution shows that no physical correlation is present between the different EoS parameters. Concerning nuclear observables, a strong correlation between the slope of the symmetry energy and the neutron skin is observed, in agreement with previous studies.
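The Bayesian filtering step can be sketched generically: sample parameters from broad priors and keep only those whose predicted observable matches the data within its uncertainty (the toy "observable", priors, and tolerance below are invented for illustration, not the paper's Thomas-Fermi functional or mass data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Stand-in for a model prediction (e.g. a mass or radius) from two
# "EoS parameters"; the functional form is purely illustrative.
def observable(p1, p2):
    return 2.0 * p1 + 0.5 * p2 ** 2

N = 100_000
p1 = rng.uniform(-1, 1, N)   # prior samples for parameter 1
p2 = rng.uniform(-1, 1, N)   # prior samples for parameter 2
data, sigma = 1.0, 0.1       # "measured" value and its uncertainty

# Rejection filter: accept samples reproducing the data within 2 sigma.
keep = np.abs(observable(p1, p2) - data) < 2 * sigma
post1 = p1[keep]
print(post1.std() < p1.std())  # the data constrain p1: posterior narrower than prior
```

A covariance analysis on the accepted samples, as in the paper, would then reveal which parameter pairs the observable actually correlates.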

  8. Feature and Meta-Models in Clafer: Mixed, Specialized, and Coupled

    DEFF Research Database (Denmark)

    Bąk, Kacper; Czarnecki, Krzysztof; Wasowski, Andrzej

    2011-01-01

    constraints (such as mapping feature configurations to component configurations or model templates). Clafer also allows arranging models into multiple specialization and extension layers via constraints and inheritance. We identify four key mechanisms allowing a meta-modeling language to express feature...

  9. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  10. Foundations of Meta-Pyramids: Languages vs. Metamodels -- Episode II: Story of Thotus the Baboon

    OpenAIRE

    Favre, Jean-Marie

    2005-01-01

    Despite the recent interest for Model Driven Engineering approaches, the so-called four-layers metamodelling architecture is subject to a lot of debate. The relationship that exists between a model and a metamodel is often called instanceOf, but this terminology, which comes directly from the object oriented technology, is not appropriate for the modelling of similar meta-pyramids in other domains. The goal of this paper is to study which are the foundations of the meta-pyra...

  11. Surprise, Recipes for Surprise, and Social Influence.

    Science.gov (United States)

    Loewenstein, Jeffrey

    2018-02-07

    Surprising people can provide an opening for influencing them. Surprises garner attention, are arousing, are memorable, and can prompt shifts in understanding. Less noted is that, as a result, surprises can serve to persuade others by leading them to shifts in attitudes. Furthermore, because stories, pictures, and music can generate surprises and those can be widely shared, surprise can have broad social influence. People also tend to share surprising items with others, as anyone on social media has discovered. This means that in addition to broadcasting surprising information, surprising items can also spread through networks. The joint result is that surprise not only has individual effects on beliefs and attitudes but also collective effects on the content of culture. Items that generate surprise need not be random or accidental. There are predictable methods or recipes for generating surprise. One such recipe is discussed, the repetition-break plot structure, to explore the psychological and social possibilities of examining surprise. Recipes for surprise offer a useful means for understanding how surprise works and offer prospects for harnessing surprise to a wide array of ends. Copyright © 2017 Cognitive Science Society, Inc.

  12. Kolb's Experiential Learning Theory: A Meta-Model for Career Exploration.

    Science.gov (United States)

    Atkinson, George, Jr.; Murrell, Patricia H.

    1988-01-01

    Kolb's experiential learning theory offers the career counselor a meta-model with which to structure career exploration exercises and ensure a thorough investigation of self and the world of work in a manner that provides the client with an optimal amount of learning and personal development. (Author)

  13. Linear regression metamodeling as a tool to summarize and present simulation model results.

    Science.gov (United States)

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
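The approach reads directly as code. Below is a toy sketch (the two-parameter decision model and its values are invented for illustration, not taken from the paper): draws from a probabilistic sensitivity analysis are regressed on standardized inputs, so the intercept estimates the base-case outcome and the remaining coefficients rank parameter influence:

```python
import numpy as np

# Hypothetical two-parameter decision model: net benefit of a treatment
def net_benefit(p_cure, cost):
    return 50000.0 * p_cure - cost

rng = np.random.default_rng(1)
n = 10000                                   # PSA cohorts
p_cure = rng.normal(0.7, 0.05, n)
cost = rng.normal(20000.0, 2000.0, n)
y = net_benefit(p_cure, cost)

# Standardize the inputs; the intercept then estimates the base-case outcome
Z = np.column_stack([(p_cure - p_cure.mean()) / p_cure.std(),
                     (cost - cost.mean()) / cost.std()])
A = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta)  # [base-case net benefit, influence of p_cure, influence of cost]
```

The coefficient magnitudes play the role of one-way sensitivity analyses, but are estimated from all PSA draws at once, which is what makes the metamodel more reliable than varying one factor at a time.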

  14. Aggregate meta-models for evolutionary multiobjective and many-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Pilát, Martin; Neruda, Roman

    Roč. 116, 20 September (2013), s. 392-402 ISSN 0925-2312 R&D Projects: GA ČR GAP202/11/1368 Institutional support: RVO:67985807 Keywords : evolutionary algorithms * multiobjective optimization * many-objective optimization * surrogate models * meta-models * memetic algorithm Subject RIV: IN - Informatics, Computer Science Impact factor: 2.005, year: 2013

  15. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

    Directory of Open Access Journals (Sweden)

    Михаил Юрьевич Чернышов

    2013-12-01

    A software complex (SC) elaborated by the authors on the basis of the language LMPL and representing a software tool intended for synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP) principles is described. LMPL provides for an explicit form of declarative representation of MP-models, presumes automatic constructing and transformation of models and the capability of adding external software packages. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling) and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

  16. Customized sequential designs for random simulation experiments: Kriging metamodeling and bootstrapping

    NARCIS (Netherlands)

    Beers, van W.C.M.; Kleijnen, J.P.C.

    2005-01-01

    This paper proposes a novel method to select an experimental design for interpolation in random simulation, especially discrete event simulation. (Though the paper focuses on Kriging, this design approach may also apply to other types of metamodels such as linear regression models.) Assuming that

  17. UFO-element presentation in the metamodel structure of the Triune Continuum Paradigm

    OpenAIRE

    Ukrayinets, ?.

    2006-01-01

    This paper describes the results of a formal description of the UFO-element within the metamodel structure of the Triune Continuum Paradigm. This can support the development of methods for mutual transformation between system-object UFO models and UML models, enabling more effective information-system design, in particular the integration of the UFO-Toolkit with the visual modelling CASE tool Rational Rose.

  18. Accelerated optimizations of an electromagnetic acoustic transducer with artificial neural networks as metamodels

    Directory of Open Access Journals (Sweden)

    S. Wang

    2017-08-01

    Electromagnetic acoustic transducers (EMATs) are noncontact transducers generating ultrasonic waves directly in the conductive sample. Despite the advantages, their transduction efficiencies are relatively low, so it is imperative to build accurate multiphysics models of EMATs and optimize the structural parameters accordingly, using a suitable optimization algorithm. The optimizing process often involves a large number of runs of the computationally expensive numerical models, so metamodels as substitutes for the real numerical models are helpful for the optimizations. In this work the focus is on artificial neural networks as metamodels of an omnidirectional EMAT, including the multilayer feedforward networks trained with the basic and improved back propagation algorithms and the radial basis function networks with exact and nonexact interpolations. The developed neural-network programs are tested on an example problem. Then the model of an omnidirectional EMAT generating Lamb waves in a linearized steel plate is introduced, and various approaches to calculate the amplitudes of the displacement component waveforms are discussed. The neural-network metamodels are then built for the EMAT model and compared to the displacement component amplitude (or ratio of amplitudes) surface data on a discrete grid of the design variables as the reference, applying a multifrequency model with FFT (fast Fourier transform)/IFFT (inverse FFT) processing. Finally the two-objective optimization problem is formulated with one objective function minimizing the ratio of the amplitude of the S0-mode Lamb wave to that of the A0 mode, and the other minimizing the negative amplitude of the A0 mode. Pareto fronts in the criterion space are solved with the neural-network models and the total time consumption is greatly decreased. From the study it could be observed that the radial basis function network with exact interpolation has the best
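The "exact interpolation" variant of an RBF network mentioned in the record above uses one centre per training sample and solves a square linear system, so the surrogate reproduces the training data exactly. A minimal sketch with a Gaussian basis and invented stand-in data (not the EMAT model itself; the shape parameter `eps` is an assumption):

```python
import numpy as np

def rbf_exact(X_train, y_train, eps=1.0):
    """Gaussian RBF network with exact interpolation: one centre per sample."""
    def phi(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-eps * d2)
    w = np.linalg.solve(phi(X_train, X_train), y_train)   # interpolation weights
    return lambda X: phi(X, X_train) @ w

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (40, 2))              # toy design variables
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2   # stand-in for amplitude surface data
model = rbf_exact(X, y)
print(np.max(np.abs(model(X) - y)))  # residual at the training points
```

A nonexact (regularized) variant would add a small multiple of the identity to the interpolation matrix, trading exact reproduction for smoothness.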

  19. Application of Metamodels to Identification of Metallic Materials Models

    OpenAIRE

    Pietrzyk, Maciej; Kusiak, Jan; Szeliga, Danuta; Rauch, Łukasz; Sztangret, Łukasz; Górecki, Grzegorz

    2016-01-01

    Improvement of the efficiency of the inverse analysis (IA) for various material tests was the objective of the paper. Flow stress models and microstructure evolution models of various complexity of mathematical formulation were considered. Different types of experiments were performed and the results were used for the identification of models. Sensitivity analysis was performed for all the models and the importance of parameters in these models was evaluated. Metamodels based on artificial ne...

  20. Meta-modeling of the pesticide fate model MACRO for groundwater exposure assessments using artificial neural networks

    Science.gov (United States)

    Stenemo, Fredrik; Lindahl, Anna M. L.; Gärdenäs, Annemieke; Jarvis, Nicholas

    2007-08-01

    Several simple index methods that use easily accessible data have been developed and included in decision-support systems to estimate pesticide leaching across larger areas. However, these methods often lack important process descriptions (e.g. macropore flow), which brings into question their reliability. Descriptions of macropore flow have been included in simulation models, but these are too complex and demanding for spatial applications. To resolve this dilemma, a neural network simulation meta-model of the dual-permeability macropore flow model MACRO was created for pesticide groundwater exposure assessment. The model was parameterized using pedotransfer functions that require as input the clay and sand content of the topsoil and subsoil, and the topsoil organic carbon content. The meta-model also requires the topsoil pesticide half-life and the soil organic carbon sorption coefficient as input. A fully connected feed-forward multilayer perceptron classification network with two hidden layers, linked to fully connected feed-forward multilayer perceptron neural networks with one hidden layer, trained on sub-sets of the target variable, was shown to be a suitable meta-model for the intended purpose. A Fourier amplitude sensitivity test showed that the model output (the 80th percentile average yearly pesticide concentration at 1 m depth for a 20 year simulation period) was sensitive to all input parameters. The two input parameters related to pesticide characteristics (i.e. soil organic carbon sorption coefficient and topsoil pesticide half-life) were the most influential, but texture in the topsoil was also quite important since it was assumed to control the mass exchange coefficient that regulates the strength of macropore flow. This is in contrast to models based on the advection-dispersion equation where soil texture is relatively unimportant. The use of the meta-model is exemplified with a case-study where the spatial variability of pesticide leaching is
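A feed-forward MLP metamodel of the kind described above can be prototyped in a few lines, e.g. with scikit-learn. Everything below (the input variables, the response function, the network sizes) is an invented stand-in, only illustrating the regression setup, not the MACRO meta-model itself:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy inputs: clay fraction, sorption coefficient, half-life (all normalised 0-1)
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (400, 3))
y = np.exp(-2.0 * X[:, 1]) * X[:, 2] / (0.2 + X[:, 0])  # smooth fake "leaching"

# A small fully connected feed-forward network with two hidden layers
net = MLPRegressor(hidden_layer_sizes=(16, 16), solver="lbfgs",
                   max_iter=5000, random_state=0).fit(X, y)
print(net.score(X, y))  # R^2 of the meta-model on its training data
```

The paper's architecture is more elaborate (a classification network routing inputs to sub-networks trained on subsets of the target variable), but each component is a regressor of this general form.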

  1. A meta-model based approach for rapid formability estimation of continuous fibre reinforced components

    Science.gov (United States)

    Zimmerling, Clemens; Dörr, Dominik; Henning, Frank; Kärger, Luise

    2018-05-01

    Due to their high mechanical performance, continuous fibre reinforced plastics (CoFRP) become increasingly important for load bearing structures. In many cases, manufacturing CoFRPs comprises a forming process of textiles. To predict and optimise the forming behaviour of a component, numerical simulations are applied. However, for maximum part quality, both the geometry and the process parameters must match in mutual regard, which in turn requires numerous numerically expensive optimisation iterations. In both textile and metal forming, a lot of research has focused on determining optimum process parameters, whilst regarding the geometry as invariable. In this work, a meta-model based approach on component level is proposed, that provides a rapid estimation of the formability for variable geometries based on pre-sampled, physics-based draping data. Initially, a geometry recognition algorithm scans the geometry and extracts a set of doubly-curved regions with relevant geometry parameters. If the relevant parameter space is not part of an underlying data base, additional samples via Finite-Element draping simulations are drawn according to a suitable design-table for computer experiments. Time saving parallel runs of the physical simulations accelerate the data acquisition. Ultimately, a Gaussian Regression meta-model is built from the data base. The method is demonstrated on a box-shaped generic structure. The predicted results are in good agreement with physics-based draping simulations. Since evaluations of the established meta-model are numerically inexpensive, any further design exploration (e.g. robustness analysis or design optimisation) can be performed in short time. It is expected that the proposed method also offers great potential for future applications along virtual process chains: For each process step along the chain, a meta-model can be set-up to predict the impact of design variations on manufacturability and part performance. Thus, the method is

  2. Coordination of Project and Current Activities on the Basis of the Strategy Alignment Metamodel in the Oil and Gas Company

    Directory of Open Access Journals (Sweden)

    R. Yu. Dashkov

    2017-01-01

    Purpose: The purpose of this article is to describe the Strategy Alignment Metamodel of the project and current activities, which allows us to connect the Goals and Strategies for Phases of the project with the Goals and Strategies of the company at all levels of the organization through targeted measurement and application of Interpretive Models. By building Networks of Goals and Strategies and adopting organizational solutions, the interaction of the Project office and the departments of the company is coordinated. This methodology is based on a Logical Rationale of the Contexts and Assumptions for establishing Goals and Strategies both for the project and for the company, and on the preparation of Contexts and Assumptions, Goals and Strategies Alignment Matrices, which provides a flexible adaptation to the internal and external environment in the process of selecting the most successful Strategies to achieve the Goals. Methods: This article is based on the concept of Goals-Questions-Metrics+Strategies, which is adapted as a strategic monitoring and control system for projects: Goals-Phases-Metrics+Strategies. These concepts form the basis of the Strategy Alignment Metamodel, where Phases Earned Value Management is used as a measurement system for the project activity, and the Balanced Scorecard is applied for current operations. Results: The Strategy Alignment Metamodel of the project and current activities of the company is proposed. It uses modern strategic monitoring and control systems for projects (Goals-Phases-Metrics+Strategies) and for the company (Goals-Questions-Metrics+Strategies). The interaction between these systems is based on Contexts and Assumptions, Goals and Strategies Alignment Matrices. The existence of such matrices greatly simplifies management decisions and prevents the risk of delays in the execution of project Phases based on rational participation and coordination of the company

  3. An improved version of Inverse Distance Weighting metamodel assisted Harmony Search algorithm for truss design optimization

    Directory of Open Access Journals (Sweden)

    Y. Gholipour

    This paper focuses on a metamodel-based design optimization algorithm. The intention is to reduce its computational cost and improve its convergence rate. The metamodel-based optimization method introduced here provides the necessary means to do so through a surrogate. The algorithm is a combination of a high-quality approximation technique called Inverse Distance Weighting and a meta-heuristic algorithm called Harmony Search. The outcome is then polished by a semi-tabu search algorithm. The algorithm adopts a filtering system and determines the solution vectors for which exact simulation should be applied. Its performance is evaluated on standard truss design problems, showing a significant decrease in computational effort and an improved convergence rate.
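The Inverse Distance Weighting surrogate itself is simple to state: a prediction is the average of known responses weighted by 1/d^p. A minimal sketch with toy data (the power p = 2 is an assumption, and the paper's filtering and Harmony Search stages are omitted):

```python
import numpy as np

def idw(X_train, y_train, X_query, power=2.0, tol=1e-12):
    """Inverse Distance Weighting surrogate: weights decay as 1/d**power."""
    d = np.sqrt(((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1))
    out = np.empty(len(X_query))
    for i, di in enumerate(d):
        hit = di < tol
        if hit.any():                  # query coincides with a known sample
            out[i] = y_train[hit][0]
        else:
            w = 1.0 / di**power
            out[i] = w @ y_train / w.sum()
    return out

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, (30, 2))         # e.g. two truss design variables (toy)
y = X[:, 0] + np.sin(4.0 * X[:, 1])    # stand-in for exact-simulation results
print(idw(X, y, np.array([[0.5, 0.5]])))  # interpolated response
```

By construction an IDW prediction always lies between the smallest and largest observed response, which makes it a cheap but conservative surrogate for filtering candidate designs.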

  4. SM4AM: A Semantic Metamodel for Analytical Metadata

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    Next generation BI systems emerge as platforms where traditional BI tools meet semi-structured and unstructured data coming from the Web. In these settings, the user-centric orientation represents a key characteristic for the acceptance and wide usage by numerous and diverse end users in their data....... We present SM4AM, a Semantic Metamodel for Analytical Metadata created as an RDF formalization of the Analytical Metadata artifacts needed for user assistance exploitation purposes in next generation BI systems. We consider the Linked Data initiative and its relevance for user assistance...

  5. idSpace D2.2 – Semantic meta-model, integration and transformations v1

    DEFF Research Database (Denmark)

    Dolog, Peter; Lin, Yujian; Dols, Roger

    2009-01-01

    This report introduces a topic maps based meta-model for creativity techniques, creativity process, and idea maps as results from creativity process. It proposes a graph based and hierarchical graph based transformation of idea maps for combination and integration of results of different creativi...

  6. Surprise... Surprise..., An Empirical Investigation on How Surprise is Connected to Customer Satisfaction

    NARCIS (Netherlands)

    J. Vanhamme (Joëlle)

    2003-01-01

    textabstractThis research investigates the specific influence of the emotion of surprise on customer transaction-specific satisfaction. Four empirical studies-two field studies (a diary study and a cross section survey) and two experiments-were conducted. The results show that surprise positively

  7. UML Profile for Mining Process: Supporting Modeling and Simulation Based on Metamodels of Activity Diagram

    Directory of Open Access Journals (Sweden)

    Andrea Giubergia

    2014-01-01

    A UML profile describes a lightweight extension mechanism to the UML by defining custom stereotypes, tagged values, and constraints. Profiles are used to adapt the UML metamodel to different platforms and domains. In this paper we present a UML profile for models supporting event-driven simulation. In particular, we use the Arena simulation tool and we focus on the mining process domain. Profiles provide an easy way to obtain well-defined specifications, regulated by the Object Management Group (OMG). They can be used as a presimulation technique to obtain solid models for the mining industry. In this work we present a new profile to extend the UML metamodel; in particular we focus on the activity diagram. This extended model is applied to an industry problem involving loading and transportation of minerals in the field of mining.

  8. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML

    NARCIS (Netherlands)

    Koper, Rob

    2003-01-01

    This text is a short summary of the work on pedagogical analysis carried out when EML (Educational Modelling Language) was being developed. Because we address pedagogical meta-models, the consequence is that I must justify the underlying pedagogical models it describes. I have included a (far from

  9. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    We describe results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing diverse bodies of expertise and information that is necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.

  10. Providing the meta-model of development of competency using the meta-ethnography approach: Part 2. Synthesis of the available competency development models

    Directory of Open Access Journals (Sweden)

    Shahram Yazdani

    2016-12-01

    Background and Purpose: Considering the importance and necessity of competency-based education at a global level, and given globalization and the requirement of minimum competencies in medical fields, medical education communities and organizations worldwide have tried to determine competencies and to present frameworks and education models that ensure the ability of all graduates. In the literature, we observed numerous competency development models that refer to the same issues with different terminologies. It seems that evaluation and synthesis of all these models can finally result in designing a comprehensive meta-model for competency development. Methods: Meta-ethnography is a useful method for the synthesis of qualitative research that is used to develop models that interpret the results of several studies. Since the aim of this study is ultimately to provide a competency development meta-model, in the previous part of the study a literature review was conducted to identify competency development models. The models obtained through the search were studied in detail; in this part, the key concepts of the models and the overarching concepts were extracted, the models' concepts were reciprocally translated, and the available competency development models were synthesized. Results: A competency development meta-model is presented, together with a redefinition of the Dreyfus brothers' model. Conclusions: Given the importance of competency-based education at a global level and the need to review curricula and competency-based curriculum design, a competency development meta-model is required as a basis for curriculum development. As a variety of competency development models are available, this study attempted to synthesize them. Keywords: Meta-ethnography, Competency development, Meta-model, Qualitative synthesis

  11. Mapping ground water vulnerability to pesticide leaching with a process-based metamodel of EuroPEARL

    NARCIS (Netherlands)

    Tiktak, A.; Boesten, J.J.T.I.; Linden, van der A.M.A.; Vanclooster, M.

    2006-01-01

    To support EU policy, indicators of pesticide leaching at the European level are required. For this reason, a metamodel of the spatially distributed European pesticide leaching model EuroPEARL was developed. EuroPEARL considers transient flow and solute transport and assumes Freundlich adsorption,

  12. Metamodeling-based approach for risk assessment and cost estimation: Application to geological carbon sequestration planning

    Science.gov (United States)

    Sun, Alexander Y.; Jeong, Hoonyoung; González-Nicolás, Ana; Templeton, Thomas C.

    2018-04-01

    Carbon capture and storage (CCS) is being evaluated globally as a geoengineering measure for significantly reducing greenhouse gas emissions. However, long-term liability associated with potential leakage from these geologic repositories is perceived as a major barrier to entry for site operators. Risk quantification and impact assessment help CCS operators to screen candidate sites for suitability of CO2 storage. Leakage risks are highly site dependent, and a quantitative understanding and categorization of these risks can only be made possible through broad participation and deliberation of stakeholders, with the use of site-specific, process-based models as the decision basis. Online decision making, however, requires that scenarios be run in real time. In this work, a Python-based Leakage Assessment and Cost Estimation (PyLACE) web application was developed for quantifying financial risks associated with potential leakage from geologic carbon sequestration sites. PyLACE aims to assist a collaborative, analytic-deliberative decision-making process by automating metamodel creation, knowledge sharing, and online collaboration. In PyLACE, metamodeling, which is a process of developing faster-to-run surrogates of process-level models, is enabled using a special stochastic response surface method and Gaussian process regression. Both methods allow consideration of model parameter uncertainties and the use of that information to generate confidence intervals on model outputs. Training of the metamodels is delegated to a high performance computing cluster and is orchestrated by a set of asynchronous job scheduling tools for job submission and result retrieval. As a case study, the workflow and main features of PyLACE are demonstrated using a multilayer carbon storage model.

  13. A Developed Meta-model for Selection of Cotton Fabrics Using Design of Experiments and TOPSIS Method

    Science.gov (United States)

    Chakraborty, Shankar; Chatterjee, Prasenjit

    2017-12-01

    Selection of cotton fabrics for providing optimal clothing comfort is often considered a multi-criteria decision making problem consisting of an array of candidate alternatives to be evaluated based on several conflicting properties. In this paper, design of experiments and the technique for order preference by similarity to ideal solution (TOPSIS) are integrated so as to develop regression meta-models for identifying the most suitable cotton fabrics with respect to the computed TOPSIS scores. The applicability of the adopted method is demonstrated using two real-time examples. The developed models can also identify the statistically significant fabric properties and their interactions affecting the measured TOPSIS scores and final selection decisions. There exists a good degree of congruence between the ranking patterns derived using these meta-models and the existing methods for cotton fabric ranking and subsequent selection.
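For reference, the TOPSIS scoring step itself fits in a short function. The fabric data and weights below are invented for illustration, not the paper's measurements:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """TOPSIS: relative closeness of each alternative to the ideal solution."""
    M = np.asarray(matrix, float)
    M = M / np.sqrt((M**2).sum(axis=0))          # vector normalisation
    M = M * np.asarray(weights, float)           # weighted normalised matrix
    ideal = np.where(benefit, M.max(0), M.min(0))
    worst = np.where(benefit, M.min(0), M.max(0))
    d_best = np.sqrt(((M - ideal) ** 2).sum(1))
    d_worst = np.sqrt(((M - worst) ** 2).sum(1))
    return d_worst / (d_best + d_worst)          # scores in [0, 1]; larger is better

# Three hypothetical fabrics rated on comfort (benefit) and weight (cost)
scores = topsis([[7, 120], [9, 150], [6, 100]],
                weights=[0.6, 0.4], benefit=[True, False])
print(scores.argmax())  # index of the preferred fabric
```

In the paper, regression meta-models are then fitted to such scores over a designed experiment, so new alternatives can be ranked without recomputing the full decision matrix.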

  14. Low cost metamodel for robust design of periodic nonlinear coupled micro-systems

    Directory of Open Access Journals (Sweden)

    Chikhaoui K.

    2016-01-01

    To achieve robust design in the presence of uncertainty, nonlinearity and structural periodicity, a metamodel combining the Latin Hypercube Sampling (LHS) method for uncertainty propagation and an enriched Craig-Bampton Component Mode Synthesis (CB-CMS) approach for model reduction is proposed. Its application to predicting the time responses of a stochastic periodic nonlinear micro-system proves its efficiency in terms of accuracy and reduction of computational cost.
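Latin Hypercube Sampling, the uncertainty-propagation ingredient above, can be sketched in a few lines: each of the n strata of every input dimension receives exactly one sample (a generic sketch, not the authors' code):

```python
import numpy as np

def latin_hypercube(n, dim, seed=None):
    """n samples in [0, 1]^dim with one sample per stratum in every dimension."""
    rng = np.random.default_rng(seed)
    # A random permutation of strata 0..n-1 per dimension, jittered within strata
    strata = rng.permuted(np.tile(np.arange(n), (dim, 1)), axis=1).T
    return (strata + rng.uniform(size=(n, dim))) / n

pts = latin_hypercube(8, 3, seed=0)
print(np.sort(np.floor(pts * 8), axis=0))  # every column hits each stratum once
```

Compared with plain Monte Carlo, this stratification covers each marginal distribution evenly, which is why far fewer samples suffice when each sample is an expensive model run.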

  15. Polynomial meta-models with canonical low-rank approximations: Numerical insights and comparison to sparse polynomial chaos expansions

    International Nuclear Information System (INIS)

    Konakli, Katerina; Sudret, Bruno

    2016-01-01

    The growing need for uncertainty analysis of complex computational models has led to an expanding use of meta-models across engineering and sciences. The efficiency of meta-modeling techniques relies on their ability to provide statistically equivalent analytical representations based on relatively few evaluations of the original model. Polynomial chaos expansions (PCE) have proven a powerful tool for developing meta-models in a wide range of applications; the key idea thereof is to expand the model response onto a basis made of multivariate polynomials obtained as tensor products of appropriate univariate polynomials. The classical PCE approach nevertheless faces the “curse of dimensionality”, namely the exponential increase of the basis size with increasing input dimension. To address this limitation, the sparse PCE technique has been proposed, in which the expansion is carried out on only a few relevant basis terms that are automatically selected by a suitable algorithm. An alternative for developing meta-models with polynomial functions in high-dimensional problems is offered by the newly emerged low-rank approximations (LRA) approach. By exploiting the tensor-product structure of the multivariate basis, LRA can provide polynomial representations in highly compressed formats. Through extensive numerical investigations, we herein first shed light on issues relating to the construction of canonical LRA with a particular greedy algorithm involving a sequential updating of the polynomial coefficients along separate dimensions. Specifically, we examine the selection of optimal rank, stopping criteria in the updating of the polynomial coefficients, and error estimation. In the sequel, we compare canonical LRA with sparse PCE in structural-mechanics and heat-conduction applications based on finite-element solutions. Canonical LRA exhibit smaller errors than sparse PCE in cases when the number of available model evaluations is small with respect to the input dimension.

  16. Polynomial meta-models with canonical low-rank approximations: Numerical insights and comparison to sparse polynomial chaos expansions

    Energy Technology Data Exchange (ETDEWEB)

    Konakli, Katerina, E-mail: konakli@ibk.baug.ethz.ch; Sudret, Bruno

    2016-09-15

    The growing need for uncertainty analysis of complex computational models has led to an expanding use of meta-models across engineering and sciences. The efficiency of meta-modeling techniques relies on their ability to provide statistically equivalent analytical representations based on relatively few evaluations of the original model. Polynomial chaos expansions (PCE) have proven a powerful tool for developing meta-models in a wide range of applications; the key idea thereof is to expand the model response onto a basis made of multivariate polynomials obtained as tensor products of appropriate univariate polynomials. The classical PCE approach nevertheless faces the “curse of dimensionality”, namely the exponential increase of the basis size with increasing input dimension. To address this limitation, the sparse PCE technique has been proposed, in which the expansion is carried out on only a few relevant basis terms that are automatically selected by a suitable algorithm. An alternative for developing meta-models with polynomial functions in high-dimensional problems is offered by the newly emerged low-rank approximations (LRA) approach. By exploiting the tensor-product structure of the multivariate basis, LRA can provide polynomial representations in highly compressed formats. Through extensive numerical investigations, we herein first shed light on issues relating to the construction of canonical LRA with a particular greedy algorithm involving a sequential updating of the polynomial coefficients along separate dimensions. Specifically, we examine the selection of optimal rank, stopping criteria in the updating of the polynomial coefficients, and error estimation. In the sequel, we compare canonical LRA with sparse PCE in structural-mechanics and heat-conduction applications based on finite-element solutions. Canonical LRA exhibit smaller errors than sparse PCE in cases when the number of available model evaluations is small with respect to the input dimension.
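
    The compression argument can be made concrete with a little counting: a full total-degree PCE basis in M inputs at degree p has C(M+p, p) terms, while a rank-R canonical decomposition stores roughly R·M univariate polynomials of degree p plus R weights. The rank R = 5 below is an arbitrary choice for illustration.

```python
from math import comb

def pce_basis_size(M, p):
    # number of multivariate polynomials of total degree <= p in M inputs
    return comb(M + p, p)

def lra_param_count(M, p, R):
    # rank-R canonical LRA: R*M univariate polynomials with p+1 coefficients
    # each, plus R normalizing weights
    return R * M * (p + 1) + R

for M in (5, 20, 50):
    print(M, pce_basis_size(M, 3), lra_param_count(M, 3, R=5))
```

    At M = 5 the two counts are comparable, but by M = 50 the full PCE basis has tens of thousands of terms while the LRA parameter count grows only linearly in M, which is the "highly compressed format" the abstract refers to.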

  17. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
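
    Of the techniques surveyed, response surface methodology is the simplest to sketch: fit a low-order polynomial to a handful of runs of the deterministic code by least squares. The "analysis code" here is an invented quadratic stand-in, so the fitted surface recovers it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))     # 30 sampled design points

def analysis_code(X):
    # deterministic stand-in for an expensive simulation run
    x1, x2 = X[:, 0], X[:, 1]
    return 1.0 + 2.0 * x1 - x2 + 0.5 * x1**2 + x1 * x2

def quadratic_design(X):
    # full second-order model: intercept, linear, pure quadratic, interaction
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

y = analysis_code(X)
beta, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
# beta now holds the response-surface coefficients of the metamodel
```

    Because the code is deterministic, the residuals here are zero; the paper's warning is precisely that classical error statistics assuming random noise are then misleading.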

  18. Meta-model of EPortfolio Usage in Different Environments

    Directory of Open Access Journals (Sweden)

    Igor Balaban

    2011-09-01

    EPortfolio offers a new philosophy of teaching and learning, giving the learner an opportunity to express oneself and to show one's past work and experience to all interested parties, ranging from teachers to potential employers. However, an integral model for ePortfolio implementation in academic institutions that would take into account three different levels of stakeholders: 1. Individual (student and teacher); 2. Institution; and 3. Employer, currently does not exist. In this paper the role of ePortfolio in the academic environment, as well as the context in which ePortfolio operates, is analyzed in detail. As a result of the comprehensive analysis that takes into account the individual, the academic institution and the employer, a meta-model of ePortfolio usage in Lifelong Learning is proposed.

  19. SPEM: Software Process Engineering Metamodel

    Directory of Open Access Journals (Sweden)

    Víctor Hugo Menéndez Domínguez

    2015-05-01

    All organizations involved in software development need to establish, manage, and support their development work. The term "software development process" tends to unify all the activities and practices that cover those needs. Modeling the software process is one way to improve development and the quality of the resulting applications. Among the existing languages for process modeling, those based on work products are the most suitable. One such language is SPEM (Software Process Engineering Metamodel). SPEM was created by the OMG (Object Management Group) as a high-level standard that is based on MOF (MetaObject Facility) and is a UML (Unified Modeling Language) metamodel; it constitutes a kind of ontology of software development processes. This article offers a general description of the SPEM standard. It also highlights the changes between version 1.1 and version 2.0, presenting both the advantages and the disadvantages found in the two versions.

  20. Hierarchical Cluster-based Partial Least Squares Regression (HC-PLSR is an efficient tool for metamodelling of nonlinear dynamic models

    Directory of Open Access Journals (Sweden)

    Omholt Stig W

    2011-06-01

    Background: Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of the mouse ventricular myocyte function. Results: Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback loops.

  1. Hierarchical cluster-based partial least squares regression (HC-PLSR) is an efficient tool for metamodelling of nonlinear dynamic models.

    Science.gov (United States)

    Tøndel, Kristin; Indahl, Ulf G; Gjuvsland, Arne B; Vik, Jon Olav; Hunter, Peter; Omholt, Stig W; Martens, Harald

    2011-06-01

    Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of the mouse ventricular myocyte function. Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback loops. 
    HC-PLSR is a promising approach for metamodelling of nonlinear dynamic models.
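
    The cluster-then-regress idea can be sketched compactly. The toy below uses hard k-means and local ordinary least squares rather than the fuzzy C-means and PLSR of the paper, but it shows the same effect: on a non-monotone response, piecewise-local linear fits beat a single global one.

```python
import numpy as np

def kmeans_1d(x, k=2, iters=25, seed=0):
    # minimal hard k-means on a 1-D feature (stand-in for fuzzy C-means)
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels

def linear_fit_sse(x, y):
    # sum of squared errors of an ordinary least-squares line
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

x = np.linspace(-2.0, 2.0, 200)
y = x ** 2                               # non-monotone "model output"
labels = kmeans_1d(x, k=2)               # splits the input space in two
sse_local = sum(linear_fit_sse(x[labels == j], y[labels == j]) for j in (0, 1))
sse_global = linear_fit_sse(x, y)
# sse_local is far smaller: each local region is nearly linear
```

    HC-PLSR layers the same intuition with soft cluster memberships and PLS components, which is what makes it robust in higher dimensions.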

  2. Statistical metamodeling for revealing synergistic antimicrobial interactions.

    Directory of Open Access Journals (Sweden)

    Hsiang Chia Chen

    2010-11-01

    Many bacterial pathogens are becoming drug resistant faster than we can develop new antimicrobials. To address this threat to public health, a metamodel antimicrobial cocktail optimization (MACO) scheme is demonstrated for rapid screening of potent antibiotic cocktails, using uropathogenic clinical isolates as model systems. With the MACO scheme, only 18 parallel trials were required to determine a potent antimicrobial cocktail out of hundreds of possible combinations. In particular, trimethoprim and gentamicin were identified to work synergistically in inhibiting bacterial growth. Sensitivity analysis indicated that gentamicin functions as a synergist for trimethoprim, reducing its minimum inhibitory concentration by 40-fold. A validation study also confirmed that the trimethoprim-gentamicin synergistic cocktail effectively inhibited the growth of multiple strains of uropathogenic clinical isolates. With its effectiveness and simplicity, the MACO scheme possesses the potential to serve as a generic platform for identifying synergistic antimicrobial cocktails for the management of bacterial infections in the future.

  3. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments

    Science.gov (United States)

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations allows the error between the simulated and the real system to be minimized.

  4. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.

    Science.gov (United States)

    Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations allows the error between the simulated and the real system to be minimized.

  5. Surprise Trips

    DEFF Research Database (Denmark)

    Korn, Matthias; Kawash, Raghid; Andersen, Lisbet Møller

    2010-01-01

    We report on a platform that augments the natural experience of exploration in diverse indoor and outdoor environments. The system builds on the theme of surprises in terms of user expectations and finding points of interest. It utilizes physical icons as representations of users' interests and as notification tokens to alert users when they are within proximity of a surprise. To evaluate the concept, we developed mock-ups and a video prototype, and conducted a wizard-of-oz user test for a national park in Denmark.

  6. Ontological Surprises

    DEFF Research Database (Denmark)

    Leahu, Lucian

    2016-01-01

    This paper investigates how we might rethink design as the technological crafting of human-machine relations in the context of a machine learning technique called neural networks. It analyzes Google's Inceptionism project, which uses neural networks for image recognition. The paper proposes a hybrid approach in which machine learning algorithms are used to identify objects as well as connections between them; finally, it argues for remaining open to ontological surprises in machine learning, as they may enable the crafting of different relations with and through technologies.

  7. Influence of partially known parameter on flaw characterization in Eddy Current Testing by using a random walk MCMC method based on metamodeling

    International Nuclear Information System (INIS)

    Cai, Caifang; Lambert, Marc; Rodet, Thomas

    2014-01-01

    First, we present the implementation of a random walk Metropolis-within-Gibbs (MWG) sampling method for flaw characterization based on a metamodeling method. The role of the metamodel is to reduce the computational cost of the Eddy Current Testing (ECT) forward-model calculation; in this way, the use of Markov Chain Monte Carlo (MCMC) methods becomes possible. Secondly, we analyze the influence of partially known parameters in Bayesian estimation. The objective is to evaluate the importance of providing more specific prior information. Simulation results show that even partially known information is of great value in providing more accurate flaw parameter estimations. The improvement ratio depends on the parameter dependence, and the benefit appears only when the provided information is specific enough.
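
    A random-walk Metropolis step over a cheap surrogate is easy to sketch. The forward model, prior bounds and noise level below are invented for illustration; the paper's actual sampler is Metropolis-within-Gibbs over a metamodel of the ECT forward model.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate(a):
    # cheap, hypothetical stand-in for the metamodel of the forward model
    return a ** 2 + 0.5 * a

obs = surrogate(1.5)            # synthetic observation of the true parameter
sigma = 0.1                     # assumed measurement noise level

def log_post(a):
    if not 0.0 < a < 3.0:       # uniform prior on (0, 3)
        return -np.inf
    return -0.5 * ((surrogate(a) - obs) / sigma) ** 2

a, lp, chain = 0.5, log_post(0.5), []
for _ in range(20000):
    prop = a + 0.2 * rng.standard_normal()     # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
        a, lp = prop, lp_prop
    chain.append(a)
posterior = np.array(chain[5000:])             # discard burn-in
# posterior.mean() concentrates near the true value 1.5
```

    Replacing each expensive forward solve with the surrogate is exactly what makes the tens of thousands of MCMC evaluations affordable.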

  8. Improved metamodel-based importance sampling for the performance assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Cadini, F.; Gioletta, A.; Zio, E.

    2015-01-01

    In the context of a probabilistic performance assessment of a radioactive waste repository, the estimation of the probability of exceeding the dose threshold set by a regulatory body is a fundamental task. This may become difficult when the probabilities involved are very small, since the classically used sampling-based Monte Carlo methods may become computationally impractical. This issue is further complicated by the fact that the computer codes typically adopted in this context require large computational efforts, both in terms of time and memory. This work proposes an original use of a Monte Carlo-based algorithm for (small) failure probability estimation in the context of the performance assessment of a near-surface radioactive waste repository. The algorithm, developed within the context of structural reliability, makes use of an estimated optimal importance density and a surrogate, kriging-based metamodel approximating the system response. On the basis of an accurate analytic analysis of the algorithm, a modification is proposed which allows a further reduction of the computational effort through a more effective training of the metamodel. - Highlights: • We tackle uncertainty propagation in a radwaste repository performance assessment. • We improve a kriging-based importance sampling for estimating failure probabilities. • We justify the modification by an analytic, comparative analysis of the algorithms. • The probability of exceeding dose thresholds in radwaste repositories is estimated. • The algorithm is further improved by reducing the number of its free parameters.
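
    The core idea of importance sampling, shifting the sampling density toward the failure region and reweighting by the likelihood ratio, can be shown on a standard-normal toy problem where the exact small probability is known. This is a generic illustration, not the paper's kriging-based algorithm.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
threshold = 4.0                 # "dose threshold" analogue in standardized units
n = 20000

# naive Monte Carlo would need ~1/p samples to see any failures;
# instead, sample from a proposal centered on the threshold
x = rng.normal(loc=threshold, scale=1.0, size=n)
weights = norm.pdf(x) / norm.pdf(x, loc=threshold)    # likelihood ratio
p_is = float(np.mean((x > threshold) * weights))

p_exact = float(norm.sf(threshold))   # exact tail probability, ~3.2e-5
```

    With 20,000 samples the importance-sampling estimate lands within a few percent of the exact value, whereas crude Monte Carlo would observe essentially no exceedances at this sample size. The paper's contribution is to build the proposal density and the (cheap) response evaluations from a kriging metamodel.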

  9. Modeling Enterprise Authorization: A Unified Metamodel and Initial Validation

    Directory of Open Access Journals (Sweden)

    Matus Korman

    2016-07-01

    Authorization and its enforcement, access control, have stood at the beginning of the art and science of information security, and remain a crucial pillar of security in information technology (IT) and enterprise operations. Dozens of different models of access control have been proposed. Although Enterprise Architecture as a discipline strives to support the management of IT, support for modeling access policies in enterprises is often lacking, both in terms of supporting the variety of individual models of access control now in use, and in terms of providing a unified ontology capable of flexibly expressing access policies for all or most of the models. This study summarizes a number of existing models of access control, proposes a unified metamodel mapped to ArchiMate, and illustrates its use on a selection of example scenarios and two business cases.

  10. Equation of state for dense nucleonic matter from metamodeling. II. Predictions for neutron star properties

    Science.gov (United States)

    Margueron, Jérôme; Hoffmann Casali, Rudiney; Gulminelli, Francesca

    2018-02-01

    Employing recently proposed metamodeling for the nucleonic matter equation of state, we analyze neutron star global properties such as masses, radii, moments of inertia, and others. The impact of the uncertainties in the empirical parameters on these global properties is analyzed in a Bayesian statistical approach. Physical constraints, such as causality and stability, are imposed on the equation of state, and different hypotheses for the direct Urca (dUrca) process are investigated. In addition, only metamodels with maximum masses above 2 M⊙ are selected. Our main results are the following: the equation of state exhibits a universal behavior against the dUrca hypothesis under the condition of charge neutrality and β equilibrium; neutron stars, if composed exclusively of nucleons and leptons, have a radius of 12.7 ±0.4 km for masses ranging from 1 up to 2 M⊙; a small radius lower than 11 km is very marginally compatible with our present knowledge of the nuclear empirical parameters; and finally, the most important empirical parameters which are still affected by large uncertainties and play an important role in determining the radius of neutron stars are the slope and curvature of the symmetry energy (Lsym and Ksym) and, to a lesser extent, the skewness parameters (Qsat/sym).

  11. Implementing a collaborative virtual environment — specification for a usability metamodel

    Directory of Open Access Journals (Sweden)

    Maria L Villegas R

    2009-01-01

    This research presents the results of the first phase of a macro-project for constructing a collaborative virtual environment. It was aimed at selecting a graphical interface from five proposed for such an environment, considering each one's level of usability. Several usability standards and user-centered design patterns were studied to specify interface measurement criteria for a usability metamodel; this defined the variables and rules to be taken into account when measuring the graphical user interface (GUI) usability level for collaborative virtual environments. The use of metaphors when specifying graphical user interfaces is also briefly examined as a source of new usability and satisfaction related to such interface use.

  12. Metamodel for Efficient Estimation of Capacity-Fade Uncertainty in Li-Ion Batteries for Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Jaewook Lee

    2015-06-01

    This paper presents an efficient method for estimating capacity-fade uncertainty in lithium-ion batteries (LIBs) in order to integrate them into the battery-management system (BMS) of electric vehicles, which requires simple and inexpensive computation for successful application. The study uses the pseudo-two-dimensional (P2D) electrochemical model, which simulates the battery state by solving a system of coupled nonlinear partial differential equations (PDEs). The model parameters that are responsible for electrode degradation are identified and estimated, based on battery data obtained from the charge cycles. The Bayesian approach, with parameters estimated by probability distributions, is employed to account for uncertainties arising in the model and battery data. The Markov Chain Monte Carlo (MCMC) technique is used to draw samples from the distributions. The complex computations that solve a PDE system for each sample are avoided by employing a polynomial-based metamodel. As a result, the computational cost is reduced from 5.5 h to a few seconds, enabling the integration of the method into the vehicle BMS. Using this approach, the conservative bound of capacity fade can be determined for the vehicle in service, which represents the safety margin reflecting the uncertainty.

  13. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge.
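
    The ensemble idea, learning weights that combine the outputs of two base predictors, can be sketched with two ridge fits of different strengths standing in for the whole-genome predictor and the GWAMA risk score. The data and both base models are invented for illustration, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [1.0, -0.5, 0.25]                 # a few moderate genetic effects
y = X @ beta + 0.1 * rng.standard_normal(n)

X_tr, y_tr = X[:100], y[:100]                # training cohort
X_va, y_va = X[100:], y[100:]                # held-out cohort

def ridge(Xm, ym, lam):
    # closed-form ridge regression coefficients
    return np.linalg.solve(Xm.T @ Xm + lam * np.eye(Xm.shape[1]), Xm.T @ ym)

pred_dense = X_va @ ridge(X_tr, y_tr, lam=1.0)     # lightly shrunk predictor
pred_shrunk = X_va @ ridge(X_tr, y_tr, lam=100.0)  # heavily shrunk predictor

P = np.column_stack([pred_dense, pred_shrunk])
w, *_ = np.linalg.lstsq(P, y_va, rcond=None)       # meta-model weights
pred_meta = P @ w

mse = lambda yhat: float(np.mean((y_va - yhat) ** 2))
```

    By construction the least-squares combination cannot do worse than either base predictor on the set it is fitted on; in practice the meta-model weights would be learned on a separate validation fold to avoid optimistic bias.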

  14. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined process assets. We show how to construct flexible SPrLs and demonstrate their practical application in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], published as an original research article in the Journal of Systems and Software.

  15. Increasing the efficiency of designing hemming processes by using an element-based metamodel approach

    Science.gov (United States)

    Kaiser, C.; Roll, K.; Volk, W.

    2017-09-01

    In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panel is initially split into individual segments. By parametrizing each segment and assigning basic geometric shapes, the outline of the part is approximated. Based on this, hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-like formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach in a planning system, an efficient optimization of designing hemming processes is enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle's development process.

  16. Metamodel-based design optimization of injection molding process variables and gates of an automotive glove box for enhancing its quality

    International Nuclear Information System (INIS)

    Kang, Gyung Ju; Park, Chang Hyun; Choi, Dong Hoon

    2016-01-01

    Injection molding process variables and gates of an automotive glove box were optimally determined to enhance its injection molding quality. We minimized warpage while satisfying constraints on clamp force, weldline, and the profiles of filling and packing. The design variables concerning the injection molding process are the temperatures of the mold and the resin, ram speeds, and packing pressures and durations; the design variables concerning the gates are the shape of the center gate and the locations of the two side gates. To optimally determine the design variables in an efficient way, we adopted metamodel-based design optimization, sequentially using an optimal Latin hypercube design as the design of experiments, Kriging models as metamodels that replace time-consuming injection molding simulations, and a micro genetic algorithm as the optimization algorithm. In the optimization process, a commercial injection molding analysis software, Moldflow™, was employed to evaluate the injection molding quality at the specified design points. Using the proposed design approach, the warpage was found to be reduced by 20.5% compared to the initial warpage, while all the design constraints were satisfied, which clearly shows the validity of the proposed design approach.
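
    The metamodel-based optimization loop described, sample a space-filling design, fit a surrogate in place of the expensive simulation, then optimize the surrogate, can be sketched generically. The "simulation" is an invented quadratic warpage function, a least-squares quadratic stands in for the Kriging model, and a grid search stands in for the micro genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulation(x):
    # stand-in for an expensive Moldflow run: warpage as a function of
    # two hypothetical, normalized process variables in [0, 1]^2
    return (x[..., 0] - 0.3) ** 2 + 2.0 * (x[..., 1] - 0.7) ** 2

# 1) space-filling design (simple random stand-in for an optimal LHD)
X = rng.random((40, 2))
y = simulation(X)

# 2) fit a quadratic surrogate by least squares (stand-in for Kriging)
def features(X):
    x1, x2 = X[..., 0], X[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2], axis=-1)

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# 3) optimize the cheap surrogate (grid search stand-in for the micro GA)
g = np.linspace(0.0, 1.0, 101)
grid = np.stack(np.meshgrid(g, g, indexing="ij"), axis=-1).reshape(-1, 2)
best = grid[np.argmin(features(grid) @ beta)]
# best lands at the simulated warpage minimum, (0.3, 0.7)
```

    The payoff is the same as in the paper: after 40 expensive runs, every subsequent optimization query costs only a surrogate evaluation.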

  17. Metamodel-based design optimization of injection molding process variables and gates of an automotive glove box for enhancing its quality

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Gyung Ju [Pusan National University, Busan (Korea, Republic of); Park, Chang Hyun; Choi, Dong Hoon [Hanyang University, Seoul (Korea, Republic of)

    2016-04-15

    Injection molding process variables and gates of an automotive glove box were optimally determined to enhance its injection molding quality. We minimized warpage while satisfying constraints on clamp force, weldline, and the filling and packing profiles. Design variables concerning the injection molding process are the temperatures of the mold and the resin, ram speeds, and packing pressures and durations; design variables concerning the gates are the shape of the center gate and the locations of the two side gates. To determine the design variables efficiently, we adopted metamodel-based design optimization, sequentially using an optimal Latin hypercube design as the design of experiments, Kriging models as metamodels that replace time-consuming injection molding simulations, and a micro genetic algorithm as the optimization algorithm. In the optimization process, a commercial injection molding analysis package, Moldflow, was employed to evaluate the injection molding quality at the specified design points. Using the proposed design approach, the warpage was reduced by 20.5% compared to the initial warpage while all design constraints were satisfied, which clearly shows the validity of the proposed design approach.
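The optimization loop described above (design of experiments, then a Kriging surrogate, then an evolutionary search on the surrogate) can be sketched in a few lines. Everything below is a hypothetical stand-in: `simulate_warpage` replaces the expensive Moldflow evaluations, the Kriging model uses fixed hyperparameters rather than fitted ones, and a tiny mutation-based loop stands in for the micro genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_warpage(x):
    # Hypothetical stand-in for one expensive injection-molding simulation;
    # x = [mold temperature, packing pressure], both scaled to [0, 1].
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2 + 0.05 * np.sin(8 * x[0])

def latin_hypercube(n, d):
    # One stratified sample per row and column (an *optimal* LHS would
    # additionally maximize point spread).
    cut = (np.arange(n) + rng.random(n)) / n
    return np.column_stack([rng.permutation(cut) for _ in range(d)])

def kriging_fit(X, y, length=0.3, nugget=1e-6):
    # Simple kriging: GP regression with a fixed Gaussian kernel.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * length ** 2)) + nugget * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    return lambda x: np.exp(-((X - x) ** 2).sum(-1) / (2 * length ** 2)) @ alpha

# 1) DOE, 2) run the "simulator" at the design points, 3) fit the metamodel.
X = latin_hypercube(30, 2)
y = np.array([simulate_warpage(x) for x in X])
predict = kriging_fit(X, y)

# 4) Minimize the cheap metamodel with a small evolutionary loop.
pop = rng.random((20, 2))
for _ in range(40):
    scores = np.array([predict(x) for x in pop])
    parents = pop[np.argsort(scores)[:5]]            # keep the 5 best
    children = parents[rng.integers(0, 5, 20)] + rng.normal(0, 0.05, (20, 2))
    pop = np.clip(children, 0, 1)

best = min(pop, key=predict)
print(best, simulate_warpage(best))
```

Only the 30 DOE points touch the expensive simulator; the evolutionary loop's hundreds of evaluations all hit the metamodel, which is the point of the approach.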

  18. Calculations of Sobol indices for the Gaussian process metamodel

    Energy Technology Data Exchange (ETDEWEB)

    Marrel, Amandine [CEA, DEN, DTN/SMTM/LMTE, F-13108 Saint Paul lez Durance (France)], E-mail: amandine.marrel@cea.fr; Iooss, Bertrand [CEA, DEN, DER/SESI/LCFR, F-13108 Saint Paul lez Durance (France); Laurent, Beatrice [Institut de Mathematiques, Universite de Toulouse (UMR 5219) (France); Roustant, Olivier [Ecole des Mines de Saint-Etienne (France)

    2009-03-15

    Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations, which is often unacceptable for computationally expensive codes. A well-known and widely used remedy is to replace the computer code by a metamodel that predicts the model responses with negligible computation time and renders the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions for the Sobol indices. Two approaches to computing the Sobol indices are studied: the first is based on the predictor of the Gaussian process model and the second on the global stochastic process model. Comparisons between the two estimates on analytical examples show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach accounts for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.

  19. Calculations of Sobol indices for the Gaussian process metamodel

    International Nuclear Information System (INIS)

    Marrel, Amandine; Iooss, Bertrand; Laurent, Beatrice; Roustant, Olivier

    2009-01-01

    Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations, which is often unacceptable for computationally expensive codes. A well-known and widely used remedy is to replace the computer code by a metamodel that predicts the model responses with negligible computation time and renders the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions for the Sobol indices. Two approaches to computing the Sobol indices are studied: the first is based on the predictor of the Gaussian process model and the second on the global stochastic process model. Comparisons between the two estimates on analytical examples show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach accounts for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
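The first of the two approaches above, estimating Sobol indices from a metamodel's predictor, can be sketched with a Saltelli-type pick-and-freeze Monte Carlo estimator. The `predictor` below is a hypothetical stand-in (an Ishigami-type test function with a known variance decomposition), not the authors' Gaussian process model, and the plain Monte Carlo estimator replaces their analytical expressions.

```python
import numpy as np

rng = np.random.default_rng(1)

def predictor(X):
    # Hypothetical stand-in for a metamodel predictor: an Ishigami-type
    # function whose first-order Sobol indices are known analytically
    # (S1 ~ 0.314, S2 ~ 0.442, S3 = 0).
    a, b = 7.0, 0.1
    return (np.sin(X[:, 0]) + a * np.sin(X[:, 1]) ** 2
            + b * X[:, 2] ** 4 * np.sin(X[:, 0]))

def first_order_sobol(f, d, n=200_000, lo=-np.pi, hi=np.pi):
    # Saltelli-type estimator: S_i = E[f(B) * (f(AB_i) - f(A))] / Var[f],
    # where AB_i is the matrix A with its i-th column taken from B.
    A = rng.uniform(lo, hi, (n, d))
    B = rng.uniform(lo, hi, (n, d))
    fA, fB = f(A), f(B)
    var = fA.var()
    S = []
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        S.append(np.mean(fB * (f(ABi) - fA)) / var)
    return np.array(S)

S = first_order_sobol(predictor, 3)
print(S.round(3))    # roughly [0.314, 0.442, 0.0]
```

Because each call to `predictor` is a cheap metamodel evaluation, the d + 2 sample matrices required by this scheme cost almost nothing, which is exactly why the metamodel substitution makes Sobol estimation tractable.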

  20. Mono and multi-objective optimization techniques applied to a large range of industrial test cases using Metamodel assisted Evolutionary Algorithms

    Science.gov (United States)

    Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre

    2010-06-01

    The use of numerical simulation of material processing allows a strategy of trial and error to improve virtual processes without incurring material costs or interrupting production, and therefore saves a lot of money; however, it requires user time to analyze the results, adjust the operating conditions and restart the simulation. Automatic optimization is the perfect complement to simulation. An Evolutionary Algorithm coupled with metamodelling makes it possible to obtain industrially relevant results on a very large range of applications within a few tens of simulations and without any specific knowledge of automatic optimization techniques. Ten industrial partners were selected to cover the different areas of the mechanical forging industry and to provide different examples for the forming simulation tools, the aim being to demonstrate this capability in practice. The large computational time is handled by a metamodel approach, which interpolates the objective function over the entire parameter space from the exact function values at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel, and a genetic algorithm combined with a Meshless Finite Difference Method. The latter approach is extended to multi-objective optimization, where the set of solutions corresponding to the best possible compromises between the different objectives is computed in the same way. The population-based approach exploits the parallel capabilities of the computer with high efficiency. An optimization module, fully embedded within the Forge2009 IHM, makes it possible to cover all the defined examples, and the use of new multi-core hardware to run several simulations at the same time dramatically reduces the time needed. The presented examples

  1. Dynamic sensitivity analysis of long running landslide models through basis set expansion and meta-modelling

    Science.gov (United States)

    Rohmer, Jeremy

    2016-04-01

    Predicting the temporal evolution of landslides is typically supported by numerical modelling. Dynamic sensitivity analysis aims at assessing the influence of the landslide properties on the time-dependent predictions (e.g., time series of landslide displacements). Yet two major difficulties arise: 1. Global sensitivity analysis requires running the landslide model a large number of times (> 1000), which may become impracticable when the landslide model has a high computational cost (> several hours); 2. Landslide model outputs are not scalar, but functions of time, i.e. they are n-dimensional vectors with n usually ranging from 100 to 1000. In this article, I explore the use of a basis set expansion, such as principal component analysis, to reduce the output dimensionality to a few components, each of them being interpreted as a dominant mode of variation in the overall structure of the temporal evolution. The computationally intensive calculation of the Sobol' indices for each of these components is then achieved through meta-modelling, i.e. by replacing the landslide model by a "costless-to-evaluate" approximation (e.g., a projection pursuit regression model). The methodology combining "basis set expansion - meta-model - Sobol' indices" is then applied to the La Frasse landslide to investigate the dynamic sensitivity of the surface horizontal displacements to the slip surface properties during pore pressure changes. I show how to extract information on the sensitivity of each main mode of temporal behaviour using a limited number (a few tens) of long running simulations. In particular, I identify the parameters which trigger the occurrence of a turning point marking a shift between a regime of low values of landslide displacements and one of high values.
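The "basis set expansion, then one metamodel per mode" step above can be sketched as follows. The two-parameter `toy_displacement` function is a hypothetical stand-in for one long-running landslide simulation, not the La Frasse model; the point is that after the PCA step each run is summarized by a few scalar mode scores, to which standard metamodel-based Sobol analysis applies.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 200)                 # 200 time steps per model run

def toy_displacement(friction, pressure):
    # Hypothetical stand-in for one long-running simulation: displacement
    # grows faster for low friction / high pore pressure.
    rate = np.exp(3 * pressure - 4 * friction)
    return rate * t + 0.2 * pressure * np.sin(4 * np.pi * t)

# Small ensemble of "long running" simulations (a few tens of runs).
params = rng.random((40, 2))
runs = np.array([toy_displacement(f, p) for f, p in params])

# Basis set expansion: centre the curves and keep the leading PCA modes
# that explain 99% of the variance across the ensemble.
mean = runs.mean(0)
U, s, Vt = np.linalg.svd(runs - mean, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
k = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)
scores = (runs - mean) @ Vt[:k].T          # one scalar score per mode per run

# Each column of `scores` is now a scalar output: the Sobol' analysis can be
# run per dominant mode instead of per time step.
print(k, explained[:k].round(3))
```

Reducing 200 correlated time steps to `k` mode scores is what makes the subsequent metamodelling and Sobol' index estimation tractable for functional outputs.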

  2. AMFIBIA: A Meta-Model for the Integration of Business Process Modelling Aspects

    DEFF Research Database (Denmark)

    Axenath, Björn; Kindler, Ekkart; Rubin, Vladimir

    2007-01-01

    AMFIBIA is a meta-model that formalises the essential aspects and concepts of business processes. Though AMFIBIA is not the first approach to formalising the aspects and concepts of business processes, it is more ambitious in the following respects: Firstly, it is independent from particular...... modelling formalisms of business processes and it is designed in such a way that any formalism for modelling some aspect of a business process can be plugged into AMFIBIA. Therefore, AMFIBIA is formalism-independent. Secondly, it is not biased toward any aspect of business processes; the different aspects...... can be considered and modelled independently of each other. Moreover, AMFIBIA is not restricted to a fixed set of aspects; new aspects of business processes can be easily integrated. Thirdly, AMFIBIA does not only name and relate the concepts of business process modelling, as it is typically done...

  3. Uncertainty in Bus Arrival Time Predictions: Treating Heteroscedasticity With a Metamodel Approach

    DEFF Research Database (Denmark)

    O'Sullivan, Aidan; Pereira, Francisco Camara; Zhao, Jinhua

    2016-01-01

    Arrival time predictions for the next available bus or train are a key component of modern traveler information systems (TISs). A great deal of research has been conducted within the intelligent transportation system community in developing an assortment of different algorithms that seek...... sources. In this paper, we tackle the issue of uncertainty in bus arrival time predictions using an alternative approach. Rather than endeavor to develop a superior method for prediction, we take existing predictions from a TIS and treat the algorithm generating them as a black box. The presence...... of heteroscedasticity in the predictions is demonstrated and then a metamodel approach is deployed, which augments existing predictive systems using quantile regression to place bounds on the associated error. As a case study, this approach is applied to data from a real-world TIS in Boston. This method allows bounds...
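The black-box idea above, leaving the TIS predictor untouched and modelling only how its error spread grows with the prediction horizon, can be sketched on synthetic data. Everything here is hypothetical (the heteroscedastic error model, the numbers, the horizon range; none of it is the Boston data), and binned empirical quantiles stand in for the paper's quantile regression.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical black-box TIS output: prediction horizon in minutes, with a
# residual error whose spread grows with the horizon (heteroscedasticity).
horizon = rng.uniform(1, 30, 5000)
error = rng.normal(0, 0.2 + 0.05 * horizon)

# Metamodel of the error only: empirical 5%/95% bounds per horizon bin,
# treating the predictor itself as a black box.
bins = np.linspace(1, 30, 11)
idx = np.digitize(horizon, bins) - 1
lo = np.array([np.quantile(error[idx == b], 0.05) for b in range(10)])
hi = np.array([np.quantile(error[idx == b], 0.95) for b in range(10)])

# In-sample coverage of the 90% band, and the widening of the band.
coverage = np.mean((error >= lo[idx]) & (error <= hi[idx]))
print(coverage)                    # close to 0.90 by construction
print((hi - lo)[0], (hi - lo)[9])  # band is wider at long horizons
```

A quantile regression (as in the paper) would replace the per-bin quantiles with a smooth function of the horizon, but the augmentation principle is the same: bounds on the error are layered on top of an existing prediction system without modifying it.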

  4. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.

  5. Some Surprising Introductory Physics Facts and Numbers

    Science.gov (United States)

    Mallmann, A. James

    2016-01-01

    In the entertainment world, people usually like, and find memorable, novels, short stories, and movies with surprise endings. This suggests that classroom teachers might want to present to their students examples of surprising facts associated with principles of physics. Possible benefits of finding surprising facts about principles of physics are…

  6. Climate Change as a Predictable Surprise

    International Nuclear Information System (INIS)

    Bazerman, M.H.

    2006-01-01

    In this article, I analyze climate change as a 'predictable surprise', an event that leads an organization or nation to react with surprise, despite the fact that the information necessary to anticipate the event and its consequences was available (Bazerman and Watkins, 2004). I then assess the cognitive, organizational, and political reasons why society fails to implement wise strategies to prevent predictable surprises generally and climate change specifically. Finally, I conclude with an outline of a set of response strategies to overcome barriers to change

  7. Corrugator Activity Confirms Immediate Negative Affect in Surprise

    Directory of Open Access Journals (Sweden)

    Sascha eTopolinski

    2015-02-01

    Full Text Available The emotion of surprise entails a complex of immediate responses, such as cognitive interruption, attention allocation to, and more systematic processing of the surprising stimulus. All these processes serve the ultimate function of increasing processing depth and thus cognitively mastering the surprising stimulus. The present account introduces phasic negative affect as the underlying mechanism responsible for these consequences. Surprising stimuli are schema-discrepant and thus entail cognitive disfluency, which elicits immediate negative affect. This affect in turn works like a phasic cognitive tuning, switching the current processing mode from more automatic and heuristic to more systematic and reflective processing. Directly testing the initial elicitation of negative affect by surprising events, the present experiment presented highly and less surprising neutral trivia statements to N = 28 participants while assessing their spontaneous facial expressions via facial electromyography. Highly surprising trivia, compared to less surprising trivia, elicited higher corrugator activity, indicative of negative affect and mental effort, while leaving zygomaticus (positive affect) and frontalis (cultural surprise expression) activity unaffected. Future research shall investigate the mediating role of negative affect in eliciting surprise-related outcomes.

  8. The Influence of Negative Surprise on Hedonic Adaptation

    Directory of Open Access Journals (Sweden)

    Ana Paula Kieling

    2016-01-01

    Full Text Available After some time using a product or service, the consumer tends to feel less pleasure with consumption. This reduction of pleasure is known as hedonic adaptation. One of the emotions that interferes in this process is surprise. Based on two experiments, we suggest that negative surprise, unlike positive surprise, affects the level of pleasure foreseen and experienced by the consumer. Study 1 analyzes the influence of negative (vs. positive) surprise on the consumer's post-purchase expectation of hedonic adaptation. Results showed that negative surprise influences the intensity of adaptation, augmenting its strength. Study 2 verifies the influence of negative (vs. positive) surprise on hedonic adaptation itself. The findings suggest that negative surprise makes adaptation happen more intensely and faster over time, which has consequences for companies and consumers in the post-purchase process, such as satisfaction and loyalty.

  9. Surprise: a belief or an emotion?

    Science.gov (United States)

    Mellers, Barbara; Fincher, Katrina; Drummond, Caitlin; Bigony, Michelle

    2013-01-01

    Surprise is a fundamental link between cognition and emotion. It is shaped by cognitive assessments of likelihood, intuition, and superstition, and it in turn shapes hedonic experiences. We examine this connection between cognition and emotion and offer an explanation called decision affect theory. Our theory predicts the affective consequences of mistaken beliefs, such as overconfidence and hindsight. It provides insight about why the pleasure of a gain can loom larger than the pain of a comparable loss. Finally, it explains cross-cultural differences in emotional reactions to surprising events. By changing the nature of the unexpected (from chance to good luck), one can alter the emotional reaction to surprising events. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Exploring the concept of climate surprises. A review of the literature on the concept of surprise and how it is related to climate change

    International Nuclear Information System (INIS)

    Glantz, M.H.; Moore, C.M.; Streets, D.G.; Bhatti, N.; Rosa, C.H.

    1998-01-01

    This report examines the concept of climate surprise and its implications for environmental policymaking. Although most integrated assessment models of climate change deal with average values of change, it is usually the extreme events or surprises that cause the most damage to human health and property. Current models do not help the policymaker decide how to deal with climate surprises. This report examines the literature of surprise in many aspects of human society: psychology, military, health care, humor, agriculture, etc. It draws together various ways to consider the concept of surprise and examines different taxonomies of surprise that have been proposed. In many ways, surprise is revealed to be a subjective concept, triggered by such factors as prior experience, belief system, and level of education. How policymakers have reacted to specific instances of climate change or climate surprise in the past is considered, particularly with regard to the choices they made between proactive and reactive measures. Finally, the report discusses techniques used in the current generation of assessment models and makes suggestions as to how climate surprises might be included in future models. The report concludes that some kinds of surprises are simply unpredictable, but there are several types that could in some way be anticipated and assessed, and their negative effects forestalled

  11. Exploring the concept of climate surprises. A review of the literature on the concept of surprise and how it is related to climate change

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, M.H.; Moore, C.M. [National Center for Atmospheric Research, Boulder, CO (United States); Streets, D.G.; Bhatti, N.; Rosa, C.H. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.; Stewart, T.R. [State Univ. of New York, Albany, NY (United States)

    1998-01-01

    This report examines the concept of climate surprise and its implications for environmental policymaking. Although most integrated assessment models of climate change deal with average values of change, it is usually the extreme events or surprises that cause the most damage to human health and property. Current models do not help the policymaker decide how to deal with climate surprises. This report examines the literature of surprise in many aspects of human society: psychology, military, health care, humor, agriculture, etc. It draws together various ways to consider the concept of surprise and examines different taxonomies of surprise that have been proposed. In many ways, surprise is revealed to be a subjective concept, triggered by such factors as prior experience, belief system, and level of education. How policymakers have reacted to specific instances of climate change or climate surprise in the past is considered, particularly with regard to the choices they made between proactive and reactive measures. Finally, the report discusses techniques used in the current generation of assessment models and makes suggestions as to how climate surprises might be included in future models. The report concludes that some kinds of surprises are simply unpredictable, but there are several types that could in some way be anticipated and assessed, and their negative effects forestalled.

  12. The role of surprise in satisfaction judgements

    NARCIS (Netherlands)

    Vanhamme, J.; Snelders, H.M.J.J.

    2001-01-01

    Empirical findings suggest that surprise plays an important role in consumer satisfaction, but there is a lack of theory to explain why this is so. The present paper provides explanations for the process through which positive (negative) surprise might enhance (reduce) consumer satisfaction. First,

  13. A Dichotomic Analysis of the Surprise Examination Paradox

    OpenAIRE

    Franceschi, Paul

    2002-01-01

    This paper presents a dichotomic analysis of the surprise examination paradox. In section 1, I analyse the surprise notion in detail. I introduce then in section 2, the distinction between a monist and dichotomic analysis of the paradox. I also present there a dichotomy leading to distinguish two basically and structurally different versions of the paradox, respectively based on a conjoint and a disjoint definition of the surprise. In section 3, I describe the solution to SEP corresponding to...

  14. Exploration, Novelty, Surprise and Free Energy Minimisation

    Directory of Open Access Journals (Sweden)

    Philipp eSchwartenbeck

    2013-10-01

    Full Text Available This paper reviews recent developments under the free energy principle that introduce a normative perspective on classical economic (utilitarian) decision-making based on (active) Bayesian inference. It has been suggested that the free energy principle precludes novelty and complexity, because it assumes that biological systems, like ourselves, try to minimise the long-term average of surprise to maintain their homeostasis. However, recent formulations show that minimising surprise leads naturally to concepts such as exploration and novelty bonuses. In this approach, agents infer a policy that minimises surprise by minimising the difference (or relative entropy) between likely and desired outcomes, which involves both pursuing the goal-state that has the highest expected utility (often termed 'exploitation') and visiting a number of different goal-states ('exploration'). Crucially, the opportunity to visit new states increases the value of the current state. Casting decision-making problems within a variational framework therefore predicts that our behaviour is governed by both the entropy and the expected utility of future states. This dissolves any dialectic between minimising surprise and exploration or novelty seeking.

  15. Metamodeling and optimization of the THF process with pulsating pressure

    Science.gov (United States)

    Bucconi, Marco; Strano, Matteo

    2018-05-01

    Tube hydroforming is a process used in various applications to form a tube into a desired complex shape by combining internal pressure, which provides the stress required to yield the material, and axial feeding, which helps the material flow towards the bulging zone. Many studies have demonstrated how wrinkling and bursting defects can be severely reduced by means of a pulsating pressure, and how the so-called hammering hydroforming enhances the formability of the material. The definition of the optimum pressure and axial feeding profiles represents a daunting challenge in the design phase of the hydroforming operation for a new part. The quality of the formed part is highly dependent on the amplitude and peak value of the pulsating pressure, along with the axial stroke. This paper reports research, conducted by means of explicit finite element simulations of a hammering THF operation and metamodeling techniques, aimed at optimizing the process parameters for the production of a complex part. The improved formability is explored for different factors, and an optimization strategy is used to determine the most convenient pressure and axial feed profile curves for the hammering THF process of the examined part. It is shown how the pulsating pressure allows the energy input of the process to be minimized while still respecting the final quality requirements.

  16. A Meta-Model of Inter-Organisational Cooperation for the Transition to a Circular Economy

    Directory of Open Access Journals (Sweden)

    Alessandro Ruggieri

    2016-11-01

    Full Text Available The transition to a circular economy bodes well for a future of environmentally sustainable growth and economic development. The implications and advantages of a shift to a circular economy have been extensively demonstrated by the literature on the subject. What has not been sufficiently investigated is how this paradigm can be enabled through inter-organisational cooperation among different business enterprises. To illustrate this point, in this paper we aim to contribute to the circular economy debate by describing and discussing a meta-model of inter-organisational cooperation. The study is based on the analysis of three cases from three different industries, from which we identified factors with a potential impact on the stimulation of cooperation in a circular economy perspective. Last, but not least, we discuss the relations between the case studies and try to formulate possible implications for both managers and researchers.

  17. A Contrast-Based Computational Model of Surprise and Its Applications.

    Science.gov (United States)

    Macedo, Luis; Cardoso, Amílcar

    2017-11-19

    We review our work on a contrast-based computational model of surprise and its applications. The review is contextualized within related research from psychology, philosophy, and particularly artificial intelligence. Influenced by psychological theories of surprise, the model assumes that surprise-eliciting events initiate a series of cognitive processes that begin with the appraisal of the event as unexpected, continue with the interruption of ongoing activity and the focusing of attention on the unexpected event, and culminate in the analysis and evaluation of the event and the revision of beliefs. It is assumed that the intensity of surprise elicited by an event is a nonlinear function of the difference or contrast between the subjective probability of the event and that of the most probable alternative event (which is usually the expected event); and that the agent's behavior is partly controlled by actual and anticipated surprise. We describe applications of artificial agents that incorporate the proposed surprise model in three domains: the exploration of unknown environments, creativity, and intelligent transportation systems. These applications demonstrate the importance of surprise for decision making, active learning, creative reasoning, and selective attention. Copyright © 2017 Cognitive Science Society, Inc.
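The contrast assumption above, surprise intensity as a nonlinear function of the gap between the probability of the most expected event and that of the observed event, can be sketched in a few lines. The logarithmic form log2(1 + p_max - p_event) used here is only one plausible instantiation of such a nonlinear contrast function, and the forecast example is invented.

```python
import math

def surprise(probs, event):
    # Contrast-based surprise (a sketch, not the authors' exact model):
    # intensity grows nonlinearly with the gap between the probability of
    # the most expected alternative and that of the observed event.
    p_max = max(probs.values())
    return math.log2(1 + p_max - probs[event])

# Invented probability forecast over mutually exclusive outcomes.
forecast = {"on_time": 0.7, "minor_delay": 0.25, "cancelled": 0.05}
print(surprise(forecast, "on_time"))     # 0.0: the expected event
print(surprise(forecast, "cancelled"))   # largest contrast, most surprising
```

The key property the abstract describes is visible here: the expected event yields zero surprise regardless of its absolute probability, while less probable alternatives are scored by their contrast with it rather than by their probability alone.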

  18. The Role of Surprise in Game-Based Learning for Mathematics

    NARCIS (Netherlands)

    Wouters, Pieter; van Oostendorp, Herre; ter Vrugte, Judith; Vandercruysse, Sylke; de Jong, Anthonius J.M.; Elen, Jan; De Gloria, Alessandro; Veltkamp, Remco

    2016-01-01

    In this paper we investigate the potential of surprise on learning with prevocational students in the domain of proportional reasoning. Surprise involves an emotional reaction, but it also serves a cognitive goal as it directs attention to explain why the surprising event occurred and to learn for

  19. A Neural Mechanism for Surprise-related Interruptions of Visuospatial Working Memory.

    Science.gov (United States)

    Wessel, Jan R

    2018-01-01

    Surprising perceptual events recruit a fronto-basal ganglia mechanism for inhibition, which suppresses motor activity following surprise. A recent study found that this inhibitory mechanism also disrupts the maintenance of verbal working memory (WM) after surprising tones. However, it is unclear whether this same mechanism also relates to surprise-related interruptions of non-verbal WM. We tested this hypothesis using a change-detection task, in which surprising tones impaired visuospatial WM. Participants also performed a stop-signal task (SST). We used independent component analysis and single-trial scalp-electroencephalogram to test whether the same inhibitory mechanism that reflects motor inhibition in the SST relates to surprise-related visuospatial WM decrements, as was the case for verbal WM. As expected, surprising tones elicited activity of the inhibitory mechanism, and this activity correlated strongly with the trial-by-trial level of surprise. However, unlike for verbal WM, the activity of this mechanism was unrelated to visuospatial WM accuracy. Instead, inhibition-independent activity that immediately succeeded the inhibitory mechanism was increased when visuospatial WM was disrupted. This shows that surprise-related interruptions of visuospatial WM are not effected by the same inhibitory mechanism that interrupts verbal WM, and instead provides evidence for a 2-stage model of distraction. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Human amygdala response to dynamic facial expressions of positive and negative surprise.

    Science.gov (United States)

    Vrticka, Pascal; Lordier, Lara; Bediou, Benoît; Sander, David

    2014-02-01

    Although brain imaging evidence accumulates to suggest that the amygdala plays a key role in the processing of novel stimuli, little is known about its role in processing expressed novelty conveyed by surprised faces, and even less about possible interactive encoding of novelty and valence. Those investigations that have already probed human amygdala involvement in the processing of surprised facial expressions either used static pictures displaying negative surprise (as contained in fear) or "neutral" surprise, and manipulated valence by contextually priming or subjectively associating static surprise with either negative or positive information. Therefore, it remains unresolved how the human amygdala differentially processes dynamic surprised facial expressions displaying either positive or negative surprise. Here, we created new artificial dynamic 3-dimensional facial expressions conveying surprise with an intrinsic positive (wonderment) or negative (fear) connotation, but also intrinsic positive (joy) or negative (anxiety) emotions not containing any surprise, in addition to neutral facial displays either containing ("typical surprise" expression) or not containing ("neutral") surprise. Results showed heightened amygdala activity to faces containing positive (vs. negative) surprise, which may either correspond to a specific wonderment effect as such, or to the computation of a negative expected value prediction error. Findings are discussed in the light of data obtained from a closely matched nonsocial lottery task, which revealed overlapping activity within the left amygdala to unexpected positive outcomes. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  1. Data for developing metamodels to assess the fate, transport, and bioaccumulation of organic chemicals in rivers. Chemicals have log Kow ranging from 3 to 14, and rivers have mean annual discharges ranging from 1.09 to 3240 m3/s.

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset was developed to demonstrate how metamodels of high resolution, process-based models that simulate the fate, transport, and bioaccumulation of organic...

  2. The Value of Surprising Findings for Research on Marketing

    OpenAIRE

    JS Armstrong

    2004-01-01

    In the work of Armstrong (Journal of Business Research, 2002), I examined empirical research on the scientific process and related the findings to marketing science. The findings of some studies were surprising. In this reply, I address surprising findings and other issues raised by commentators.

  3. An efficient community detection algorithm using greedy surprise maximization

    International Nuclear Information System (INIS)

    Jiang, Yawen; Jia, Caiyan; Yu, Jian

    2014-01-01

    Community detection is an important and crucial problem in complex network analysis. Although classical modularity function optimization approaches are widely used for identifying communities, the modularity function (Q) suffers from its resolution limit. Recently, the surprise function (S) was experimentally proved to be better than the Q function. However, up until now, there has been no algorithm available to perform searches to directly determine the maximal surprise values. In this paper, considering the superiority of the S function over the Q function, we propose an efficient community detection algorithm called AGSO (algorithm based on greedy surprise optimization) and its improved version FAGSO (fast-AGSO), which are based on greedy surprise optimization and do not suffer from the resolution limit. In addition, (F)AGSO does not need the number of communities K to be specified in advance. Tests on experimental networks show that (F)AGSO is able to detect optimal partitions in both simple and even more complex networks. Moreover, algorithms based on surprise maximization perform better than those algorithms based on modularity maximization, including Blondel–Guillaume–Lambiotte–Lefebvre (BGLL), Clauset–Newman–Moore (CNM) and the other state-of-the-art algorithms such as Infomap, order statistics local optimization method (OSLOM) and label propagation algorithm (LPA). (paper)
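The S function itself is not reproduced in the abstract. A minimal sketch, assuming the standard cumulative-hypergeometric definition of surprise (due to Aldecoa and Marin), which scores how improbable it is to draw at least the observed number of intra-community links by chance:

```python
from fractions import Fraction
from math import comb, log

def surprise(n_nodes, edges, communities):
    """Surprise S of a partition (assumed hypergeometric definition):
    -ln P(X >= p), where X counts intra-community links among n links
    drawn from F possible node pairs, M of which lie inside communities."""
    F = comb(n_nodes, 2)                              # possible links
    M = sum(comb(len(c), 2) for c in communities)     # possible intra-links
    n = len(edges)                                    # observed links
    comm = {v: i for i, c in enumerate(communities) for v in c}
    p = sum(comm[a] == comm[b] for a, b in edges)     # observed intra-links
    prob = sum(Fraction(comb(M, j) * comb(F - M, n - j), comb(F, n))
               for j in range(p, min(M, n) + 1))
    return -log(prob)

# two triangles joined by a single bridge edge
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
good = [[0, 1, 2], [3, 4, 5]]   # the natural partition
bad = [[0, 1, 3], [2, 4, 5]]    # a mixed-up partition
```

A greedy optimizer in the spirit of (F)AGSO would merge or move nodes as long as this score keeps increasing; on the toy graph above, the natural partition scores a higher surprise than the mixed-up one.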

  4. Radar Design to Protect Against Surprise

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-02-01

    Technological and doctrinal surprise is about rendering preparations for conflict as irrelevant or ineffective. For a sensor, this means essentially rendering the sensor as irrelevant or ineffective in its ability to help determine truth. Recovery from this sort of surprise is facilitated by flexibility in our own technology and doctrine. For a sensor, this means flexibility in its architecture, design, tactics, and the designing organizations' processes. Acknowledgements: This report is the result of an unfunded research and development activity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  5. Surprise as a design strategy

    NARCIS (Netherlands)

    Ludden, G.D.S.; Schifferstein, H.N.J.; Hekkert, P.P.M.

    2008-01-01

    Imagine yourself queuing for the cashier’s desk in a supermarket. Naturally, you have picked the wrong line, the one that does not seem to move at all. Soon, you get tired of waiting. Now, how would you feel if the cashier suddenly started to sing? Many of us would be surprised and, regardless of

  6. Surprisal analysis and probability matrices for rotational energy transfer

    International Nuclear Information System (INIS)

    Levine, R.D.; Bernstein, R.B.; Kahana, P.; Procaccia, I.; Upchurch, E.T.

    1976-01-01

    The information-theoretic approach is applied to the analysis of state-to-state rotational energy transfer cross sections. The rotational surprisal is evaluated in the usual way, in terms of the deviance of the cross sections from their reference ("prior") values. The surprisal is found to be an essentially linear function of the energy transferred. This behavior accounts for the experimentally observed exponential gap law for the hydrogen halide systems. The data base here analyzed (taken from the literature) is largely computational in origin: quantal calculations for the hydrogenic systems H2+H, He, Li+; HD+He; D2+H and for the N2+Ar system; and classical trajectory results for H2+Li+; D2+Li+ and N2+Ar. The surprisal analysis not only serves to compact a large body of data but also aids in the interpretation of the results. A single surprisal parameter θ_R suffices to account for the (relative) magnitude of all state-to-state inelastic cross sections at a given energy
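Stated schematically (notation assumed here, following the standard information-theoretic treatment; the exact reduced-energy variable scaling |ΔE| varies between treatments), the surprisal is the negative log deviance from the prior cross sections, and its observed linearity in the energy transferred implies the exponential gap law:

```latex
I(j \to j') = -\ln\!\left[\frac{\sigma(j \to j')}{\sigma^{0}(j \to j')}\right],
\qquad
I \approx \lambda_{0} + \theta_{R}\,|\Delta E|
\;\;\Longrightarrow\;\;
\sigma(j \to j') \propto \sigma^{0}(j \to j')\, e^{-\theta_{R}\,|\Delta E|}
```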

  7. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm, on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has been previously made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then, by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this
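The two initial-design choices compared in the abstract differ only in where each sample lands within its stratum. A minimal sketch (the function name and interface are illustrative, not from the paper's code):

```python
import numpy as np

def lhs(n, d, midpoint=False, seed=None):
    """Latin hypercube sample of n points in [0,1)^d.

    midpoint=False -> 'random LHS': a uniform random point in each stratum.
    midpoint=True  -> 'midpoint LHS': the center of each stratum.
    """
    rng = np.random.default_rng(seed)
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)                 # one stratum per point, per dim
        offset = 0.5 if midpoint else rng.random(n)
        samples[:, j] = (perm + offset) / n       # place point within its stratum
    return samples
```

Either design would then be fed to a space-filling optimizer; the study's finding is that starting the optimization from the midpoint variant yields better space-filling designs.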

  8. Surprising Incentive: An Instrument for Promoting Safety Performance of Construction Employees

    Directory of Open Access Journals (Sweden)

    Fakhradin Ghasemi

    2015-09-01

    Conclusion: The results of this study showed that the surprising incentive improved the employees' safety performance only in the short term, because the surprise value of the incentives dwindles over time. For this reason, and to maintain the surprise value of the incentive system, the amount and types of incentives need to be evaluated and modified annually or biannually.

  9. Dividend announcements reconsidered: Dividend changes versus dividend surprises

    OpenAIRE

    Andres, Christian; Betzer, André; van den Bongard, Inga; Haesner, Christian; Theissen, Erik

    2012-01-01

    This paper reconsiders the issue of share price reactions to dividend announcements. Previous papers rely almost exclusively on a naive dividend model in which the dividend change is used as a proxy for the dividend surprise. We use the difference between the actual dividend and the analyst consensus forecast as obtained from I/B/E/S as a proxy for the dividend surprise. Using data from Germany, we find significant share price reactions after dividend announcements. Once we control for analys...
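The contrast between the naive proxy and the paper's proxy is a one-line difference; the numbers below are hypothetical:

```python
def naive_surprise(dividend, prior_dividend):
    # earlier studies: the dividend change as proxy for the surprise
    return dividend - prior_dividend

def analyst_surprise(dividend, consensus_forecast):
    # this paper: deviation from the analyst consensus (e.g. I/B/E/S)
    return dividend - consensus_forecast

# hypothetical case: a dividend cut that analysts expected to be even deeper
d, d_prev, forecast = 1.00, 1.20, 0.90
change = naive_surprise(d, d_prev)        # negative: looks like bad news
surprise = analyst_surprise(d, forecast)  # positive: beats the consensus
```

The two proxies can disagree even in sign, which is why conditioning on analyst forecasts changes the measured announcement effect.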

  10. Charming surprise

    CERN Multimedia

    Antonella Del Rosso

    2011-01-01

    The CP violation in charm quarks has always been thought to be extremely small. So, looking at particle decays involving matter and antimatter, the LHCb experiment has recently been surprised to observe that things might be different. Theorists are on the case.   The study of the physics of the charm quark was not in the initial plans of the LHCb experiment, whose letter “b” stands for “beauty quark”. However, already one year ago, the Collaboration decided to look into a wider spectrum of processes that involve charm quarks among other things. The LHCb trigger allows a lot of these processes to be selected, and, among them, one has recently shown interesting features. Other experiments at b-factories have already performed the same measurement but this is the first time that it has been possible to achieve such high precision, thanks to the huge amount of data provided by the very high luminosity of the LHC. “We have observed the decay modes of t...

  11. Reflection, A Meta-Model for Learning, and a Proposal To Improve the Quality of University Teaching = Reflexion, el meta-modelo del aprendizaje, y la propuesta del mejoramiento de la calidad de la docencia.

    Science.gov (United States)

    Montgomery, Joel R.

    This paper, in both English and Spanish, offers a meta-model of the learning process which focuses on the importance of the reflective learning process in enhancing the quality of learning in higher education. This form of innovative learning is offered as a means of helping learners to realize the relevance of what they are learning to their life…

  12. Salience and attention in surprisal-based accounts of language processing

    Directory of Open Access Journals (Sweden)

    Alessandra eZarcone

    2016-06-01

    Full Text Available The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g. visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g. prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalise upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus.

  13. Viral marketing: the use of surprise

    NARCIS (Netherlands)

    Lindgreen, A.; Vanhamme, J.; Clarke, I.; Flaherty, T.B.

    2005-01-01

    Viral marketing involves consumers passing along a company's marketing message to their friends, family, and colleagues. This chapter reviews viral marketing campaigns and argues that the emotion of surprise often is at work and that this mechanism resembles that of word-of-mouth marketing.

  14. Distinct medial temporal networks encode surprise during motivation by reward versus punishment

    Science.gov (United States)

    Murty, Vishnu P.; LaBar, Kevin S.; Adcock, R. Alison

    2016-01-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also impacts neural and mnemonic encoding of surprising events. During functional magnetic resonance imaging (fMRI), participants encountered perceptually unexpected events either during the pursuit of rewards or avoidance of punishments. Despite similar levels of motivation across groups, reward and punishment facilitated the processing of surprising events in different medial temporal lobe regions. Whereas during reward motivation, perceptual surprises enhanced activation in the hippocampus, during punishment motivation surprises instead enhanced activation in parahippocampal cortex. Further, we found that reward motivation facilitated hippocampal coupling with ventromedial PFC, whereas punishment motivation facilitated parahippocampal cortical coupling with orbitofrontal cortex. Behaviorally, post-scan testing revealed that reward, but not punishment, motivation resulted in greater memory selectivity for surprising events encountered during goal pursuit. Together these findings demonstrate that neuromodulatory systems engaged by anticipation of reward and punishment target separate components of the medial temporal lobe, modulating medial temporal lobe sensitivity and connectivity. Thus, reward and punishment motivation yield distinct neural contexts for learning, with distinct consequences for how surprises are incorporated into predictive mnemonic models of the environment. PMID:26854903

  15. Distinct medial temporal networks encode surprise during motivation by reward versus punishment.

    Science.gov (United States)

    Murty, Vishnu P; LaBar, Kevin S; Adcock, R Alison

    2016-10-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also impacts neural and mnemonic encoding of surprising events. During functional magnetic resonance imaging (fMRI), participants encountered perceptually unexpected events either during the pursuit of rewards or avoidance of punishments. Despite similar levels of motivation across groups, reward and punishment facilitated the processing of surprising events in different medial temporal lobe regions. Whereas during reward motivation, perceptual surprises enhanced activation in the hippocampus, during punishment motivation surprises instead enhanced activation in parahippocampal cortex. Further, we found that reward motivation facilitated hippocampal coupling with ventromedial PFC, whereas punishment motivation facilitated parahippocampal cortical coupling with orbitofrontal cortex. Behaviorally, post-scan testing revealed that reward, but not punishment, motivation resulted in greater memory selectivity for surprising events encountered during goal pursuit. Together these findings demonstrate that neuromodulatory systems engaged by anticipation of reward and punishment target separate components of the medial temporal lobe, modulating medial temporal lobe sensitivity and connectivity. Thus, reward and punishment motivation yield distinct neural contexts for learning, with distinct consequences for how surprises are incorporated into predictive mnemonic models of the environment. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Emotional Intelligence and Successful Leadership.

    Science.gov (United States)

    Maulding, Wanda S.

    Cognitive intelligence is often equated with eventual success in many areas. However, there are many instances where people of high IQ flounder whereas those of modest IQ do surprisingly well. Author and renowned psychologist Daniel Goleman believes that the explanation for this fact lies in abilities called "emotional intelligence,"…

  17. Salience and Attention in Surprisal-Based Accounts of Language Processing.

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus.
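Surprisal in this literature is the negative log-probability of a word given its context. A toy sketch with a bigram model (the corpus and the add-one smoothing choice are illustrative):

```python
import math
from collections import Counter

# toy bigram language model; surprisal(w | prev) = -log2 P(w | prev)
corpus = "the cat sat on the mat and the dog sat on the rug".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
V = len(unigrams)

def surprisal(prev, word):
    # add-one smoothing so unseen continuations get finite surprisal
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + V)
    return -math.log2(p)
```

In surprisal-based accounts, higher values predict greater processing cost, e.g. longer reading times on less predictable words.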

  18. Salience and Attention in Surprisal-Based Accounts of Language Processing

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  19. Charming surprise

    CERN Multimedia

    Antonella Del Rosso

    2011-01-01

    The CP violation in charm quarks has always been thought to be extremely small. So, looking at particle decays involving matter and antimatter, the LHCb experiment has recently been surprised to observe that things might be different. Theorists are on the case. The study of the physics of the charm quark was not in the initial plans of the LHCb experiment, whose letter “b” stands for “beauty quark”. However, already one year ago, the Collaboration decided to look into a wider spectrum of processes that involve charm quarks among other things. The LHCb trigger allows a lot of these processes to be selected, and, among them, one has recently shown interesting features. Other experiments at b-factories have already performed the same measurement but this is the first time that it has been possible to achieve such high precision, thanks to the huge amount of data provided by the very high luminosity of the LHC. “We have observed the decay modes of the D0, a pa...

  20. The Surprise Examination Paradox and the Second Incompleteness Theorem

    OpenAIRE

    Kritchman, Shira; Raz, Ran

    2010-01-01

    We give a new proof for Godel's second incompleteness theorem, based on Kolmogorov complexity, Chaitin's incompleteness theorem, and an argument that resembles the surprise examination paradox. We then go the other way around and suggest that the second incompleteness theorem gives a possible resolution of the surprise examination paradox. Roughly speaking, we argue that the flaw in the derivation of the paradox is that it contains a hidden assumption that one can prove the consistency of the...

  1. Pupil size tracks perceptual content and surprise.

    Science.gov (United States)

    Kloosterman, Niels A; Meindertsma, Thomas; van Loon, Anouk M; Lamme, Victor A F; Bonneh, Yoram S; Donner, Tobias H

    2015-04-01

    Changes in pupil size at constant light levels reflect the activity of neuromodulatory brainstem centers that control global brain state. These endogenously driven pupil dynamics can be synchronized with cognitive acts. For example, the pupil dilates during the spontaneous switches of perception of a constant sensory input in bistable perceptual illusions. It is unknown whether this pupil dilation only indicates the occurrence of perceptual switches, or also their content. Here, we measured pupil diameter in human subjects reporting the subjective disappearance and re-appearance of a physically constant visual target surrounded by a moving pattern ('motion-induced blindness' illusion). We show that the pupil dilates during the perceptual switches in the illusion and a stimulus-evoked 'replay' of that illusion. Critically, the switch-related pupil dilation encodes perceptual content, with larger amplitude for disappearance than re-appearance. This difference in pupil response amplitude enables prediction of the type of report (disappearance vs. re-appearance) on individual switches (receiver-operating characteristic: 61%). The amplitude difference is independent of the relative durations of target-visible and target-invisible intervals and subjects' overt behavioral report of the perceptual switches. Further, we show that pupil dilation during the replay also scales with the level of surprise about the timing of switches, but there is no evidence for an interaction between the effects of surprise and perceptual content on the pupil response. Taken together, our results suggest that pupil-linked brain systems track both the content of, and surprise about, perceptual events. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  2. The conceptualization model problem—surprise

    Science.gov (United States)

    Bredehoeft, John

    2005-03-01

    The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance. It is the conceptual model that forms the basis of a model analysis. A surprise occurs when the model is invalidated by new data; in the terms defined here, a surprise is equivalent to a paradigm shift. Limited empirical data indicate that surprises appear in 20 to 30% of model analyses. These data suggest that groundwater analysis presents difficulties when it comes to choosing the appropriate conceptual model. There is no remedy to the conceptual model problem other than: (1) to gather as much data as possible using all applicable methods, since the complementary-data approach can lead to new information that changes the conceptual model, and (2) the analyst must remain open to the fact

  3. Glial heterotopia of maxilla: A clinical surprise

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Mahalik

    2011-01-01

    Full Text Available Glial heterotopia is a rare congenital mass lesion which often presents as a clinical surprise. We report a case of extranasal glial heterotopia in a neonate with unusual features. The presentation, management strategy, etiopathogenesis and histopathology of the mass lesion has been reviewed.

  4. Beyond surprise : A longitudinal study on the experience of visual-tactual incongruities in products

    NARCIS (Netherlands)

    Ludden, G.D.S.; Schifferstein, H.N.J.; Hekkert, P.

    2012-01-01

    When people encounter products with visual-tactual incongruities, they are likely to be surprised because the product feels different than expected. In this paper, we investigate (1) the relationship between surprise and the overall liking of the products, (2) the emotions associated with surprise,

  5. A Statistical Analysis of the Relationship between Harmonic Surprise and Preference in Popular Music.

    Science.gov (United States)

    Miles, Scott A; Rosen, David S; Grzywacz, Norberto M

    2017-01-01

    Studies have shown that some musical pieces may preferentially activate reward centers in the brain. Less is known, however, about the structural aspects of music that are associated with this activation. Based on the music cognition literature, we propose two hypotheses for why some musical pieces are preferred over others. The first, the Absolute-Surprise Hypothesis, states that unexpected events in music directly lead to pleasure. The second, the Contrastive-Surprise Hypothesis, proposes that the juxtaposition of unexpected events and subsequent expected events leads to an overall rewarding response. We tested these hypotheses within the framework of information theory, using the measure of "surprise." This information-theoretic variable mathematically describes how improbable an event is given a known distribution. We performed a statistical investigation of surprise in the harmonic structure of songs within a representative corpus of Western popular music, namely, the McGill Billboard Project corpus. We found that chords of songs in the top quartile of the Billboard chart showed greater average surprise than those in the bottom quartile. We also found that the different sections within top-quartile songs varied more in their average surprise than the sections within bottom-quartile songs. The results of this study are consistent with both the Absolute- and Contrastive-Surprise Hypotheses. Although these hypotheses seem contradictory to one another, we cannot yet discard the possibility that both absolute and contrastive types of surprise play roles in the enjoyment of popular music. We call this possibility the Hybrid-Surprise Hypothesis. The results of this statistical investigation have implications for both music cognition and the human neural mechanisms of esthetic judgments.
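The information-theoretic "surprise" of a chord can be sketched as its negative log-probability under a corpus-wide chord distribution. A generic illustration, not the McGill Billboard pipeline itself (the chord labels and sections below are invented):

```python
import math
from collections import Counter

def chord_surprise_profile(corpus_chords, song_sections):
    """Average surprise per song section, relative to a corpus-wide
    chord distribution (generic sketch of the measure used)."""
    counts = Counter(corpus_chords)
    total = sum(counts.values())
    def surprise(ch):
        # -log2 of corpus probability; assumes the chord occurs in the corpus
        return -math.log2(counts[ch] / total)
    return [sum(map(surprise, sec)) / len(sec) for sec in song_sections]

corpus = ["I", "IV", "V", "I", "IV", "I", "V", "I", "vi", "I", "IV", "V"]
sections = [["I", "IV", "V"], ["vi", "IV"]]
profile = chord_surprise_profile(corpus, sections)
```

Comparing such averages across chart quartiles, and comparing their variance across sections within a song, is the kind of analysis the abstract describes.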

  6. Verifiable metamodels for nitrate losses to drains and groundwater in the Corn Belt, USA

    Science.gov (United States)

    Nolan, Bernard T.; Malone, Robert W.; Gronberg, Jo Ann M.; Thorp, K.R.; Ma, Liwang

    2012-01-01

    Nitrate leaching in the unsaturated zone poses a risk to groundwater, whereas nitrate in tile drainage is conveyed directly to streams. We developed metamodels (MMs) consisting of artificial neural networks to simplify and upscale mechanistic fate and transport models for prediction of nitrate losses by drains and leaching in the Corn Belt, USA. The two final MMs predicted nitrate concentration and flux, respectively, in the shallow subsurface. Because each MM considered both tile drainage and leaching, they represent an integrated approach to vulnerability assessment. The MMs used readily available data comprising farm fertilizer nitrogen (N), weather data, and soil properties as inputs; therefore, they were well suited for regional extrapolation. The MMs effectively related the outputs of the underlying mechanistic model (Root Zone Water Quality Model) to the inputs (R2 = 0.986 for the nitrate concentration MM). Predicted nitrate concentration was compared with measured nitrate in 38 samples of recently recharged groundwater, yielding a Pearson’s r of 0.466 (p = 0.003). Predicted nitrate generally was higher than that measured in groundwater, possibly as a result of the time-lag for modern recharge to reach well screens, denitrification in groundwater, or interception of recharge by tile drains. In a qualitative comparison, predicted nitrate concentration also compared favorably with results from a previous regression model that predicted total N in streams.
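The metamodel idea, namely fitting a cheap statistical model to input/output pairs sampled from an expensive mechanistic model and reporting a fit statistic such as R2, can be illustrated generically. This sketch uses a hypothetical one-input simulator and a polynomial surrogate rather than the paper's RZWQM-and-neural-network pipeline:

```python
import numpy as np

def expensive_model(x):
    # stand-in for a mechanistic simulator (hypothetical test function)
    return np.sin(3 * x) + 0.5 * x

# sample the simulator, then fit a cheap surrogate (the metamodel)
x_train = np.linspace(0, 2, 25)
y_train = expensive_model(x_train)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=7))

# goodness of fit of the metamodel on held-out points
x_test = np.linspace(0.05, 1.95, 50)
y_true, y_hat = expensive_model(x_test), surrogate(x_test)
r2 = 1 - np.sum((y_true - y_hat) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
```

Once fitted, the surrogate can be evaluated at regional scale for a tiny fraction of the simulator's cost, which is what makes upscaling feasible.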

  7. Metamodeling as a tool to size vegetative filter strips for surface runoff pollution control in European watersheds.

    Science.gov (United States)

    Lauvernet, Claire; Muñoz-Carpena, Rafael; Carluer, Nadia

    2015-04-01

    influence and interactions, and set priorities for data collecting and management. Based on the GSA results, we compared several mathematical methods for computing the metamodel, and then validated it on an agricultural watershed in north-western France using real data. The analysis procedure yields a robust, validated metamodel before it is extended to other climatic conditions, making application to a wide range of European watersheds possible. The tool will allow comparison of field scenarios and validation/improvement of existing VFS placements and sizing.

  8. Ignorance, Vulnerability and the Occurrence of "Radical Surprises": Theoretical Reflections and Empirical Findings

    Science.gov (United States)

    Kuhlicke, C.

    2009-04-01

    By definition, natural disasters always contain a moment of surprise. Their occurrence is mostly unforeseen and unexpected. They hit people unprepared, overwhelm them, and expose their helplessness. Yet surprisingly little is known about the reasons for this surprise. Aren't natural disasters expectable and foreseeable after all? Aren't the return periods of most hazards well known, and shouldn't people be better prepared? The central question of this presentation is hence: why do natural disasters so often radically surprise people, and how can we explain this? In the first part of the presentation, it is argued that most approaches to vulnerability are not able to grasp this moment of surprise. On the contrary, they have their strength in unravelling the expectable: a person who is marginalized or even oppressed in everyday life is also vulnerable during times of crisis and stress; at least this is the central assumption of most vulnerability studies. In the second part, an understanding of vulnerability is developed which allows taking such radical surprises into account. First, two forms of the unknown are differentiated: an area of the unknown an actor is more or less aware of (ignorance), and an area which is not even known to be unknown (nescience). The discovery of the latter is mostly associated with a "radical surprise", since it is by definition impossible to prepare for it. Second, a definition of vulnerability is proposed which captures the dynamics of surprise: people are vulnerable when they discover that their nescience exceeds the previously established routines, stocks of knowledge, and resources (in a general sense, their capacities) with which they deal with their physical and/or social environment. This definition explicitly takes the view of different actors seriously and departs from their being surprised. In the third part, findings of a case study, the 2002 flood in Germany, are presented. It is shown

  9. Decision-making under surprise and uncertainty: Arsenic contamination of water supplies

    Science.gov (United States)

    Randhir, Timothy O.; Mozumder, Pallab; Halim, Nafisa

    2018-05-01

    With ignorance and potential surprise dominating decision making in water resources, a framework for dealing with such uncertainty is a critical need in hydrology. We operationalize the 'potential surprise' criterion proposed by Shackle, Vickers, and Katzner (SVK) to derive decision rules to manage water resources under uncertainty and ignorance. We apply this framework to managing water supply systems in Bangladesh that face severe, naturally occurring arsenic contamination. The uncertainty involved with arsenic in water supplies makes conventional decision analysis ineffective. Given the uncertainty and surprise involved in such cases, we find that optimal decisions tend to favor actions that avoid irreversible outcomes instead of conventional cost-effective actions. We observe that diversification of the water supply system also emerges as a robust strategy to avert unintended outcomes of water contamination. Shallow wells had a slightly higher optimal allocation (36%) compared to deep wells and surface treatment, each at roughly 32%. The approach can be applied in a variety of other cases that involve decision making under uncertainty and surprise, a frequent situation in natural resources management.

  10. Sleeping beauties in theoretical physics 26 surprising insights

    CERN Document Server

    Padmanabhan, Thanu

    2015-01-01

    This book addresses a fascinating set of questions in theoretical physics that will both entertain and enlighten students, teachers, researchers, and other physics aficionados. These range from Newtonian mechanics to quantum field theory and cover several puzzling issues that do not appear in standard textbooks. Some topics cover conceptual conundrums, the solutions to which lead to surprising insights; some correct popular misconceptions in the textbook discussion of certain topics; others illustrate deep connections between apparently unconnected domains of theoretical physics; and a few provide remarkably simple derivations of results which are not often appreciated. The connoisseur of theoretical physics will enjoy a feast of pleasant surprises skilfully prepared by an internationally acclaimed theoretical physicist. Each topic is introduced with proper background discussion and special effort is taken to make the discussion self-contained, clear and comprehensible to anyone with an undergraduate e...

  11. The June surprises: balls, strikes, and the fog of war.

    Science.gov (United States)

    Fried, Charles

    2013-04-01

    At first, few constitutional experts took seriously the argument that the Patient Protection and Affordable Care Act exceeded Congress's power under the commerce clause. The highly political opinions of two federal district judges of no particular distinction - carefully chosen by the challenging plaintiffs - did not shake the confidence that the act was constitutional. This disdain for the challengers' arguments was only confirmed when the act was upheld by two highly respected conservative court of appeals judges in two separate circuits. But after the hostile, even mocking questioning of the government's advocate in the Supreme Court by the five Republican-appointed justices, the expectation was that the act would indeed be struck down on that ground. So it came as no surprise when the five opined that the act did indeed exceed Congress's commerce clause power. But it came as a great surprise when Chief Justice John Roberts, joined by the four Democrat-appointed justices, ruled that the act could be sustained as an exercise of Congress's taxing power - a ground urged by the government almost as an afterthought. It was further surprising, even shocking, that Justices Antonin Scalia, Anthony Kennedy, Clarence Thomas, and Samuel Alito not only wrote a joint opinion on the commerce clause virtually identical to that of their chief, but that in writing it they did not refer to or even acknowledge his opinion. Finally surprising was the fact that Justices Ruth Bader Ginsburg and Stephen Breyer joined the chief in holding that aspects of the act's Medicaid expansion were unconstitutional. This essay ponders and tries to unravel some of these puzzles.

  12. Parameter identification and global sensitivity analysis of Xin'anjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng Song

    2013-01-01

    Full Text Available Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification: it can identify the dominant parameters, reduce calibration uncertainty, and enhance optimization efficiency. There are, however, some shortcomings in classical approaches, including the long runtimes and high computational cost required to quantitatively assess the sensitivity of a multiple-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and ten parameters were then selected for quantification of sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
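
    The first, qualitative step of this framework (elementary-effects screening in the style of Morris) can be sketched in a few lines. This is a simplified radial one-at-a-time design on an invented three-factor test function, not the Xin'anjiang model or the exact trajectory scheme used in the paper.

```python
import numpy as np

def morris_mu_star(f, k, r=50, delta=0.25, seed=0):
    """Mean absolute elementary effect (mu*) for each of k factors in [0, 1]."""
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)  # leave room for a +delta step
        fx = f(x)
        for i in range(k):                         # one-at-a-time perturbations
            x2 = x.copy()
            x2[i] += delta
            ee[t, i] = (f(x2) - fx) / delta
    return np.abs(ee).mean(axis=0)

# Toy model: factor 0 dominates, factor 1 is moderate, factor 2 is inert.
g = lambda x: 10.0 * x[0] + 2.0 * x[1] ** 2 + 0.0 * x[2]
mu = morris_mu_star(g, k=3)
ranking = list(np.argsort(mu)[::-1])   # expected ranking: [0, 1, 2]
print(ranking)
```

    Factors with negligible mu* (here factor 2) would be fixed at nominal values, and only the influential ones passed on to the costlier variance-based Sobol step.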

  13. X rays and radioactivity: a complete surprise

    International Nuclear Information System (INIS)

    Radvanyi, P.; Bordry, M.

    1995-01-01

    The discoveries of X rays and of radioactivity came as complete experimental surprises; physicists at that time had no hint of a possible structure of atoms. It is difficult now, knowing what we know, to place ourselves in the spirit, astonishment and questioning of these years, between 1895 and 1903. The nature of X rays was soon hypothesized, but the nature of the rays emitted by uranium, polonium and radium was much more difficult to disentangle, as they were a mixture of different types of radiation. The origin of the energy continuously released in radioactivity remained a complete mystery for a few years. The multiplicity of the radioactive substances soon became a difficult matter: what was real and what was induced? Isotopy was still far ahead. It appeared that some radioactive substances had "half-lives": were they genuine radioactive elements, or was it just a transitory phenomenon? Henri Becquerel (in 1900) and Pierre and Marie Curie (in 1902) hesitated over the correct answer. Only after Ernest Rutherford and Frederick Soddy established that radioactivity was the transmutation of one element into another could one understand that a solid element transformed into a gaseous element, which in turn transformed itself into a succession of solid radioactive elements. It was only in 1913, after the discovery of the atomic nucleus, that Henry Moseley showed, through precise measurements of X ray spectra, that the number of electrons of a given atom, and the charge of its nucleus, was equal to its atomic number in the periodic table. (authors)

  14. X rays and radioactivity: a complete surprise

    Energy Technology Data Exchange (ETDEWEB)

    Radvanyi, P. [Laboratoire National Saturne, Centre d`Etudes de Saclay, 91 - Gif-sur-Yvette (France); Bordry, M. [Institut du Radium, 75 - Paris (France)

    1995-12-31

    The discoveries of X rays and of radioactivity came as complete experimental surprises; physicists at that time had no hint of a possible structure of atoms. It is difficult now, knowing what we know, to place ourselves in the spirit, astonishment and questioning of these years, between 1895 and 1903. The nature of X rays was soon hypothesized, but the nature of the rays emitted by uranium, polonium and radium was much more difficult to disentangle, as they were a mixture of different types of radiation. The origin of the energy continuously released in radioactivity remained a complete mystery for a few years. The multiplicity of the radioactive substances soon became a difficult matter: what was real and what was induced? Isotopy was still far ahead. It appeared that some radioactive substances had "half-lives": were they genuine radioactive elements, or was it just a transitory phenomenon? Henri Becquerel (in 1900) and Pierre and Marie Curie (in 1902) hesitated over the correct answer. Only after Ernest Rutherford and Frederick Soddy established that radioactivity was the transmutation of one element into another could one understand that a solid element transformed into a gaseous element, which in turn transformed itself into a succession of solid radioactive elements. It was only in 1913, after the discovery of the atomic nucleus, that Henry Moseley showed, through precise measurements of X ray spectra, that the number of electrons of a given atom, and the charge of its nucleus, was equal to its atomic number in the periodic table. (authors).

  15. Distinct medial temporal networks encode surprise during motivation by reward versus punishment

    OpenAIRE

    Murty, Vishnu P.; LaBar, Kevin S.; Adcock, R. Alison

    2016-01-01

    Adaptive motivated behavior requires predictive internal representations of the environment, and surprising events are indications for encoding new representations of the environment. The medial temporal lobe memory system, including the hippocampus and surrounding cortex, encodes surprising events and is influenced by motivational state. Because behavior reflects the goals of an individual, we investigated whether motivational valence (i.e., pursuing rewards versus avoiding punishments) also...

  16. Successful and unsuccessful psychopaths: a neurobiological model.

    Science.gov (United States)

    Gao, Yu; Raine, Adrian

    2010-01-01

    Despite increasing interest in psychopathy research, surprisingly little is known about the etiology of non-incarcerated, successful psychopaths. This review provides an analysis of current knowledge on the similarities and differences between successful and unsuccessful psychopaths derived from five population sources: community samples, individuals from employment agencies, college students, industrial psychopaths, and serial killers. An initial neurobiological model of successful and unsuccessful psychopathy is outlined. It is hypothesized that successful psychopaths have intact or enhanced neurobiological functioning that underlies their normal or even superior cognitive functioning, which in turn helps them to achieve their goals using more covert and nonviolent methods. In contrast, in unsuccessful, caught psychopaths, brain structural and functional impairments together with autonomic nervous system dysfunction are hypothesized to underlie cognitive and emotional deficits and more overt violent offending.

  17. Managing Uncertainty: Soviet Views on Deception, Surprise, and Control

    National Research Council Canada - National Science Library

    Hull, Andrew

    1989-01-01

    In the first two cases (deception and surprise), the emphasis is on how the Soviets seek to sow uncertainty in the minds of the enemy and how they then plan to use that uncertainty to gain military advantage...

  18. Successful Internet Entrepreneurs Don't Have to Be College Dropouts: A Model for Nurturing College Students to Become Successful Internet Entrepreneurs

    Science.gov (United States)

    Zhang, Sonya

    2014-01-01

    Some of today's most successful Internet entrepreneurs didn't graduate from college. Many young people have followed the same path to pursue their dreams but ended up failing; this is no surprise, given that 80% of startups fail in their first 5 years. As technology innovation and market competition on the Internet continue to accelerate, college students…

  19. Effects of surprisal and locality on Danish sentence processing

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Kizach, Johannes

    2017-01-01

    An eye-tracking experiment in Danish investigates two dominant accounts of sentence processing: locality-based theories that predict a processing advantage for sentences where the distance between the major syntactic heads is minimized, and the surprisal theory which predicts that processing time...

  20. Potential success factors in brand development

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Grunert, Klaus G.; Poulsen, Carsten Stig

    2005-01-01

    to the marketing of the brand." The branding literature mentions many important aspects, factors, issues, brand requirements, steps, building blocks or guidelines for building strong brands. However, these are all quite general and abstract. Given the substantial body of literature on branding, surprisingly few......? This is the question we want to answer. More specifically, we want to identify potential success factors in building strong brands, understood as brands with high consumer-based brand equity. Keller (1993, p. 2) defined customer-based brand equity as "the differential effect of brand knowledge on consumer response...... of this paper is to identify potential success factors in developing strong brands and to test whether these factors can be used to discriminate between strong and weak brands. It does so through a review of the literature for potential success factors. Furthermore, to ensure that important factors have...

  1. Factors favorable to public participation success

    International Nuclear Information System (INIS)

    Peelle, E.; Schweitzer, M.; Munro, J.; Carnes, S.; Wolfe, A.

    1996-01-01

    Categories of factors linked to successful public participation (PP) program outcomes include the PP process, organizational context, sociopolitical context, strategic considerations, and unique (special circumstances) factors. We re-order the long list of factors according to how essential, important, and unique they are, and discuss their significance and interrelationships. It is argued that bureaucratic structure and operational modes are basically in conflict with features of successful PP programs (openness, two-way education, communication with nonexpert outsiders). If this is so, then it is not surprising that the factors essential for PP success in bureaucracies involve extraordinary management efforts by agencies to bypass, compensate for, or overcome structural constraints. We conclude by speculating about the long-term viability of PP practices in the agency setting, as well as the consequences for agencies that attempt the problematic task of introducing PP into their complex, mission-oriented organizations

  2. Factors favorable to public participation success

    Energy Technology Data Exchange (ETDEWEB)

    Peelle, E.; Schweitzer, M.; Munro, J.; Carnes, S.; Wolfe, A.

    1996-05-01

    Categories of factors linked to successful public participation (PP) program outcomes include the PP process, organizational context, sociopolitical context, strategic considerations, and unique (special circumstances) factors. We re-order the long list of factors according to how essential, important, and unique they are, and discuss their significance and interrelationships. It is argued that bureaucratic structure and operational modes are basically in conflict with features of successful PP programs (openness, two-way education, communication with nonexpert outsiders). If this is so, then it is not surprising that the factors essential for PP success in bureaucracies involve extraordinary management efforts by agencies to bypass, compensate for, or overcome structural constraints. We conclude by speculating about the long-term viability of PP practices in the agency setting, as well as the consequences for agencies that attempt the problematic task of introducing PP into their complex, mission-oriented organizations.

  3. Diagnostic reasoning strategies and diagnostic success.

    Science.gov (United States)

    Coderre, S; Mandin, H; Harasym, P H; Fick, G H

    2003-08-01

    Cognitive psychology research supports the notion that experts use mental frameworks or "schemes", both to organize knowledge in memory and to solve clinical problems. The central purpose of this study was to determine the relationship between problem-solving strategies and the likelihood of diagnostic success. Think-aloud protocols were collected to determine the diagnostic reasoning used by experts and non-experts when attempting to diagnose clinical presentations in gastroenterology. Using logistic regression analysis, the study found a relationship between diagnostic reasoning strategy and the likelihood of diagnostic success. Compared to hypothetico-deductive reasoning, the odds of diagnostic success were significantly greater when subjects used the diagnostic strategies of pattern recognition and scheme-inductive reasoning. Two other factors emerged as independent determinants of diagnostic success: expertise and clinical presentation. Not surprisingly, experts outperformed novices, while the content areas of the clinical cases in the four clinical presentations demonstrated varying degrees of difficulty and thus diagnostic success. These findings have significant implications for medical educators: they support the introduction of "schemes" as a means of enhancing memory organization and improving diagnostic success.
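
    The statistical pattern described, logistic regression of diagnostic success on reasoning strategy with hypothetico-deductive reasoning as the reference level, can be illustrated as below. The data and success rates are fabricated; only the modelling pattern mirrors the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
strategies = ["hypothetico-deductive", "pattern-recognition", "scheme-inductive"]
p_success = {"hypothetico-deductive": 0.45,   # invented success probabilities
             "pattern-recognition": 0.75,
             "scheme-inductive": 0.70}

# Simulate 1500 diagnostic attempts, each using one strategy
strat = rng.choice(strategies, size=1500)
y = (rng.random(1500) < np.array([p_success[s] for s in strat])).astype(int)

# Dummy-code strategy, with hypothetico-deductive as the reference level
X = np.column_stack([(strat == s).astype(float)
                     for s in ["pattern-recognition", "scheme-inductive"]])
clf = LogisticRegression().fit(X, y)

# exp(coefficient) = odds of success relative to the reference strategy
odds_ratios = np.exp(clf.coef_[0])
print(dict(zip(["pattern-recognition", "scheme-inductive"],
               np.round(odds_ratios, 2))))
```

    An odds ratio above 1 for a strategy means higher odds of a correct diagnosis than under hypothetico-deductive reasoning, which is the direction of effect the abstract reports.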

  4. Surprise and Memory as Indices of Concrete Operational Development

    Science.gov (United States)

    Achenbach, Thomas M.

    1973-01-01

    Normal and retarded children's use of color, number, length and continuous quantity as attributes of identification was assessed by presenting them with contrived changes in three properties. Surprise and correct memory responses for color preceded those to number, which preceded logical verbal responses to a conventional number-conservation task.…

  5. FLCNDEMF: An Event Metamodel for Flood Process Information Management under the Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Nengcheng Chen

    2015-06-01

    Full Text Available Significant economic losses, large affected populations, and serious environmental damage caused by recurrent natural disaster events (NDE worldwide indicate insufficiency in emergency preparedness and response. The barrier of full life cycle data preparation and information support is one of the main reasons. This paper adopts the method of integrated environmental modeling, incorporates information from existing event protocols, languages, and models, analyzes observation demands from different event stages, and forms the abstract full life cycle natural disaster event metamodel (FLCNDEM based on meta-object facility. Then task library and knowledge base for floods are built to instantiate FLCNDEM, forming the FLCNDEM for floods (FLCNDEMF. FLCNDEMF is formalized according to Event Pattern Markup Language, and a prototype system, Natural Disaster Event Manager, is developed to assist in the template-based modeling and management. The flood in Liangzi (LZ Lake of Hubei, China on 16 July 2010 is adopted to illustrate how to apply FLCNDEM in real scenarios. FLCNDEM-based modeling is realized, and the candidate remote sensing (RS dataset for different observing missions are provided for LZ Lake flood. Taking the mission of flood area extraction as an example, the appropriate RS data are selected via the model of simplified general perturbation version 4, and the flood area in different phases are calculated and displayed on the map. The phase-based modeling and visualization intuitively display the spatial-temporal distribution and the evolution process of the LZ Lake flood, and it is of great significance for flood responding. In addition, through the extension mechanism, FLCNDEM can also be applied in other environmental applications, providing important support for full life cycle information sharing and rapid responding.

  6. A META-MODELLING SERVICE PARADIGM FOR CLOUD COMPUTING AND ITS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    F. Cheng

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Service integrators seek opportunities to align the way they manage resources in the service supply chain. Many business organisations can operate new, more flexible business processes that harness the value of a service approach from the customer's perspective. As a relatively new concept, cloud computing and related technologies have rapidly gained momentum in the IT world. This article seeks to shed light on service supply chain issues associated with cloud computing by examining several interrelated questions: service supply chain architecture from a service perspective; the basic clouds of the service supply chain; managerial insights into these clouds; and the commercial value of implementing cloud computing. In particular, to show how these services can be used and how they are involved in utilisation processes, a hypothetical meta-modelling service for cloud computing is proposed. Moreover, the paper defines the managed cloud architecture for a service vendor or service integrator in the cloud computing infrastructure in the service supply chain: IT services, business services, and business processes, which create atomic and composite software services that are used to perform business processes with business service choreographies.

    AFRIKAANSE OPSOMMING: Service integrators are looking for opportunities to align the management of resources in the service supply chain. Many organisations can use new, more flexible business processes that harness the value of a service approach from the client's point of view. As a relatively new concept, cloud computing and related technologies have rapidly gained momentum in the IT world. The article attempts to shed light on service supply chain issues related to cloud computing by examining several related questions: service supply chain architecture from a service perspective; the basic cloud of the service supply chain; management insights into such clouds; and the commercial value of the implementation of

  7. A surprising palmar nevus: A case report

    Directory of Open Access Journals (Sweden)

    Rana Rafiei

    2018-02-01

    Full Text Available A raised palmar or plantar nevus, especially in white people, is an unusual feature. We present an uncommon palmar compound nevus in a 26-year-old woman with a large diameter (6 mm) and a collarette-shaped margin. In the histopathologic evaluation, intralymphatic protrusions of nevic nests were noted. This case was surprising to us for several reasons: the size, shape, location, and histopathology of the lesion. Palmar nevi are usually junctional (flat) and below 3 mm in diameter, and intralymphatic protrusion or invasion in nevi is an extremely rare phenomenon.

  8. Optimization model of a system of crude oil distillation units with heat integration and metamodeling

    International Nuclear Information System (INIS)

    Lopez, Diana C; Mahecha, Cesar A; Hoyos, Luis J; Acevedo, Leonardo; Villamizar Jaime F

    2010-01-01

    The process of crude distillation has a considerable impact on the economy of any refinery. It is therefore necessary to improve it, taking good advantage of the available infrastructure and generating products that meet specifications without violating equipment operating constraints or plant restrictions at industrial units. The objective of this paper is to present the development of an optimization model for a Crude Distillation Unit (CDU) system at an ECOPETROL S.A. refinery in Barrancabermeja, involving the typical restrictions (flows according to pipeline capacity, pumps, distillation columns, etc.) and a restriction that has not been included in bibliographic reports for this type of model: the heat integration of streams from Atmospheric Distillation Towers (ADTs) and Vacuum Distillation Towers (VDTs) with the heat exchanger networks for crude pre-heating. The ADTs were modeled with metamodels as functions of column temperatures and pressures, pumparound flows and return temperatures, stripping steam flows, jet fuel EBP ASTM D-86, and diesel EBP ASTM D-86. Pre-heating trains were modeled with mass and energy balances and the design equation of each heat exchanger. The optimization model is an NLP that maximizes the system profit. It was implemented in the GAMS IDE 22.2 using the CONOPT solver, and it found new operating points with better economic results than those obtained with normal operation of the real plants. It predicted optimum operating conditions of three ADTs for constant-composition crude, and calculated the yields and properties of atmospheric products in addition to the temperatures and duties of 27 crude oil exchangers.
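
    The shape of such a model, a nonlinear program that maximizes profit subject to operating constraints, with product yields supplied by a metamodel, can be sketched with SciPy. All numbers, the quadratic yield "metamodel", and the constraint set below are invented for illustration; this is not the ECOPETROL model.

```python
import numpy as np
from scipy.optimize import minimize

price = np.array([60.0, 45.0])   # hypothetical product prices per unit feed
cost_feed = 30.0                 # hypothetical crude cost per unit feed

def product_yields(x):
    # Toy quadratic "metamodel": product yields as a function of a single
    # operating variable x[1] (think of it as a furnace temperature).
    t = x[1]
    return np.array([0.30 + 0.002 * t - 1e-5 * t**2,
                     0.55 - 0.001 * t])

def neg_profit(x):
    feed = x[0]
    return -(feed * product_yields(x) @ price - feed * cost_feed)

# Decision variables: feed rate and "temperature", with capacity limits
bnds = [(10.0, 100.0), (0.0, 100.0)]
cons = [{"type": "ineq", "fun": lambda x: 100.0 - x[0]}]  # feed <= capacity

res = minimize(neg_profit, x0=[50.0, 50.0], bounds=bnds, constraints=cons)
print(res.success, round(res.x[0], 1), round(res.x[1], 1), round(-res.fun, 1))
```

    A real CDU model would have hundreds of such variables and the heat-integration constraints described above; the structure (objective from metamodels, bounds and inequality constraints, a gradient-based NLP solver) is the same.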

  9. Things may not be as expected: Surprising findings when updating ...

    African Journals Online (AJOL)

    2015-05-14


  10. A PROPOSAL OF DATA QUALITY FOR DATA WAREHOUSES ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Leo Willyanto Santoso

    2006-01-01

    Full Text Available The quality of the data provided is critical to the success of data warehousing initiatives. There is strong evidence that many organisations have significant data quality problems, and that these have substantial social and economic impacts. This paper describes a study that explores a metamodel of the dynamic parts of the data warehouse. The metamodel enables data warehouse management, design, and evolution from a high-level conceptual perspective, which can be linked to the actual structural and physical aspects of the data warehouse architecture. Moreover, the metamodel is capable of modeling complex activities, their interrelationships, the relationships of activities with data sources, and execution details.

  11. Estimations of expectedness and potential surprise in possibility theory

    Science.gov (United States)

    Prade, Henri; Yager, Ronald R.

    1992-01-01

    This note investigates how various ideas of 'expectedness' can be captured in the framework of possibility theory. Particularly, we are interested in trying to introduce estimates of the kind of lack of surprise expressed by people when saying 'I would not be surprised that...' before an event takes place, or by saying 'I knew it' after its realization. In possibility theory, a possibility distribution is supposed to model the relative levels of mutually exclusive alternatives in a set, or equivalently, the alternatives are assumed to be rank-ordered according to their level of possibility to take place. Four basic set-functions associated with a possibility distribution, including standard possibility and necessity measures, are discussed from the point of view of what they estimate when applied to potential events. Extensions of these estimates based on the notions of Q-projection or OWA operators are proposed when only significant parts of the possibility distribution are retained in the evaluation. The case of partially-known possibility distributions is also considered. Some potential applications are outlined.
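
    Two of the basic set-functions mentioned, the possibility and necessity measures induced by a possibility distribution, can be stated in a few lines of code. The weather-style distribution below is an invented example; necessity is low exactly when some alternative outside the event remains quite possible, which matches the "I would not be surprised that..." reading discussed above.

```python
# Possibility distribution over mutually exclusive alternatives (invented):
# pi(u) = 1 means u is fully possible, pi(u) = 0 means u is ruled out.
pi = {"sun": 1.0, "clouds": 0.7, "rain": 0.3, "snow": 0.0}

def possibility(A):
    """Pi(A) = max of pi over the alternatives in A."""
    return max(pi[u] for u in A)

def necessity(A):
    """N(A) = 1 - Pi(complement of A): A is certain iff everything else is impossible."""
    comp = set(pi) - set(A)
    return 1.0 - (max(pi[u] for u in comp) if comp else 0.0)

A = {"sun", "clouds"}  # the event "no precipitation"
print(possibility(A), necessity(A))  # Pi = 1.0, N = 1 - 0.3 = 0.7
```

    Here Pi(A) = 1 says the event is fully possible (no surprise if it happens), while N(A) = 0.7 quantifies how unsurprised one would be if it turned out true, since rain still has possibility 0.3.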

  12. Automation surprise : results of a field survey of Dutch pilots

    NARCIS (Netherlands)

    de Boer, R.J.; Hurts, Karel

    2017-01-01

    Automation surprise (AS) has often been associated with aviation safety incidents. Although numerous laboratory studies have been conducted, few data are available from routine flight operations. A survey among a representative sample of 200 Dutch airline pilots was used to determine the prevalence

  13. Models of Automation Surprise: Results of a Field Survey in Aviation

    Directory of Open Access Journals (Sweden)

    Robert De Boer

    2017-09-01

    Full Text Available Automation surprises in aviation continue to be a significant safety concern, and the community's search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration with automation. In this paper, we report the results of a field study that empirically compared and contrasted two models of automation surprises: a normative individual-cognition model and a sensemaking model based on distributed cognition. Our data prove a good fit for the sensemaking model. This finding is relevant for aviation safety, since our understanding of the cognitive processes that govern human interaction with automation drives what we need to do to reduce the frequency of automation-induced events.

  14. WORMS IN SURPRISING PLACES: CLINICAL AND MORPHOLOGICAL FEATURES

    Directory of Open Access Journals (Sweden)

    Myroshnychenko MS

    2013-06-01

    Full Text Available Helminthiases are among the most common human diseases and can involve virtually all organs and systems in the pathological process. In this article, the authors discuss a few cases of typical and atypical localizations of parasitic worms, such as filariae and pinworms, which were recovered from surprising places in the bodies of patients in the Kharkiv region. This article will help doctors in practical health care pay special attention to the timely prevention and diagnostics of this pathology.

  15. Primary Care Practice: Uncertainty and Surprise

    Science.gov (United States)

    Crabtree, Benjamin F.

    I will focus my comments on uncertainty and surprise in primary care practices. I am a medical anthropologist by training, and have been a full-time researcher in family medicine for close to twenty years. In this talk I want to look at primary care practices as complex systems, particularly taking the perspective of translating evidence into practice. I am going to discuss briefly the challenges we have in primary care, and in medicine in general, of translating new evidence into the everyday care of patients. To do this, I will look at two studies that we have conducted on family practices, then think about how practices can be best characterized as complex adaptive systems. Finally, I will focus on the implications of this portrayal for disseminating new knowledge into practice.

  16. What is a surprise earthquake? The example of the 2002 San Giuliano (Italy) event

    Directory of Open Access Journals (Sweden)

    M. Mucciarelli

    2005-06-01

    Full Text Available Both in the scientific literature and in the mass media, some earthquakes are described as «surprise earthquakes». Probably any geologist, seismologist or engineer could, based on personal judgment, compile a list of past «surprise earthquakes». This paper tries to quantify the underlying individual perception that may lead a scientist to apply such a definition to a seismic event. The meaning differs with the disciplinary approach. For geologists, the Italian database of seismogenic sources is still too incomplete to allow a quantitative estimate of the subjective degree of belief. For seismologists, quantification is possible by defining the distance between an earthquake and its closest previous neighbour. Finally, for engineers, the San Giuliano quake could not be considered a surprise, since probabilistic site hazard estimates reveal that the change before and after the earthquake is just 4%.
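The seismologists' criterion mentioned here, distance to the closest previous neighbour, can be sketched as a nearest-neighbour search over catalogue epicentres. The coordinates below are hypothetical stand-ins, not Italian catalogue data:

```python
import math

# Great-circle distance between two epicentres, in kilometres.
def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical catalogue of previously recorded epicentres (lat, lon).
catalog = [(41.7, 14.8), (42.3, 13.4), (40.8, 15.3)]
event = (41.72, 14.89)  # hypothetical new event

# "Surprise" metric: distance to the closest previous neighbour.
nearest_km = min(haversine_km(*event, *pt) for pt in catalog)
```

A large `nearest_km` would flag the event as far from any known prior seismicity.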

  17. Analysis of physiological signals for recognition of boredom, pain, and surprise emotions.

    Science.gov (United States)

    Jang, Eun-Hye; Park, Byoung-Jun; Park, Mi-Sook; Kim, Sang-Hyeob; Sohn, Jin-Hun

    2015-06-18

    The aim of the study was to examine the differences among boredom, pain, and surprise, and to propose approaches for emotion recognition based on physiological signals. The three emotions are induced through the presentation of emotional stimuli, and electrocardiography (ECG), electrodermal activity (EDA), skin temperature (SKT), and photoplethysmography (PPG) signals are measured to collect a dataset from 217 participants experiencing the emotions. Twenty-seven physiological features are extracted from the signals to classify the three emotions. Discriminant function analysis (DFA), a statistical method, and five machine learning algorithms (linear discriminant analysis (LDA), classification and regression trees (CART), self-organizing map (SOM), the Naïve Bayes algorithm, and support vector machine (SVM)) are used to classify the emotions. The results show that the differences in physiological responses among the emotions are significant for heart rate (HR), skin conductance level (SCL), skin conductance response (SCR), mean skin temperature (meanSKT), blood volume pulse (BVP), and pulse transit time (PTT), and that the highest recognition accuracy of 84.7% is obtained with DFA. This study demonstrates the differences among boredom, pain, and surprise and identifies the best emotion recognizer for classifying the three emotions from physiological signals.
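A minimal sketch of the classification step, assuming a nearest-centroid classifier on synthetic 27-dimensional feature vectors (the study used DFA and five machine-learning algorithms; all numbers here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the 27 physiological features (HR, SCL, SCR, ...)
# for three emotion classes: boredom (0), pain (1), surprise (2).
def make_class(mean, n=50, n_features=27):
    return rng.normal(mean, 1.0, size=(n, n_features))

X = np.vstack([make_class(0.0), make_class(2.0), make_class(4.0)])
y = np.repeat([0, 1, 2], 50)

# Nearest-centroid classifier: assign each sample to the class whose
# feature centroid is closest in Euclidean distance.
centroids = np.array([X[y == c].mean(axis=0) for c in range(3)])

def predict(samples):
    d = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

accuracy = (predict(X) == y).mean()
```

A real pipeline would also hold out a test set; this only shows the feature-vector-to-label mapping.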

  18. Sensitivity analysis and metamodeling of a toolchain of models to help size vegetative filter strips in a watershed.

    Science.gov (United States)

    Lauvernet, Claire; Noll, Dorothea; Muñoz-Carpena, Rafael; Carluer, Nadia

    2014-05-01

    agricultural field and the VFS characteristics. These scenarios are based on: 2 types of climates (North and South-west of France), different rainfall intensities and durations, different lengths and slopes of hillslope, different humidity conditions, 4 soil types (silt loam, sandy loam, clay loam, sandy clay loam), 2 crops (wheat and corn) for the contributive area, 2 water table depths (1m and 2.5m) and 4 soil types for the VFS. The sizing method was applied for all these scenarios, and a sensitivity analysis of the VFS optimal length was performed for all the input parameters in order to understand their influence, and to identify for which a special care has to be given. Based on that sensitivity analysis, a metamodel has been developed. The idea is to simplify the whole toolchain and to make it possible to perform the buffer sizing by using a unique tool and a smaller set of parameters, given the available information from the end users. We first compared several mathematical methods to compute the metamodel, and then validated them on an agricultural watershed with real data in the North-West of France.
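The screening step described above can be illustrated with a one-at-a-time (OAT) sensitivity sketch. The surrogate function, its coefficients, and the parameter baselines below are invented stand-ins, not the authors' toolchain or data:

```python
# Hypothetical surrogate for "optimal VFS length" as a function of three
# inputs; the functional form and coefficients are for illustration only.
def vfs_length(slope, rainfall, water_table_depth):
    return 2.0 + 15.0 * slope + 0.05 * rainfall - 1.5 * water_table_depth

base = {"slope": 0.05, "rainfall": 30.0, "water_table_depth": 1.0}

def oat_sensitivity(f, base, rel_step=0.10):
    """Relative change in output for a +10% change in each input, one at a time."""
    y0 = f(**base)
    out = {}
    for name, value in base.items():
        bumped = dict(base, **{name: value * (1 + rel_step)})
        out[name] = (f(**bumped) - y0) / y0
    return out

sens = oat_sensitivity(vfs_length, base)
```

Inputs with the largest relative effect are the ones deserving "special care"; variance-based methods (e.g. Sobol indices) refine this beyond one-at-a-time screening.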

  19. Teacher Supply and Demand: Surprises from Primary Research

    Directory of Open Access Journals (Sweden)

    Andrew J. Wayne

    2000-09-01

    Full Text Available An investigation of primary research studies on public school teacher supply and demand revealed four surprises. Projections show that enrollments are leveling off. Relatedly, annual hiring increases should be only about two or three percent over the next few years. Results from studies of teacher attrition also yield unexpected results. Excluding retirements, only about one in 20 teachers leaves each year, and the novice teachers who quit mainly cite personal and family reasons, not job dissatisfaction. Each of these findings broadens policy makers' options for teacher supply.

  20. Conference of “Uncertainty and Surprise: Questions on Working with the Unexpected and Unknowable”

    CERN Document Server

    McDaniel, Reuben R; Uncertainty and Surprise in Complex Systems : Questions on Working with the Unexpected

    2005-01-01

    Complexity science has been a source of new insight in physical and social systems and has demonstrated that unpredictability and surprise are fundamental aspects of the world around us. This book is the outcome of a discussion meeting of leading scholars and critical thinkers with expertise in complex systems sciences and leaders from a variety of organizations, sponsored by the Prigogine Center at The University of Texas at Austin and the Plexus Institute, to explore strategies for understanding uncertainty and surprise. Besides contributions to the conference, it includes a digest by the editors as well as a commentary by the late Nobel laureate Ilya Prigogine, "Surprises in half of a century". The book is intended for researchers and scientists in complexity science as well as for a broad interdisciplinary audience of both practitioners and scholars. It will serve well those interested in the research issues and in the application of complexity science to physical and social systems.

  1. Models of Automation surprise : results of a field survey in aviation

    NARCIS (Netherlands)

    De Boer, Robert; Dekker, Sidney

    2017-01-01

    Automation surprises in aviation continue to be a significant safety concern and the community’s search for effective strategies to mitigate them is ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration

  2. “Surprise Gift” Purchases of Small Electric Appliances: A Pilot Study

    NARCIS (Netherlands)

    J. Vanhamme (Joëlle); C.J.P.M. de Bont (Cees)

    2005-01-01

    Understanding decision-making processes for gifts is of strategic importance for companies selling small electrical appliances, as gifts account for a large part of their sales. Among all gifts, the ones that are surprising are the most valued by recipients. However, research about

  3. Surprising results: HIV testing and changes in contraceptive practices among young women in Malawi

    Science.gov (United States)

    Sennott, Christie; Yeatman, Sara

    2015-01-01

    This study uses eight waves of data from the population-based Tsogolo la Thanzi study (2009–2011) in rural Malawi to examine changes in young women’s contraceptive practices, including the use of condoms, non-barrier contraceptive methods, and abstinence, following positive and negative HIV tests. The analysis factors in women’s prior perceptions of their HIV status that may already be shaping their behaviour and separates surprise HIV test results from those that merely confirm what was already believed. Fixed effects logistic regression models show that HIV testing frequently affects the contraceptive practices of young Malawian women, particularly when the test yields an unexpected result. Specifically, women who are surprised to test HIV positive increase their condom use and are more likely to use condoms consistently. Following an HIV negative test (whether a surprise or expected), women increase their use of condoms and decrease their use of non-barrier contraceptives; the latter may be due to an increase in abstinence following a surprise negative result. Changes in condom use following HIV testing are robust to the inclusion of potential explanatory mechanisms including fertility preferences, relationship status, and the perception that a partner is HIV positive. The results demonstrate that both positive and negative tests can influence women’s sexual and reproductive behaviours, and emphasise the importance of conceptualizing HIV testing as offering new information only insofar as results deviate from prior perceptions of HIV status. PMID:26160156

  4. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    Science.gov (United States)

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs showed that presentation of happy eyes elicited a larger P1 than neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited a dramatically larger C1 than those in the neutral context, reflecting the modulation by predictions during the earliest stages of face processing. There were larger N170 responses with neutral and fearful eye contexts compared to the happy context, suggesting faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating that motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as prediction and attention can modulate face processing from the earliest stages in a top-down manner. © 2017 Society for Psychophysiological Research.

  5. On the surprising rigidity of the Pauli exclusion principle

    International Nuclear Information System (INIS)

    Greenberg, O.W.

    1989-01-01

    I review recent attempts to construct a local quantum field theory of small violations of the Pauli exclusion principle and suggest a qualitative reason for the surprising rigidity of the Pauli principle. I suggest that small violations can occur in our four-dimensional world as a consequence of the compactification of a higher-dimensional theory in which the exclusion principle is exactly valid. I briefly mention a recent experiment which places a severe limit on possible violations of the exclusion principle. (orig.)

  6. Academic Innovation in the Commercial Domain: Case Studies of Successful Transfers of University-Developed Technologies.

    Science.gov (United States)

    Powers, Joshua B.

    In recent years, considerable attention has been directed toward higher education's role as a driver of economic reform. Yet, surprisingly little is known about the processes and mechanisms by which academic innovations are successfully commercialized. The specific question is, what factors explain why some licensed innovations become bona fide…

  7. Cloud Surprises in Moving NASA EOSDIS Applications into Amazon Web Services

    Science.gov (United States)

    Mclaughlin, Brett

    2017-01-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected is a number of issues that were beyond purely technical application re-architectures. We ran into surprising network policy limitations, billing challenges in a government-based cost model, and difficulty in obtaining certificates in a NASA security-compliant manner. On the other hand, this approach has allowed us to move a number of applications from local hosting to the cloud in a matter of hours (yes, hours!!), and our CMR application now services 95% of granule searches and an astonishing 99% of all collection searches in under a second. And most surprising of all, well, you'll just have to wait and see the realization that caught our entire team off guard!

  8. A surprising exception. Himachal's success in promoting female education.

    Science.gov (United States)

    Dreze, J

    1999-01-01

    Gender inequalities in India are derived partly from the economic dependence of women on men. Low levels of formal education among women reinforce the asymmetry of power between the sexes. A general pattern of sharp gender bias in education levels is noted in most Indian states; however, in the small state of Himachal Pradesh, school participation rates are almost as high for girls as for boys. Rates of school participation for girls at the primary level is close to universal in this state, and while gender bias persists at higher levels of education, it is much lower than elsewhere in India and rapidly declining. This was not the case 50 years ago; educational levels in Himachal Pradesh were no higher than in Bihar or Uttar Pradesh. Today, the spectacular transition towards universal elementary education in Himachal Pradesh has contributed to the impressive reduction of poverty, mortality, illness, undernutrition, and related deprivations.

  9. Risk, surprises and black swans fundamental ideas and concepts in risk assessment and risk management

    CERN Document Server

    Aven, Terje

    2014-01-01

    Risk, Surprises and Black Swans provides an in-depth analysis of the risk concept with a focus on the critical link to knowledge, and the lack of knowledge, that risk and probability judgements are based on. Based on technical scientific research, this book presents a new perspective to help you understand how to assess and manage surprising, extreme events, known as 'Black Swans'. This approach looks beyond the traditional probability-based principles to offer a broader insight into the important aspects of uncertain events and, in doing so, explores ways to manage them.

  10. Spatiotemporal neural characterization of prediction error valence and surprise during reward learning in humans.

    Science.gov (United States)

    Fouragnan, Elsa; Queirazza, Filippo; Retzler, Chris; Mullinger, Karen J; Philiastides, Marios G

    2017-07-06

    Reward learning depends on accurate reward associations with potential choices. These associations can be attained with reinforcement learning mechanisms using a reward prediction error (RPE) signal (the difference between actual and expected rewards) to update future reward expectations. Despite an extensive body of literature on the influence of RPE on learning, little has been done to investigate the potentially separate contributions of RPE valence (positive or negative) and surprise (absolute degree of deviation from expectations). Here, we coupled single-trial electroencephalography with simultaneously acquired fMRI, during a probabilistic reversal-learning task, to offer evidence of temporally overlapping but largely distinct spatial representations of RPE valence and surprise. Electrophysiological variability in RPE valence correlated with activity in regions of the human reward network promoting approach or avoidance learning. Electrophysiological variability in RPE surprise correlated primarily with activity in regions of the human attentional network controlling the speed of learning. Crucially, despite the largely separate spatial extent of these representations, our EEG-informed fMRI approach uniquely revealed a linear superposition of the two RPE components in a smaller network encompassing visuo-mnemonic and reward areas. Activity in this network was further predictive of stimulus value updating, indicating a comparable contribution of both signals to reward learning.
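The two RPE components contrasted in this study are simply the sign and the absolute value of the prediction error. A minimal Rescorla-Wagner-style sketch (the learning rate and values are arbitrary):

```python
# Decompose a reward prediction error (RPE) into valence (sign) and
# surprise (absolute magnitude) inside a basic value update.
def rw_update(expected, reward, alpha=0.3):
    rpe = reward - expected              # prediction error
    valence = 1 if rpe >= 0 else -1      # positive vs negative outcome
    surprise = abs(rpe)                  # degree of deviation from expectation
    new_expected = expected + alpha * rpe
    return new_expected, valence, surprise

v, valence, surprise = rw_update(expected=0.5, reward=1.0)
print(v, valence, surprise)  # 0.65 1 0.5
```

Valence drives the direction of learning; surprise, in this framing, would modulate its speed.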

  11. Efficient reconfigurable hardware architecture for accurately computing success probability and data complexity of linear attacks

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Kavun, Elif Bilge; Tischhauser, Elmar

    2012-01-01

    An accurate estimation of the success probability and data complexity of linear cryptanalysis is a fundamental question in symmetric cryptography. In this paper, we propose an efficient reconfigurable hardware architecture to compute the success probability and data complexity of Matsui's Algorithm...... block lengths ensures that any empirical observations are not due to differences in statistical behavior for artificially small block lengths. Rather surprisingly, we observed in previous experiments a significant deviation between the theory and practice for Matsui's Algorithm 2 for larger block sizes...
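A common rule of thumb behind such estimates is that the data complexity N of a linear attack scales as the inverse square of the bias ε of the linear approximation, N ≈ c/ε². The sketch below illustrates only this scaling; the constant c (which depends on the desired success probability) is chosen arbitrarily, and this is not the paper's hardware architecture:

```python
import math

# Data complexity N ~ c / eps^2 for a linear approximation with bias eps.
def data_complexity(eps, c=8.0):
    return math.ceil(c / eps**2)

# A bias of 2^-10 needs on the order of 2^23 known plaintexts with c = 8.
print(data_complexity(2**-10))
```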

  12. Surprises and counterexamples in real function theory

    CERN Document Server

    Rajwade, A R

    2007-01-01

    This book presents a variety of intriguing, surprising and appealing topics and nonroutine theorems in real function theory. It is a reference book to which one can turn for answers to questions that arise while studying or teaching analysis. Chapter 1 is an introduction to algebraic, irrational and transcendental numbers and contains the Cantor ternary set. Chapter 2 contains functions with extraordinary properties, including functions that are continuous at each point but differentiable at no point. Chapters 4 and 5 cover the intermediate value property, periodic functions, Rolle's theorem, Taylor's theorem and points of tangency. Chapter 6 discusses sequences and series, including the restricted harmonic series, rearrangements of the alternating harmonic series and some number-theoretic aspects. In Chapter 7, the infinite exponential and its peculiar range of convergence are studied. Appendix I deals with some specialized topics. Exercises at the end of chapters and their solutions are provided in Appendix II. This book will be useful for students and teachers alike.

  13. Surprises in the suddenly-expanded infinite well

    International Nuclear Information System (INIS)

    Aslangul, Claude

    2008-01-01

    I study the time evolution of a particle prepared in the ground state of an infinite well after the latter is suddenly expanded. It turns out that the probability density |Ψ(x,t)|² shows quite surprising behaviour: at definite times, plateaux appear on which |Ψ(x,t)|² is constant over finite intervals of x. Elements of a theoretical explanation are given by analysing the singular component of the second derivative ∂²Ψ(x,t)/∂x². Analytical closed-form expressions are obtained for some specific times, which easily allow us to show that, at these times, the density organizes itself into regular patterns provided the size of the box is large enough; moreover, above some critical size depending on the specific time, the density patterns are independent of the expansion parameter. It is seen how the density at these times simply results from a construction game with definite rules acting on the pieces of the initial density.
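The setup can be reproduced numerically by expanding the initial ground state over the eigenstates of the widened well. A sketch in assumed units ħ = 2m = 1; the expansion parameter and grid resolutions are arbitrary choices, not the paper's:

```python
import numpy as np

# A particle starts in the ground state of a well of width 1; the right
# wall suddenly jumps so the well has width a.
a = 2.0                      # expansion parameter (new width / old width)
N = 200                      # number of expanded-well eigenstates retained
x = np.linspace(0, a, 1001)

def phi(n, xs):              # eigenstates of the expanded well [0, a]
    return np.sqrt(2.0 / a) * np.sin(n * np.pi * xs / a)

# Overlap coefficients c_n = <phi_n | psi0>, with the initial ground state
# psi0(x) = sqrt(2) sin(pi x) supported on [0, 1], integrated numerically.
xi = np.linspace(0, 1, 2001)
dxi = xi[1] - xi[0]
psi0 = np.sqrt(2.0) * np.sin(np.pi * xi)
c = np.array([(psi0 * phi(n, xi)).sum() * dxi for n in range(1, N + 1)])

def density(t):
    E = (np.arange(1, N + 1) * np.pi / a) ** 2   # E_n = (n pi / a)^2
    psi = sum(c[k] * phi(k + 1, x) * np.exp(-1j * E[k] * t) for k in range(N))
    return np.abs(psi) ** 2

rho0 = density(0.0)  # reproduces the initial density, zero beyond x = 1
```

Evaluating `density` at rational multiples of the revival time is where the plateau structure described in the abstract shows up.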

  14. NR sulphur vulcanization: Interaction study between TBBS and DPG by means of a combined experimental rheometer and meta-model best fitting strategy

    Energy Technology Data Exchange (ETDEWEB)

    Milani, G., E-mail: gabriele.milani@polimi.it [Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy); Hanel, T.; Donetti, R. [Pirelli Tyre, Via Alberto e Piero Pirelli 25, 20126 Milan (Italy); Milani, F. [Chem. Co, Via J.F. Kennedy 2, 45030 Occhiobello (Italy)

    2016-06-08

    The paper is aimed at studying the possible interaction between two different accelerators (DPG and TBBS) in the chemical kinetics of Natural Rubber (NR) vulcanized with sulphur. The same blend with several DPG and TBBS concentrations is analyzed in depth from an experimental point of view, varying the curing temperature in the range 150-180°C and obtaining rheometer curves in steps of 10°C. In order to study any possible interaction between the two accelerators – and to evaluate its engineering relevance – rheometer data are normalized by means of the well-known Sun and Isayev normalization approach, and two output parameters are assumed meaningful for gaining insight into the possible interaction, namely time at maximum torque and reversion percentage. Two different numerical meta-models, which belong to the family of so-called response surfaces (RS), are compared. The first is linear in TBBS and DPG and therefore reproduces no interaction between the accelerators, whereas the second is a non-linear RS with a bilinear term. Both RS are deduced from standard best fitting of the available experimental data. It is found that, generally, there is some interaction between TBBS and DPG, but that the error introduced by using a linear model (no interaction) is generally lower than 10%, i.e. fully acceptable from an engineering standpoint.
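The two response surfaces can be contrasted with an ordinary least-squares fit. Only the model forms (linear vs bilinear interaction term) follow the abstract; the concentrations, coefficients, and noise below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
tbbs = rng.uniform(0.5, 3.0, 40)
dpg = rng.uniform(0.1, 1.5, 40)
# Synthetic "rheometer output" with a weak interaction built in.
r = 1.0 + 0.8 * tbbs + 0.5 * dpg + 0.1 * tbbs * dpg + rng.normal(0, 0.02, 40)

# Linear RS: r = b0 + b1*TBBS + b2*DPG (no interaction).
X_lin = np.column_stack([np.ones_like(tbbs), tbbs, dpg])
# Bilinear RS: adds an interaction term b3*TBBS*DPG.
X_bil = np.column_stack([X_lin, tbbs * dpg])

def fit_rmse(X):
    beta, *_ = np.linalg.lstsq(X, r, rcond=None)
    return np.sqrt(np.mean((X @ beta - r) ** 2))

rmse_lin, rmse_bil = fit_rmse(X_lin), fit_rmse(X_bil)
# The gap between the two RMSEs measures the size of the interaction.
```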

  15. X-rays from comets - a surprising discovery

    CERN Document Server

    CERN. Geneva

    2000-01-01

    Comets are kilometre-size aggregates of ice and dust, which remained from the formation of the solar system. It was not obvious to expect X-ray emission from such objects. Nevertheless, when comet Hyakutake (C/1996 B2) was observed with the ROSAT X-ray satellite during its close approach to Earth in March 1996, bright X-ray emission from this comet was discovered. This finding triggered a search in archival ROSAT data for comets, which might have accidentally crossed the field of view during observations of unrelated targets. To increase the surprise even more, X-ray emission was detected from four additional comets, which were optically 300 to 30 000 times fainter than Hyakutake. For one of them, comet Arai (C/1991 A2), X-ray emission was even found in data which were taken six weeks before the comet was optically discovered. These findings showed that comets represent a new class of celestial X-ray sources. The subsequent detection of X-ray emission from several other comets in dedicated observations confir...

  16. Beyond interests and institutions: US health policy reform and the surprising silence of big business.

    Science.gov (United States)

    Smyrl, Marc E

    2014-02-01

    Interest-based arguments do not provide satisfying explanations for the surprising reticence of major US employers to take a more active role in the debate surrounding the 2010 Patient Protection and Affordable Care Act (ACA). Through focused comparison with the Bismarckian systems of France and Germany, on the one hand, and with the 1950s and 1960s in the United States, on the other, this article concludes that while institutional elements do account for some of the observed behavior of big business, a necessary complement to this is a fuller understanding of the historically determined legitimating ideology of US firms. From the era of the "corporate commonwealth," US business inherited the principles of private welfare provision and of resistance to any expansion of government control. Once complementary, these principles are now mutually exclusive: employer-provided health insurance increasingly is possible only at the cost of ever-increasing government subsidy and regulation. Paralyzed by the uncertainty that followed from this clash of legitimate ideas, major employers found themselves unable to take a coherent and unified stand for or against the law. As a consequence, they failed either to oppose it successfully or to secure modifications to it that would have been useful to them.

  17. Human Amygdala Tracks a Feature-Based Valence Signal Embedded within the Facial Expression of Surprise.

    Science.gov (United States)

    Kim, M Justin; Mattek, Alison M; Bennett, Randi H; Solomon, Kimberly M; Shin, Jin; Whalen, Paul J

    2017-09-27

    Human amygdala function has been traditionally associated with processing the affective valence (negative vs positive) of an emotionally charged event, especially those that signal fear or threat. However, this account of human amygdala function can be explained by alternative views, which posit that the amygdala might be tuned to either (1) general emotional arousal (activation vs deactivation) or (2) specific emotion categories (fear vs happy). Delineating the pure effects of valence independent of arousal or emotion category is a challenging task, given that these variables naturally covary under many circumstances. To circumvent this issue and test the sensitivity of the human amygdala to valence values specifically, we measured the dimension of valence within the single facial expression category of surprise. Given the inherent valence ambiguity of this category, we show that surprised expression exemplars are attributed valence and arousal values that are uniquely and naturally uncorrelated. We then present fMRI data from both sexes, showing that the amygdala tracks these consensus valence values. Finally, we provide evidence that these valence values are linked to specific visual features of the mouth region, isolating the signal by which the amygdala detects this valence information. SIGNIFICANCE STATEMENT There is an open question as to whether human amygdala function tracks the valence value of cues in the environment, as opposed to either a more general emotional arousal value or a more specific emotion category distinction. Here, we demonstrate the utility of surprised facial expressions because exemplars within this emotion category take on valence values spanning the dimension of bipolar valence (positive to negative) at a consistent level of emotional arousal. Functional neuroimaging data showed that amygdala responses tracked the valence of surprised facial expressions, unconfounded by arousal. Furthermore, a machine learning classifier identified

  18. Bagpipes and Artichokes: Surprise as a Stimulus to Learning in the Elementary Music Classroom

    Science.gov (United States)

    Jacobi, Bonnie Schaffhauser

    2016-01-01

    Incorporating surprise into music instruction can stimulate student attention, curiosity, and interest. Novelty focuses attention in the reticular activating system, increasing the potential for brain memory storage. Elementary ages are ideal for introducing novel instruments, pieces, composers, or styles of music. Young children have fewer…

  19. Model-Assisted Control of Flow Front in Resin Transfer Molding Based on Real-Time Estimation of Permeability/Porosity Ratio

    Directory of Open Access Journals (Sweden)

    Bai-Jian Wei

    2016-09-01

    Full Text Available Resin transfer molding (RTM is a popular manufacturing technique that produces fiber reinforced polymer (FRP composites. In this paper, a model-assisted flow front control system is developed based on real-time estimation of permeability/porosity ratio using the information acquired by a visualization system. In the proposed control system, a radial basis function (RBF network meta-model is utilized to predict the position of the future flow front by inputting the injection pressure, the current position of flow front, and the estimated ratio. By conducting optimization based on the meta-model, the value of injection pressure to be implemented at each step is obtained. Moreover, a cascade control structure is established to further improve the control performance. Experiments show that the developed system successfully enhances the performance of flow front control in RTM. Especially, the cascade structure makes the control system robust to model mismatch.
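A minimal sketch of the RBF-interpolation idea behind such a meta-model, assuming a Gaussian kernel and a 1-D toy response in place of the real inputs (injection pressure, current flow-front position, and the estimated permeability/porosity ratio):

```python
import numpy as np

# Gaussian-kernel RBF interpolation: solve Phi w = y for the weights, then
# predict new points from their kernel similarities to the training points.
def rbf_fit(x_train, y_train, gamma=200.0):
    phi = np.exp(-gamma * (x_train[:, None] - x_train[None, :]) ** 2)
    w = np.linalg.solve(phi, y_train)
    def predict(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        k = np.exp(-gamma * (x[:, None] - x_train[None, :]) ** 2)
        return k @ w
    return predict

x_train = np.linspace(0.0, 1.0, 15)
y_train = np.sin(2 * np.pi * x_train)        # stand-in flow-front response
predict = rbf_fit(x_train, y_train)

x_test = np.linspace(0.0, 1.0, 101)
max_err = np.max(np.abs(predict(x_test) - np.sin(2 * np.pi * x_test)))
```

The meta-model is cheap to evaluate, which is what makes the per-step optimization of injection pressure feasible in a control loop.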

  20. The Educational Philosophies of Mordecai Kaplan and Michael Rosenak: Surprising Similarities and Illuminating Differences

    Science.gov (United States)

    Schein, Jeffrey; Caplan, Eric

    2014-01-01

    The thoughts of Mordecai Kaplan and Michael Rosenak present surprising commonalities as well as illuminating differences. Similarities include the perception that Judaism and Jewish education are in crisis, the belief that Jewish peoplehood must include commitment to meaningful content, the need for teachers to teach from a position of…

  1. The Impact of a Surprise Dividend Increase on a Stock's Performance: the Analysis of Companies Listed on the Warsaw Stock Exchange

    Directory of Open Access Journals (Sweden)

    Tomasz Słoński

    2012-01-01

    Full Text Available The reaction of marginal investors to the announcement of a surprise dividend increase has been measured. Although the field research is performed on companies listed on the Warsaw Stock Exchange, the paper has important theoretical implications. Valuation theory gives many clues for the interpretation of changes in dividends. At the start of the literature review, the assumption of the irrelevance of dividends (to investment decisions) is described. This assumption is the basis for up-to-date valuation procedures leading to fundamental and fair market valuation of equity (shares). The paper is designed to verify whether the market value of stock is immune to the surprise announcement of a dividend increase. This study of the effect of a surprise dividend increase gives the chance to partially isolate such an event from dividend changes based on long-term expectations. The result of the research explicitly shows that a surprise dividend increase is on average welcomed by investors (an average abnormal return of 2.24% with an associated p-value of 0.001). Abnormal returns are realized by investors when there is a surprise increase in a dividend payout. The subsample of relatively high increases in dividend payout enables investors to gain a 3.2% return on average. The results show that valuation models should be revised to take into account the possible impact of dividend changes on investors' behavior. (original abstract)
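The event-study arithmetic behind an "abnormal return" can be sketched as the stock return minus the return predicted by a market model. The returns and the market-model parameters below are hypothetical, not the paper's sample:

```python
import statistics

# Abnormal return under a simple market model r_i = alpha + beta * r_market.
def abnormal_return(stock_return, market_return, alpha=0.0, beta=1.0):
    return stock_return - (alpha + beta * market_return)

# Announcement-day returns for a hypothetical sample of dividend raisers.
stock = [0.031, 0.025, 0.018, 0.040, 0.022]
market = [0.004, 0.001, -0.002, 0.006, 0.003]
ars = [abnormal_return(s, m) for s, m in zip(stock, market)]
mean_ar = statistics.mean(ars)
print(f"mean abnormal return: {mean_ar:.4f}")
```

A significance test on the mean abnormal return (the paper reports a p-value of 0.001) would follow this step.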

  2. Dealing with unexpected events on the flight deck : A conceptual model of startle and surprise

    NARCIS (Netherlands)

    Landman, H.M.; Groen, E.L.; Paassen, M.M. van; Bronkhorst, A.W.; Mulder, M.

    2017-01-01

    Objective: A conceptual model is proposed in order to explain pilot performance in surprising and startling situations. Background: Today’s debate around loss of control following in-flight events and the implementation of upset prevention and recovery training has highlighted the importance of…

  3. Statistical factors affecting the success of nuclear operations

    International Nuclear Information System (INIS)

    Sunder, S.; Stephenson, J.R.; Hochman, D.

    1999-01-01

    In this article, the authors present a statistical analysis to determine the operational, financial, technical, and managerial factors that most significantly affect the success of nuclear operations. The study analyzes data for over 70 nuclear plants and 40 operating companies over a period of five years in order to draw conclusions that they hope will be of interest to utility companies and public utility commissions as they seek ways to improve rates of success in nuclear operations. Some of these conclusions will not be surprising--for example, that older plants have heavier maintenance requirements--but others are less intuitive. For instance, the observation that operators of fewer plants have lower costs suggests that any experience-curve benefits associated with managing multiple nuclear facilities are overshadowed by the logistical problems of multiple facilities. After presenting a brief history of nuclear power in America, the authors outline the motivations of the study and the methodology of their analysis. They end the article with the results of the study and discuss some of the managerial implications of these findings.

  4. Semantic relation vs. surprise: the differential effects of related and unrelated co-verbal gestures on neural encoding and subsequent recognition.

    Science.gov (United States)

    Straube, Benjamin; Meyer, Lea; Green, Antonia; Kircher, Tilo

    2014-06-03

    Speech-associated gesturing leads to memory advantages for spoken sentences. However, unexpected or surprising events are also likely to be remembered. With this study we test the hypothesis that different neural mechanisms (semantic elaboration and surprise) lead to memory advantages for iconic and unrelated gestures. During fMRI-data acquisition participants were presented with video clips of an actor verbalising concrete sentences accompanied by iconic gestures (IG; e.g., circular gesture; sentence: "The man is sitting at the round table"), unrelated free gestures (FG; e.g., unrelated up down movements; same sentence) and no gestures (NG; same sentence). After scanning, recognition performance for the three conditions was tested. Videos were evaluated regarding semantic relation and surprise by a different group of participants. The semantic relationship between speech and gesture was rated higher for IG (IG>FG), whereas surprise was rated higher for FG (FG>IG). Activation of the hippocampus correlated with subsequent memory performance of both gesture conditions (IG+FG>NG). For the IG condition we found activation in the left temporal pole and middle cingulate cortex (MCC; IG>FG). In contrast, for the FG condition posterior thalamic structures (FG>IG) as well as anterior and posterior cingulate cortices were activated (FG>NG). Our behavioral and fMRI-data suggest different mechanisms for processing related and unrelated co-verbal gestures, both of them leading to enhanced memory performance. Whereas activation in MCC and left temporal pole for iconic co-verbal gestures may reflect semantic memory processes, memory enhancement for unrelated gestures relies on the surprise response, mediated by anterior/posterior cingulate cortex and thalamico-hippocampal structures. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Hepatobiliary fascioliasis in non-endemic zones: a surprise diagnosis.

    Science.gov (United States)

    Jha, Ashish Kumar; Goenka, Mahesh Kumar; Goenka, Usha; Chakrabarti, Amrita

    2013-03-01

    Fascioliasis is a zoonotic infection caused by Fasciola hepatica. Because of population migration and international food trade, human fascioliasis is becoming an increasingly recognised entity in non-endemic zones. In most parts of Asia, hepatobiliary fascioliasis is sporadic. Human hepatobiliary infection by this trematode has two distinct phases: an acute hepatic phase and a chronic biliary phase. Hepatobiliary infection is mostly associated with intense peripheral eosinophilia. In addition to the classically defined hepatic-phase and biliary-phase fascioliasis, some cases may present with an overlap of these two phases. Chronic liver abscess formation is a rare presentation. We describe a surprising case of hepatobiliary fascioliasis in a patient who presented to us with a liver abscess without intense peripheral eosinophilia, a rare presentation of human fascioliasis, especially in non-endemic zones. Copyright © 2013 Arab Journal of Gastroenterology. Published by Elsevier Ltd. All rights reserved.

  6. Surprising radiation detectors

    CERN Document Server

    Fleischer, Robert

    2003-01-01

    Radiation doses received by the human body can be measured indirectly and retrospectively by counting the tracks left by particles in ordinary objects like pairs of spectacles, glassware, compact disks... This method has been successfully applied to determine neutron radiation doses received 50 years ago at the Hiroshima site. Neutrons themselves do not leave tracks in bulk matter, but glass contains atoms of uranium that may fission when struck by a neutron, and the recoil of the fission fragments generates a detectable track. The most difficult part is finding suitable glass items and evaluating the radiation shielding they benefited from in their original location. The same method has been used to determine the radiation dose due to the accumulation of radon in houses. In that case, the tracks left by alpha particles from the radioactive decay of polonium-210 were counted on the superficial layer of the window panes. Other materials like polycarbonate plastics have been used to determine the radiation dose due to heavy io...

  7. Breeding ecology of the southern shrike, Lanius meridionalis, in an agrosystem of south–eastern Spain: the surprisingly excellent breeding success in a declining population

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Rueda, G.; Abril-Colon, I.; Lopez-Orta, A.; Alvarez-Benito, I.; Castillo-Gomez, C.; Comas, M.; Rivas, J.M.

    2016-07-01

    The southern shrike, Lanius meridionalis, is declining at the Spanish and European level. One cause of this decline could be low reproductive success due to low availability of prey in agricultural environments. To investigate this possibility we analysed the breeding ecology of a population of southern shrike in an agrosystem in Lomas de Padul (SE Spain). Our results suggest the population is declining in this area. However, contrary to expectations, the population showed the highest reproductive success (% nests in which at least one egg produces a fledgling) reported for this species to date (83.3%), with a productivity of 4.04 fledglings per nest. Reproductive success varied throughout the years, ranging from 75% in the worst year to 92.9% in the best year. Similarly, productivity ranged from 3.25 to 5.0 fledglings per nest depending on the year. Other aspects of reproductive biology, such as clutch size, brood size, and nestling diet, were similar to those reported in other studies. Based on these results, we hypothesise that the determinant of population decline acts on the juvenile fraction, drastically reducing the recruitment rate, or affecting the dispersion of adults and recruits. Nevertheless, the exact factor or factors are unknown. This study shows that a high reproductive success does not guarantee good health status of the population. (Author)

  8. Team play with a powerful and independent agent: operational experiences and automation surprises on the Airbus A-320

    Science.gov (United States)

    Sarter, N. B.; Woods, D. D.

    1997-01-01

    Research and operational experience have shown that one of the major problems with pilot-automation interaction is a lack of mode awareness (i.e., awareness of the current and future status and behavior of the automation). As a result, pilots sometimes experience so-called automation surprises when the automation takes an unexpected action or fails to behave as anticipated. A lack of mode awareness and automation surprises can be viewed as symptoms of a mismatch between human and machine properties and capabilities. Changes in automation design can therefore be expected to affect the likelihood and nature of problems encountered by pilots. Previous studies have focused exclusively on early generation "glass cockpit" aircraft that were designed based on a similar automation philosophy. To find out whether similar difficulties with maintaining mode awareness are encountered on more advanced aircraft, a corpus of automation surprises was gathered from pilots of the Airbus A-320, an aircraft characterized by high levels of autonomy, authority, and complexity. To understand the underlying reasons for reported breakdowns in human-automation coordination, we also asked pilots about their monitoring strategies and their experiences with and attitudes toward the unique design of flight controls on this aircraft.

  9. Surprisal analysis of Glioblastoma Multiform (GBM) microRNA dynamics unveils tumor specific phenotype.

    Science.gov (United States)

    Zadran, Sohila; Remacle, Francoise; Levine, Raphael

    2014-01-01

    Glioblastoma multiforme (GBM) is the most fatal form of all brain cancers in humans. Currently there are limited diagnostic tools for GBM detection. Here, we applied surprisal analysis, a theory grounded in thermodynamics, to unveil how biomolecule energetics, specifically a redistribution of free energy amongst microRNAs (miRNAs), results in a system deviating from a non-cancer state to the GBM cancer-specific phenotypic state. Utilizing global miRNA microarray expression data from normal and GBM patients' tumors, surprisal analysis characterizes a miRNA system response capable of distinguishing GBM samples from normal tissue biopsy samples. We indicate that the set of miRNAs contributing to this system behavior describes a disease phenotypic state specific to GBM and therefore constitutes a unique GBM-specific thermodynamic signature. MiRNAs implicated in the regulation of stochastic signaling processes crucial in the hallmarks of human cancer dominate this GBM-cancer phenotypic state. With this theory, we were able to distinguish GBM patients with high fidelity solely by monitoring the dynamics of miRNAs present in patients' biopsy samples. We anticipate that the GBM-specific thermodynamic signature will provide a critical translational tool in better characterizing cancer types and in the development of future therapeutics for GBM.

  10. Surprise, Memory, and Retrospective Judgment Making: Testing Cognitive Reconstruction Theories of the Hindsight Bias Effect

    Science.gov (United States)

    Ash, Ivan K.

    2009-01-01

    Hindsight bias has been shown to be a pervasive and potentially harmful decision-making bias. A review of 4 competing cognitive reconstruction theories of hindsight bias revealed conflicting predictions about the role and effect of expectation or surprise in retrospective judgment formation. Two experiments tested these predictions examining the…

  11. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    Science.gov (United States)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient of an aircraft are provided to demonstrate the approximation capability of the proposed approach, compared against three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
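    The infill idea behind this kind of adaptive sampling (add the next sample where the surrogate is most uncertain) can be illustrated with a deliberately simplified sketch: a one-dimensional Gaussian-process surrogate with an assumed squared-exponential kernel and a maximum-variance infill criterion, not the hierarchical kriging model or the aerodynamic test cases of the article. The test function, kernel length scale, and candidate grid are all illustrative choices:

    ```python
    import numpy as np

    # Generic variance-based sequential sampling with a GP surrogate.
    def kernel(a, b, length=0.3):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

    def gp_predict(x_train, y_train, x_new, nugget=1e-10):
        """Closed-form GP posterior mean and variance (unit prior variance)."""
        K = kernel(x_train, x_train) + nugget * np.eye(len(x_train))
        k = kernel(x_train, x_new)
        sol = np.linalg.solve(K, k)
        mean = sol.T @ y_train
        var = 1.0 - np.sum(k * sol, axis=0)
        return mean, np.maximum(var, 0.0)

    def f(x):                               # "high-fidelity" model (assumed)
        return np.sin(6 * x)

    x_train = np.array([0.0, 0.5, 1.0])     # initial design
    candidates = np.linspace(0, 1, 101)

    for _ in range(5):                      # sequential infill loop
        _, var = gp_predict(x_train, f(x_train), candidates)
        x_next = candidates[np.argmax(var)] # highest-uncertainty point
        x_train = np.append(x_train, x_next)

    _, var_final = gp_predict(x_train, f(x_train), candidates)
    print(var_final.max() < 0.1)  # -> True: uncertainty shrinks as points fill in
    ```

    Each iteration queries the expensive model only once, at the point the surrogate knows least about, which is the core of any kriging-based infill strategy.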

  12. Design and Analysis of simulation experiments : Tutorial

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2017-01-01

    This tutorial reviews the design and analysis of simulation experiments. These experiments may have various goals: validation, prediction, sensitivity analysis, optimization (possibly robust), and risk or uncertainty analysis. These goals may be realized through metamodels. Two types of metamodels…

  13. Physics Nobel prize 2004: Surprising theory wins physics Nobel

    CERN Multimedia

    2004-01-01

    From left to right: David Politzer, David Gross and Frank Wilczek. For their understanding of counter-intuitive aspects of the strong force, which governs quarks inside protons and neutrons, on 5 October three American physicists were awarded the 2004 Nobel Prize in Physics. David J. Gross (Kavli Institute of Theoretical Physics, University of California, Santa Barbara), H. David Politzer (California Institute of Technology), and Frank Wilczek (Massachusetts Institute of Technology) made a key theoretical discovery with a surprising result: the closer quarks are together, the weaker the force - opposite to what is seen with electromagnetism and gravity. Rather, the strong force is analogous to a rubber band stretching, where the force increases as the quarks get farther apart. These physicists discovered this property of quarks, known as asymptotic freedom, in 1976. It later became a key part of the theory of quantum chromodynamics (QCD) and the Standard Model, the current best theory to describe the interac...

  14. Surprising judgments about robot drivers: Experiments on rising expectations and blaming humans

    Directory of Open Access Journals (Sweden)

    Peter Danielson

    2015-05-01

    Full Text Available N-Reasons is an experimental Internet survey platform designed to enhance public participation in applied ethics and policy. N-Reasons encourages individuals to generate reasons to support their judgments, and groups to converge on a common set of reasons pro and con various issues. In the Robot Ethics Survey some of the reasons contributed surprising judgments about autonomous machines. Presented with a version of the trolley problem with an autonomous train as the agent, participants gave unexpected answers, revealing high expectations for the autonomous machine and shifting blame from the automated device to the humans in the scenario. Further experiments with a standard pair of human-only trolley problems refine these results. While showing the high expectations even when no autonomous machine is involved, human bystanders are only blamed in the machine case. A third experiment explicitly aimed at responsibility for driverless cars confirms our findings about shifting blame in the case of autonomous machine agents. We conclude methodologically that both results point to the power of an experimental survey based approach to public participation to explore surprising assumptions and judgments in applied ethics. However, both results also support using caution when interpreting survey results in ethics, demonstrating the importance of qualitative data to provide further context for evaluating judgments revealed by surveys. On the ethics side, the result about shifting blame to humans interacting with autonomous machines suggests caution about the unintended consequences of intuitive principles requiring human responsibility. http://dx.doi.org/10.5324/eip.v9i1.1727

  15. Negotiated Grammar Transformation

    NARCIS (Netherlands)

    V. Zaytsev (Vadim)

    2012-01-01

    In this paper, we study controlled adaptability of metamodel transformations. We consider one of the most rigid metamodel transformation formalisms — automated grammar transformation with operator suites, where a transformation script is built in such a way that it is essentially meant

  16. Cloud Surprises Discovered in Moving NASA EOSDIS Applications into Amazon Web Services… and #6 Will Shock You!

    Science.gov (United States)

    McLaughlin, B. D.; Pawloski, A. W.

    2017-12-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected was a number of issues that went beyond purely technical application re-architecture. Surprising network policy limitations, billing challenges in a government-based cost model, obtaining certificates in a NASA-security-compliant manner, and working with multiple applications in a shared and resource-constrained AWS account have been the relevant challenges in taking advantage of a cloud model. And most surprising of all… well, you'll just have to wait and see the "gotcha" that caught our entire team off guard!

  17. The Value of Change: Surprises and Insights in Stellar Evolution

    Science.gov (United States)

    Bildsten, Lars

    2018-01-01

    Astronomers with large-format cameras regularly scan the sky many times per night to detect what's changing, and telescopes in space such as Kepler and, soon, TESS obtain very accurate brightness measurements of nearly a million stars over time periods of years. These capabilities, in conjunction with theoretical and computational efforts, have yielded surprises and remarkable new insights into the internal properties of stars and how they end their lives. I will show how asteroseismology reveals the properties of the deep interiors of red giants, and highlight how astrophysical transients may be revealing unusual thermonuclear outcomes from exploding white dwarfs and the births of highly magnetic neutron stars. All the while, stellar science has been accelerated by the availability of open source tools, such as Modules for Experiments in Stellar Astrophysics (MESA), and the nearly immediate availability of observational results.

  18. On predicting quantal cross sections by interpolation: Surprisal analysis of j_zCCS and statistical j_z results

    International Nuclear Information System (INIS)

    Goldflam, R.; Kouri, D.J.

    1976-01-01

    New methods for predicting the full matrix of integral cross sections are developed by combining the surprisal analysis of Bernstein and Levine with the j_z-conserving coupled states method (j_zCCS) of McGuire, Kouri, and Pack and with the statistical j_z approximation (Sj_z) of Kouri, Shimoni, and Heil. A variety of approaches is possible and only three are studied in the present work. These are (a) a surprisal fit of the j=0→j' column of the j_zCCS cross section matrix (thereby requiring only a solution of the lambda=0 set of j_zCCS equations), (b) a surprisal fit of the lambda-bar=0 Sj_z cross section matrix (again requiring solution of the lambda=0 set of j_zCCS equations only), and (c) a surprisal fit of a lambda-bar ≠ 0 Sj_z submatrix (involving input cross sections for j, j' ≥ lambda-bar transitions only). The last approach requires the solution of the lambda=lambda-bar set of j_zCCS equations only, which requires less computational effort than the effective potential method. We explore three different choices for the prior, and two-parameter (i.e., linear) and three-parameter (i.e., parabolic) fits, as applied to Ar--N_2 collisions. The results are in general very encouraging and for one choice of prior give results which are within 20% of the exact j_zCCS results.
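    The two-parameter (linear) surprisal fit can be sketched generically: compute the surprisal I(j') = -ln(σ_j'/σ_prior) for the few transitions that were actually computed, fit it with a straight line, and invert the fit to predict the remaining cross sections. The synthetic cross sections, the degeneracy-weighted prior, and the choice of fit variable below are all illustrative assumptions, not the paper's coupled-states input:

    ```python
    import numpy as np

    # Synthetic cross sections that obey a linear surprisal exactly,
    # so the two-parameter fit reconstructs the full column.
    j = np.arange(0, 10)
    prior = 2 * j + 1                        # degeneracy-weighted prior (assumption)
    theta0, theta1 = 0.2, 0.35               # "true" surprisal parameters
    sigma = prior * np.exp(-(theta0 + theta1 * j))

    # Surprisal of the few "computed" transitions (every third j here)
    known = j[::3]
    I_known = -np.log(sigma[known] / prior[known])

    # Two-parameter (linear) fit, then predict the whole column
    coef = np.polyfit(known, I_known, 1)     # [slope, intercept]
    sigma_pred = prior * np.exp(-np.polyval(coef, j))
    print(np.allclose(sigma_pred, sigma))    # -> True (data are exactly linear)
    ```

    A three-parameter (parabolic) fit is the same sketch with `np.polyfit(known, I_known, 2)`; real cross sections would of course fit only approximately.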

  19. Changing Perspectives on Basic Research in Adult Learning and Memory

    Science.gov (United States)

    Hultsch, David F.

    1977-01-01

    It is argued that whether the course of cognitive development is characterized by growth, stability, or decline is less a matter of data than of the metamodel on which the theories and data are based. Such metamodels are representations of reality that are not empirically testable. (Author)

  20. BUSINESS SUCCESS IN TODAY'S ROMANIA: OPINIONS EXPRESSED BY STUDENTS AND ENTREPRENEURS

    Directory of Open Access Journals (Sweden)

    Elena NEDELCU

    2016-06-01

    Full Text Available We consider that a study - which contributes to further knowledge of the entrepreneurial spirit of Romanian students (to what extent and in what manner this spirit manifests itself, the students' and entrepreneurs' relation to the business environment, and the "nowadays" challenges of the workforce) - is both necessary and useful. Moreover, the present study aims at identifying possible differences between the way in which students and the way in which entrepreneurs assess certain elements that make up the Romanian business environment and that might contribute to their business success. Which are "the keys to success" in business - according to students? What about the entrepreneurs? What would be more useful for business success: the knowledge of success patterns, training and qualification, access to information, access to financial resources, competence (knowing what to do, or a friendly business environment? The research method that we have used is the social inquiry based on surveys. The survey was applied to 1,500 students from universities in Bucharest. The analysis of the data produced a surprise: "coping personal abilities" turned out to be "the keys to success" in business in Romania according to both students (67%) and entrepreneurs (86%). The most significant differences between the students' and entrepreneurs' answers concerned the "professional competence" and "rules observance" criteria. Compared with entrepreneurs, students value these criteria to a larger extent.

  1. SIMPLIFIED PREDICTIVE MODELS FOR CO₂ SEQUESTRATION PERFORMANCE ASSESSMENT RESEARCH TOPICAL REPORT ON TASK #3 STATISTICAL LEARNING BASED MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta; Schuetter, Jared

    2014-11-01

    We compare two approaches for building a statistical proxy model (metamodel) for CO₂ geologic sequestration from the results of full-physics compositional simulations. The first approach involves a classical Box-Behnken or Augmented Pairs experimental design with a quadratic polynomial response surface. The second approach uses a space-filling maximin Latin Hypercube sampling or maximum entropy design with a choice of five different meta-modeling techniques: quadratic polynomial, kriging with constant and quadratic trend terms, multivariate adaptive regression splines (MARS), and additivity and variance stabilization (AVAS). Simulation results for CO₂ injection into a reservoir-caprock system with 9 design variables (and 97 samples) were used to generate the data for developing the proxy models. The fitted models were validated using an independent data set and a cross-validation approach for three different performance metrics: total storage efficiency, CO₂ plume radius, and average reservoir pressure. The Box-Behnken–quadratic polynomial metamodel performed the best, followed closely by the maximin LHS–kriging metamodel.
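    The first workflow (a designed experiment, a quadratic polynomial response surface, and validation on an independent data set) can be sketched generically. The toy simulator, the random design standing in for a Box-Behnken or LHS design, and all sample sizes below are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "simulator": a quadratic response in two design variables (assumption).
    def simulator(x1, x2):
        return 1.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2 + 3.0 * x2 ** 2

    def quad_features(x1, x2):
        # Full quadratic basis: 1, x1, x2, x1*x2, x1^2, x2^2
        return np.column_stack(
            [np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2]
        )

    # "Design of experiments": random uniform points stand in for the
    # Box-Behnken / maximin LHS designs used in the report.
    x1, x2 = rng.uniform(-1, 1, (2, 30))
    coef, *_ = np.linalg.lstsq(
        quad_features(x1, x2), simulator(x1, x2), rcond=None
    )

    # Validate the fitted response surface on an independent test set.
    t1, t2 = rng.uniform(-1, 1, (2, 100))
    rmse = np.sqrt(
        np.mean((quad_features(t1, t2) @ coef - simulator(t1, t2)) ** 2)
    )
    print(rmse < 1e-8)  # -> True: the surface recovers the quadratic exactly
    ```

    With a real simulator the response is not exactly quadratic, so the independent-set RMSE (per metric, as in the report) is what discriminates between candidate metamodels.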

  2. Global sensitivity analysis using low-rank tensor approximations

    International Nuclear Information System (INIS)

    Konakli, Katerina; Sudret, Bruno

    2016-01-01

    In the context of global sensitivity analysis, the Sobol' indices constitute a powerful tool for assessing the relative significance of the uncertain input parameters of a model. We herein introduce a novel approach for evaluating these indices at low computational cost, by post-processing the coefficients of polynomial meta-models belonging to the class of low-rank tensor approximations. Meta-models of this class can be particularly efficient in representing responses of high-dimensional models, because the number of unknowns in their general functional form grows only linearly with the input dimension. The proposed approach is validated in example applications, where the Sobol' indices derived from the meta-model coefficients are compared to reference indices, the latter obtained by exact analytical solutions or Monte-Carlo simulation with extremely large samples. Moreover, low-rank tensor approximations are confronted to the popular polynomial chaos expansion meta-models in case studies that involve analytical rank-one functions and finite-element models pertinent to structural mechanics and heat conduction. In the examined applications, indices based on the novel approach tend to converge faster to the reference solution with increasing size of the experimental design used to build the meta-model. - Highlights: • A new method is proposed for global sensitivity analysis of high-dimensional models. • Low-rank tensor approximations (LRA) are used as a meta-modeling technique. • Analytical formulas for the Sobol' indices in terms of LRA coefficients are derived. • The accuracy and efficiency of the approach is illustrated in application examples. • LRA-based indices are compared to indices based on polynomial chaos expansions.
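    The idea of reading Sobol' indices directly off meta-model coefficients can be illustrated with the simplest possible case: a linear "meta-model" of an additive function of independent uniform inputs, rather than the low-rank tensor approximations or polynomial chaos expansions discussed in the abstract. The test function and sample size are assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Additive test model on independent U(-1, 1) inputs (illustrative choice):
    # f = 2*x1 + x2, so Var(f) = 4/3 + 1/3 and analytically S1 = 0.8, S2 = 0.2.
    X = rng.uniform(-1, 1, (200, 2))
    y = 2.0 * X[:, 0] + X[:, 1]

    # Fit the "meta-model" (here plain linear regression) and post-process its
    # coefficients into first-order indices: S_i = c_i^2 * Var(x_i) / Var(f).
    A = np.column_stack([np.ones(len(X)), X])
    c0, c1, c2 = np.linalg.lstsq(A, y, rcond=None)[0]
    var_xi = 1.0 / 3.0                      # variance of U(-1, 1)
    partial = np.array([c1 ** 2, c2 ** 2]) * var_xi
    S = partial / partial.sum()
    print(np.round(S, 3))  # -> [0.8 0.2]
    ```

    For orthogonal polynomial meta-models the same post-processing generalizes: each index is a sum of squared coefficients over the basis terms involving that input, divided by the total variance, with no extra model evaluations.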

  3. Staggering successes amid controversy in California water management

    Science.gov (United States)

    Lund, J. R.

    2012-12-01

    Water in California has always been important and controversial, and it probably always will be. California has a large, growing economy and population in a semi-arid climate. But California's aridity, hydrologic variability, and water controversies have not precluded considerable economic successes. The successes of California's water system have stemmed from the decentralization of water management with historically punctuated periods of more centralized strategic decision-making. Decentralized management has allowed California's water users to efficiently explore incremental solutions to water problems, ranging from early local development of water systems (such as Hetch Hetchy, Owens Valley, and numerous local irrigation projects) to more contemporary efforts at water conservation, water markets, wastewater reuse, and conjunctive use of surface and groundwater. In the cacophony of local and stakeholder interests, strategic decisions have been more difficult, and consequently occur less frequently. California state water projects and Sacramento Valley flood control are examples where decades of effort, crises, floods and droughts were needed to mobilize local interests to agree to major strategic decisions. Currently, the state is faced with making strategic environmental and water management decisions regarding its deteriorating Sacramento-San Joaquin Delta. Not surprisingly, human uncertainties and physical and fiscal non-stationarities dominate this process.

  4. Prediction, Expectation, and Surprise: Methods, Designs, and Study of a Deployed Traffic Forecasting Service

    OpenAIRE

    Horvitz, Eric J.; Apacible, Johnson; Sarin, Raman; Liao, Lin

    2012-01-01

    We present research on developing models that forecast traffic flow and congestion in the Greater Seattle area. The research has led to the deployment of a service named JamBayes, that is being actively used by over 2,500 users via smartphones and desktop versions of the system. We review the modeling effort and describe experiments probing the predictive accuracy of the models. Finally, we present research on building models that can identify current and future surprises, via efforts on mode...

  5. An evolutionary concept of polycystic ovarian disease: does evolution favour reproductive success over survival?

    Science.gov (United States)

    Gleicher, Norbert; Barad, David

    2006-05-01

    Polycystic ovarian disease (PCOD) is currently considered possibly the most frequent cause of female infertility. It is also closely associated with syndrome XX, which, in turn, is closely linked with premature and excessive mortality. Considering these adverse effects on reproductive success and human survival, the evolutionary survival of PCOD, itself considered by many to be a genetically transmitted condition, would, at first glance, appear surprising, since evolution usually discriminates against both of these traits. However, an analysis of some recently reported characteristics of the condition calls for the reconsideration of PCOD as a condition which, from an evolutionary viewpoint, favours decreased reproductive success. Indeed, the reported observations that patients with PCOD will resume spontaneous ovulation with even relatively minor weight loss, and experience later menopause than controls, suggest exactly the opposite. Under an evolutionary concept, PCOD can thus be seen as a 'fertility storage condition' which in fact favours human reproductive success and allows the human species to maintain fertility even during adverse environmental circumstances, such as famines.

  6. Drought, pollen and nectar availability, and pollination success.

    Science.gov (United States)

    Waser, Nickolas M; Price, Mary V

    2016-06-01

    Pollination success of animal-pollinated flowers depends on rate of pollinator visits and on pollen deposition per visit, both of which should vary with the pollen and nectar "neighborhoods" of a plant, i.e., with pollen and nectar availability in nearby plants. One determinant of these neighborhoods is per-flower production of pollen and nectar, which is likely to respond to environmental influences. In this study, we explored environmental effects on pollen and nectar production and on pollination success in order to follow up a surprising result from a previous study: flowers of Ipomopsis aggregata received less pollen in years of high visitation by their hummingbird pollinators. A new analysis of the earlier data indicated that high bird visitation corresponded to drought years. We hypothesized that drought might contribute to the enigmatic prior result if it decreases both nectar and pollen production: in dry years, low nectar availability could cause hummingbirds to visit flowers at a higher rate, and low pollen availability could cause them to deposit less pollen per visit. A greenhouse experiment demonstrated that drought does reduce both pollen and nectar production by I. aggregata flowers. This result was corroborated across 6 yr of variable precipitation and soil moisture in four unmanipulated field populations. In addition, experimental removal of pollen from flowers reduced the pollen received by nearby flowers. We conclude that there is much to learn about how abiotic and biotic environmental drivers jointly affect pollen and nectar production and availability, and how this contributes to pollen and nectar neighborhoods and thus influences pollination success.

  7. Vascular legacy: HOPE ADVANCEs to EMPA-REG and LEADER: A Surprising similarity

    Directory of Open Access Journals (Sweden)

    Sanjay Kalra

    2017-01-01

    Recently reported cardiovascular outcome studies on empagliflozin (EMPA-REG) and liraglutide (LEADER) have spurred interest in this field of diabetology. This commentary compares and contrasts these studies with two equally important outcome trials conducted with blood-pressure-lowering agents. A comparison with the MICROHOPE (ramipril) and ADVANCE (perindopril + indapamide) blood pressure arms throws up interesting facts. The degree of blood pressure lowering, the dissociation between cardiovascular and cerebrovascular benefits, and the discordance between renal and retinal outcomes are surprisingly similar across these trials, conducted with disparate molecules. The time taken to achieve such benefits is similar for all drugs except empagliflozin. Such discussion helps inform a rational, evidence-based choice of therapy and forms a framework for future research.

  8. Physical attractiveness and reproductive success in humans: Evidence from the late 20th century United States.

    Science.gov (United States)

    Jokela, Markus

    2009-09-01

    Physical attractiveness has been associated with mating behavior, but its role in reproductive success of contemporary humans has received surprisingly little attention. In the Wisconsin Longitudinal Study (1244 women, 997 men born between 1937 and 1940) we examined whether attractiveness assessed from photographs taken at age ~18 predicted the number of biological children at age 53-56. In women, attractiveness predicted higher reproductive success in a nonlinear fashion, so that attractive (second highest quartile) women had 16% and very attractive (highest quartile) women 6% more children than their less attractive counterparts. In men, there was a threshold effect so that men in the lowest attractiveness quartile had 13% fewer children than others who did not differ from each other in the average number of children. These associations were partly but not completely accounted for by attractive participants' increased marriage probability. A linear regression analysis indicated relatively weak directional selection gradient for attractiveness (β=0.06 in women, β=0.07 in men). These findings indicate that physical attractiveness may be associated with reproductive success in humans living in industrialized settings.
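    The directional selection gradients reported above (β=0.06 in women, β=0.07 in men) follow the standard Lande-Arnold approach: the slope of relative fitness (number of children divided by the cohort mean) regressed on the standardized trait. A minimal sketch with made-up numbers, since the Wisconsin data are not reproduced here:

```python
def selection_gradient(trait, offspring):
    """Slope of relative fitness on the standardized trait (Lande-Arnold beta)."""
    n = len(trait)
    mean_t = sum(trait) / n
    sd_t = (sum((t - mean_t) ** 2 for t in trait) / n) ** 0.5
    mean_w = sum(offspring) / n
    z = [(t - mean_t) / sd_t for t in trait]   # trait in SD units
    w = [o / mean_w for o in offspring]        # relative fitness
    # With var(z) = 1 by construction, the regression slope reduces to cov(w, z)
    return sum(zi * (wi - 1.0) for zi, wi in zip(z, w)) / n

# Hypothetical attractiveness ratings and numbers of children:
beta = selection_gradient([1, 2, 3, 4, 5, 6], [1, 2, 2, 3, 2, 3])
```

A positive beta indicates directional selection favouring higher trait values, in the same units as the study's reported gradients.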

  9. DNA Barcoding the Geometrid Fauna of Bavaria (Lepidoptera): Successes, Surprises, and Questions

    Science.gov (United States)

    Hausmann, Axel; Haszprunar, Gerhard; Hebert, Paul D. N.

    2011-01-01

    Background The State of Bavaria is involved in a research program that will lead to the construction of a DNA barcode library for all animal species within its territorial boundaries. The present study provides a comprehensive DNA barcode library for the Geometridae, one of the most diverse of insect families. Methodology/Principal Findings This study reports DNA barcodes for 400 Bavarian geometrid species, 98 per cent of the known fauna, and approximately one per cent of all Bavarian animal species. Although 98.5% of these species possess diagnostic barcode sequences in Bavaria, records from neighbouring countries suggest that species-level resolution may be compromised in up to 3.5% of cases. All taxa which apparently share barcodes are discussed in detail. One case of modest divergence (1.4%) revealed a species overlooked by the current taxonomic system: Eupithecia goossensiata Mabille, 1869 stat.n. is raised from synonymy with Eupithecia absinthiata (Clerck, 1759) to species rank. Deep intraspecific sequence divergences (>2%) were detected in 20 traditionally recognized species. Conclusions/Significance The study emphasizes the effectiveness of DNA barcoding as a tool for monitoring biodiversity. Open access is provided to a data set that includes records for 1,395 geometrid specimens (331 species) from Bavaria, with 69 additional species from neighbouring regions. Taxa with deep intraspecific sequence divergences are undergoing more detailed analysis to ascertain if they represent cases of cryptic diversity. PMID:21423340
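    The divergence thresholds above (1.4% between species, >2% within species) rest on simple pairwise sequence comparison. As an illustration only (not the study's pipeline, which uses aligned COI barcodes in BOLD), an uncorrected p-distance between two aligned fragments can be computed as:

```python
def p_distance(a, b):
    """Uncorrected pairwise divergence between two aligned sequences,
    ignoring positions where either sequence has a gap."""
    pairs = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
    return sum(x != y for x, y in pairs) / len(pairs)

# Short made-up aligned fragments (29 bp, 2 differences):
seq1 = "ACTATTGGAACATTATATTTTATTTTTGG"
seq2 = "ACTATTGGAACTTTATATTTTATTTTCGG"
div = p_distance(seq1, seq2)
```

Here `div` is about 0.069, i.e. 6.9% divergence, well above a 2% flagging threshold.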

  10. A kinematic comparison of successful and unsuccessful tennis serves across the elite development pathway.

    Science.gov (United States)

    Whiteside, David; Elliott, Bruce; Lay, Brendan; Reid, Machar

    2013-08-01

    While velocity generation is an obvious prerequisite to proficient tennis serve performance, it is also the only stroke where players are obliged to negotiate a unique target constraint. Therefore, the dearth of research attending to the accuracy component of the serve is surprising. This study compared the body, racquet and ball kinematics characterising successful serves and service faults, missed into the net, in two groups of elite junior female players and one professional female tennis player. Three-dimensional body, racquet and ball kinematics were recorded using a 22-camera VICON motion analysis system. There were no differences in body kinematics between successful serves and service faults, suggesting that service faults cannot be attributed to a single source of biomechanical error. However, service faults missing into the net are characterized by projection angles significantly further below the horizontal, implying that consistency in this end-point parameter is critical to successful performance. Regulation of this parameter appears dependent on compensatory adjustments in the distal elbow and wrist joints immediately prior to impact and also perceptual feedback. Accordingly, coordination of the distal degrees of freedom and a refined perception-action coupling appear more important to success than any isolated mechanical component of the service action. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. The influence of psychological resilience on the relation between automatic stimulus evaluation and attentional breadth for surprised faces.

    Science.gov (United States)

    Grol, Maud; De Raedt, Rudi

    2015-01-01

    The broaden-and-build theory relates positive emotions to resilience and cognitive broadening. The theory proposes that these broadening effects underlie the relation between positive emotions and resilience, suggesting that resilient people can benefit more from positive emotions at the level of cognitive functioning. Research has investigated the influence of positive emotions on attentional broadening, but the stimulus that is the target of attention may also influence attentional breadth, depending on its affective evaluation. Surprised faces are particularly interesting because they are valence-ambiguous; we therefore investigated the relation between affective evaluation (measured with an affective priming task) and attentional breadth for surprised faces, and how this relation is influenced by resilience. Results show that more positive evaluations are related to more attentional broadening at high levels of resilience, while this relation is reversed at low levels. This indicates that resilient individuals can benefit more, at the level of attentional broadening, from attending to positively evaluated stimuli.

  12. Atom Surprise: Using Theatre in Primary Science Education

    Science.gov (United States)

    Peleg, Ran; Baram-Tsabari, Ayelet

    2011-10-01

    Early exposure to science may have a lifelong effect on children's attitudes towards science and their motivation to learn science in later life. Out-of-class environments can play a significant role in creating favourable attitudes while contributing to conceptual learning. Educational science theatre is one such out-of-class environment, and it has received little research attention. This study aims to describe the affective and cognitive learning outcomes of watching such a play and to point to connections between theatrical elements and specific outcomes. "Atom Surprise" is a play portraying several concepts on the topic of matter. A mixed-methods approach was adopted to investigate the knowledge and attitudes of children (grades 1-6) from two different school settings who watched the play. Data were gathered using questionnaires and in-depth interviews. Analysis suggested that in both schools children's knowledge on the topic of matter increased after the play, with younger children gaining more conceptual knowledge than their older peers. In the public school, girls showed greater gains in conceptual knowledge than boys. No significant changes in students' general attitudes towards science were found; however, students demonstrated positive changes towards science learning. Theatrical elements that seemed important in children's recollection of the play were the narrative, props and stage effects, and characters. In the children's memory, science was intertwined with the theatrical elements. Nonetheless, children could distinguish well between scientific facts and the fictive narrative.

  13. Farmers Insures Success

    Science.gov (United States)

    Freifeld, Lorri

    2012-01-01

    Farmers Insurance claims the No. 2 spot on the Training Top 125 with a forward-thinking training strategy linked to its primary mission: FarmersFuture 2020. It's not surprising an insurance company would have an insurance policy for the future. But Farmers takes that strategy one step further, setting its sights on 2020 with a far-reaching plan to…

  14. 'Surprise': Outbreak of Campylobacter infection associated with chicken liver pâté at a surprise birthday party, Adelaide, Australia, 2012.

    Science.gov (United States)

    Parry, Amy; Fearnley, Emily; Denehy, Emma

    2012-10-01

    In July 2012, an outbreak of Campylobacter infection was investigated by the South Australian Communicable Disease Control Branch and Food Policy and Programs Branch. The initial notification identified illness at a surprise birthday party held at a restaurant on 14 July 2012. The objective of the investigation was to identify the potential source of infection and institute appropriate intervention strategies to prevent further illness. A guest list was obtained and a retrospective cohort study undertaken. A combination of paper-based and telephone questionnaires were used to collect exposure and outcome information. An environmental investigation was conducted by Food Policy and Programs Branch at the implicated premises. All 57 guests completed the questionnaire (100% response rate), and 15 met the case definition. Analysis showed a significant association between illness and consumption of chicken liver pâté (relative risk: 16.7, 95% confidence interval: 2.4-118.6). No other food or beverage served at the party was associated with illness. Three guests submitted stool samples; all were positive for Campylobacter. The environmental investigation identified that the cooking process used in the preparation of chicken liver pâté may have been inconsistent, resulting in some portions not cooked adequately to inactivate potential Campylobacter contamination. Chicken liver products are a known source of Campylobacter infection; therefore, education of food handlers remains a high priority. To better identify outbreaks among the large number of Campylobacter notifications, routine typing of Campylobacter isolates is recommended.
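    For context, the relative risk and its 95% confidence interval in a retrospective cohort study of this kind come from a 2×2 exposure/illness table. A minimal sketch using the Katz log-interval; the counts below are hypothetical, since the report gives only the resulting RR (16.7) and CI (2.4-118.6):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """2x2 cohort table: a = ill & exposed, b = well & exposed,
    c = ill & unexposed, d = well & unexposed.
    Returns (RR, lower, upper) using the Katz log-interval."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts for a 57-guest cohort with 15 cases:
rr, lo, hi = relative_risk(a=14, b=14, c=1, d=28)
```

A wide interval such as the reported 2.4-118.6 is typical when the unexposed group contains very few cases.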

  15. Clonal structure and variable fertilization success in Florida Keys broadcast-spawning corals

    Science.gov (United States)

    Miller, M. W.; Baums, I. B.; Pausch, R. E.; Bright, A. J.; Cameron, C. M.; Williams, D. E.; Moffitt, Z. J.; Woodley, C. M.

    2018-03-01

    Keystone reef-building corals in the Caribbean are predominantly self-incompatible broadcast spawners, and a majority are threatened due to both acute adult mortality and poor recruitment. As population densities decline, concerns about fertilization limitation and effective population size in these species increase and would be further exacerbated by either high clonality or gametic incompatibility of parental genotypes. This study begins to address these concerns for two Caribbean broadcasting species by characterizing clonal structure and quantifying experimental pairwise fertilization success. Orbicella faveolata showed surprisingly high and contrasting levels of clonality between two sampled sites; Acropora palmata was previously known to be highly clonal. Individual pairwise crosses of synchronously spawning genotypes of each species were conducted by combining aliquots of gamete bundles immediately after spawning, and showed high and significant variability in fertilization success. Over half of the individual crosses of O. faveolata and about one-third of A. palmata crosses yielded ≤ 40% fertilization. Total sperm concentration was quantified in only a subset of O. faveolata crosses (range 1-6 × 10⁷ mL⁻¹), but showed no correlation with fertilization success. We interpret that both parental incompatibility and individual genotypes with low-quality gametes are likely to have contributed to the variable fertilization observed, with important implications for conservation. Differential fertilization success implies that effective population size may be considerably smaller than hoped, and that population enhancement efforts need to incorporate many more parental genotypes at the patch scale to ensure successful larval production than estimates based simply on preserving levels of standing genetic diversity would indicate.

  16. Self-organizing weights for Internet AS-graphs and surprisingly simple routing metrics

    DEFF Research Database (Denmark)

    Scholz, Jan Carsten; Greiner, Martin

    2011-01-01

    The transport capacity of Internet-like communication networks and hence their efficiency may be improved by a factor of 5-10 through the use of highly optimized routing metrics, as demonstrated previously. The numerical determination of such routing metrics can be computationally demanding to an extent that prohibits both investigation of and application to very large networks. In an attempt to find a numerically less expensive way of constructing a metric with a comparable performance increase, we propose a local, self-organizing iteration scheme and find two surprisingly simple and efficient metrics. The new metrics have negligible computational cost and result in an approximately 5-fold performance increase, providing distinguished competitiveness with the computationally costly counterparts. They are applicable to very large networks and easy to implement in today's Internet routing…
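    Routing metrics of this kind act as edge-weight functions inside a shortest-path computation. A minimal sketch of that machinery, with a hypothetical degree-product weight standing in for the paper's self-organized metrics (which are not reproduced here); weighting edges by the product of endpoint degrees is one simple way to steer traffic away from hubs:

```python
import heapq
from collections import defaultdict

def dijkstra(adj, weight, src):
    """Shortest-path distances from src under an arbitrary edge-weight function."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v in adj[u]:
            nd = d + weight(u, v)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Toy undirected graph
edges = [(0, 1), (1, 2), (1, 3), (2, 3), (3, 4)]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

deg = {n: len(adj[n]) for n in adj}
w = lambda u, v: deg[u] * deg[v]   # hypothetical "simple metric"
dist = dijkstra(adj, w, 0)
```

Swapping in a different `weight` function is all that is needed to evaluate another candidate metric on the same topology.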

  17. RT 24 - Architecture, Modeling & Simulation, and Software Design

    Science.gov (United States)

    2010-11-01

    Focus on tool extensions (UPDM, SysML, SoaML, BPMN); leverage "best of breed" architecture methodologies; provide tooling to support the methodology. [Slide excerpts: a DoDAF capability example in BPMN; a mapping of the BPMN metamodel to the DoDAF 2.0 metamodel; a mapping of SysML to DoDAF 2.0, e.g. SysML requirement diagrams to DoDAF V2.0 OV-2 models.]

  18. From Lithium-Ion to Sodium-Ion Batteries: Advantages, Challenges, and Surprises.

    Science.gov (United States)

    Nayak, Prasant Kumar; Yang, Liangtao; Brehm, Wolfgang; Adelhelm, Philipp

    2018-01-02

    Mobile and stationary energy storage by rechargeable batteries is a topic of broad societal and economical relevance. Lithium-ion battery (LIB) technology is at the forefront of the development, but a massively growing market will likely put severe pressure on resources and supply chains. Recently, sodium-ion batteries (SIBs) have been reconsidered with the aim of providing a lower-cost alternative that is less susceptible to resource and supply risks. On paper, the replacement of lithium by sodium in a battery seems straightforward at first, but unpredictable surprises are often found in practice. What happens when replacing lithium by sodium in electrode reactions? This review provides a state-of-the art overview on the redox behavior of materials when used as electrodes in lithium-ion and sodium-ion batteries, respectively. Advantages and challenges related to the use of sodium instead of lithium are discussed. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Collaborative Resilience to Episodic Shocks and Surprises: A Very Long-Term Case Study of Zanjera Irrigation in the Philippines 1979–2010

    Directory of Open Access Journals (Sweden)

    Ruth Yabes

    2015-07-01

    This thirty-year case study uses surveys, semi-structured interviews, and content analysis to examine the adaptive capacity of Zanjera San Marcelino, an indigenous irrigation management system in the northern Philippines. This common pool resource (CPR) system exists within a turbulent social-ecological system (SES) characterized by episodic shocks, such as large typhoons, as well as novel surprises, such as national political regime change and the construction of large dams. The Zanjera nimbly responded to these challenges, although sometimes in ways that left its structure and function substantially altered. While a partial integration with the Philippine National Irrigation Agency was critical to the Zanjera's success, this relationship required ongoing improvisation and renegotiation. Over time, the Zanjera showed an increasing capacity to learn and adapt. A core contribution of this analysis is the integration of a CPR study within an SES framework to examine resilience, made possible by the occurrence of a wide range of challenges to the Zanjera's function and survival over the long period of study. Long-term analyses like this one, however rare, are particularly useful for understanding the adaptive and transformative dimensions of resilience.

  20. The successful conclusion of the Deep Space 1 Mission: important results without a flashy title

    Science.gov (United States)

    Rayman, M. D.

    2002-01-01

    In September 2001, Deep Space 1 (DS1) completed a high-risk and flawless encounter with comet 19P/Borrelly. Its data provide a detailed view of this comet and offer surprising and exciting insights. With this successful conclusion of its extended mission, DS1 undertook a hyperextended mission. Following this period of extremely aggressive testing, with no further technology or science objectives, the mission was terminated on December 18, 2001, with the powering off of the spacecraft's transmitter, although the receiver was left on. By the end of its mission, DS1 had returned a wealth of important science data and engineering data for future missions.

  1. How to reach clients of female sex workers: a survey by surprise in brothels in Dakar, Senegal.

    Science.gov (United States)

    Espirito Santo, M. E. Gomes do; Etheredge, G. D.

    2002-01-01

    OBJECTIVE: To describe the sampling techniques and survey procedures used in identifying male clients who frequent brothels to buy sexual services from female sex workers in Dakar, Senegal, with the aim of measuring the prevalence of human immunodeficiency virus (HIV) infection and investigating related risk behaviours. METHODS: Surveys were conducted in seven brothels in Dakar, Senegal. Clients were identified "by surprise" and interviewed and requested to donate saliva for HIV testing. RESULTS: Of the 1450 clients of prostitutes who were solicited to enter the study, 1140 (79.8%) agreed to be interviewed; 1083 (95%) of these clients provided saliva samples for testing. Of the samples tested, 47 were positive for HIV-1 or HIV-2, giving an HIV prevalence of 4.4%. CONCLUSION: The procedures adopted were successful in reaching the target population. Men present in the brothels could not deny being there, and it proved possible to explain the purpose of the study and to gain their confidence. Collection of saliva samples was shown to be an excellent method for performing HIV testing in difficult field conditions where it is hard to gain access to the population under study. The surveying of prostitution sites is recommended as a means of identifying core groups for HIV infection with a view to targeting education programmes more effectively. In countries such as Senegal, where the prevalence of HIV infection is still low, interventions among commercial sex workers and their clients may substantially delay the onset of a larger epidemic in the general population. PMID:12378288

  2. Would you be surprised if this patient died?: Preliminary exploration of first and second year residents' approach to care decisions in critically ill patients

    Directory of Open Access Journals (Sweden)

    Armstrong John D

    2003-01-01

    Background: How physicians approach decision-making when caring for critically ill patients is poorly understood. This study aims to explore how residents think about prognosis and approach care decisions when caring for seriously ill, hospitalized patients. Methods: Qualitative study in which we conducted structured discussions with first- and second-year internal medicine residents (n = 8) caring for critically ill patients during Medical Intensive Care Unit Ethics and Discharge Planning Rounds. Residents were asked to respond to questions beginning with "Would you be surprised if this patient died?" Results: An equal number of residents responded that they would (n = 4) or would not (n = 4) be surprised if their patient died. Reasons for being surprised included the rapid onset of an acute illness, reversible disease, improving clinical course and the patient's prior survival under similar circumstances. Residents reported no surprise with a worsening clinical course. Based on the realization that their patient might die, residents cited potential changes in management that included clarifying treatment goals, improving communication with families, spending more time with patients and ordering fewer laboratory tests. Perceived or implied barriers to changes in management included limited time, competing clinical priorities, "not knowing" a patient, limited knowledge and experience, presence of diagnostic or prognostic uncertainty and unclear treatment goals. Conclusions: These junior-level residents appear to rely on clinical course, among other factors, when assessing prognosis and the possibility of death in severely ill patients. Further investigation is needed to understand how these factors impact decision-making and whether perceived barriers to changes in patient management influence approaches to care.

  3. DEVSML 2.0: The Language and the Stack

    Science.gov (United States)

    2012-03-01

    …problems outside it. For example, HTML for web pages and Verilog and VHDL for hardware description are DSLs for very specific domains. A DSL can be… the Model-Driven Engineering (MDE) paradigm, where metamodeling allows such transformations. The metamodeling approach to Model Integrated Computing (MIC) brings… [Reference excerpts: University of Arizona, 2007; [5] Mittal, S., Martin, J. L. R., Zeigler, B. P., "DEVS-Based Web Services for Net-centric T&E", Summer Computer Simulation…]

  4. Surprising transformation of a block copolymer into a high performance polystyrene ultrafiltration membrane with a hierarchically organized pore structure

    KAUST Repository

    Shevate, Rahul

    2018-02-08

    We describe the preparation of hierarchical polystyrene nanoporous membranes with a very narrow pore size distribution and an extremely high porosity. The nanoporous structure is formed as a result of unusual degradation of the poly(4-vinyl pyridine) block from self-assembled poly(styrene)-b-poly(4-vinyl pyridine) (PS-b-P4VP) membranes through the formation of an unstable pyridinium intermediate in an alkaline medium. During this process, the confined swelling and controlled degradation produced a tunable pore size. We unequivocally confirmed the successful elimination of the P4VP block from a PS-b-P4VP membrane using 1D/2D NMR spectroscopy and other characterization techniques. Surprisingly, the long-range-ordered surface porosity was preserved even after degradation of the P4VP block from the main chain of the diblock copolymer, as revealed by SEM. Aside from a drastically improved water flux (∼67% increase) compared to the PS-b-P4VP membrane, the hydraulic permeability measurements validated pH-independent behaviour of the isoporous PS membrane over a wide pH range from 3 to 10. The effect of the pore size on protein transport rate and selectivity (α) was investigated for lysozyme (Lys), bovine serum albumin (BSA) and globulin-γ (IgG). A high selectivity of 42 (Lys/IgG) and 30 (BSA/IgG) was attained, making the membranes attractive for size-selective separation of biomolecules from their synthetic model mixture solutions.

  5. A Shocking Surprise in Stephan's Quintet

    Science.gov (United States)

    2006-01-01

    This false-color composite image of the Stephan's Quintet galaxy cluster clearly shows one of the largest shock waves ever seen (green arc). The wave was produced by one galaxy falling toward another at speeds of more than one million miles per hour. The image is made up of data from NASA's Spitzer Space Telescope and a ground-based telescope in Spain. Four of the five galaxies in this picture are involved in a violent collision, which has already stripped most of the hydrogen gas from the interiors of the galaxies. The centers of the galaxies appear as bright yellow-pink knots inside a blue haze of stars, and the galaxy producing all the turmoil, NGC7318b, is the left of two small bright regions in the middle right of the image. One galaxy, the large spiral at the bottom left of the image, is a foreground object and is not associated with the cluster. The titanic shock wave, larger than our own Milky Way galaxy, was detected by the ground-based telescope using visible-light wavelengths. It consists of hot hydrogen gas. As NGC7318b collides with gas spread throughout the cluster, atoms of hydrogen are heated in the shock wave, producing the green glow. Spitzer pointed its infrared spectrograph at the peak of this shock wave (middle of green glow) to learn more about its inner workings. This instrument breaks light apart into its basic components. Data from the instrument are referred to as spectra and are displayed as curving lines that indicate the amount of light coming at each specific wavelength. The Spitzer spectrum showed a strong infrared signature for incredibly turbulent gas made up of hydrogen molecules. This gas is caused when atoms of hydrogen rapidly pair-up to form molecules in the wake of the shock wave. Molecular hydrogen, unlike atomic hydrogen, gives off most of its energy through vibrations that emit in the infrared. This highly disturbed gas is the most turbulent molecular hydrogen ever seen. 
Astronomers were surprised not only by the turbulence

  6. Should students design or interact with models? Using the Bifocal Modelling Framework to investigate model construction in high school science

    Science.gov (United States)

    Fuhrmann, Tamar; Schneider, Bertrand; Blikstein, Paulo

    2018-05-01

    The Bifocal Modelling Framework (BMF) is an approach for science learning which links students' physical experimentation with computer modelling in real time, focusing on the comparison of the two media. In this paper, we explore how a Bifocal Modelling implementation supported learning outcomes related to both content and metamodeling knowledge, focusing on the role of designing models. Our study consisted of three conditions implemented with a total of 69 9th grade high-school students. The first and second classes were assigned two implementation modes of BMF: with and without a model design module. The third condition, employed as a control, consisted of a class that received instruction in the school's traditional approach. Our results indicate that students participating in both BMF implementations demonstrated improved content knowledge and a better understanding of metamodeling. However, only the 'BMF-with-design' group improved significantly in both content and metamodeling knowledge. Our qualitative analyses indicate that both BMF groups designed detailed models that included scientific explanations. However, only students who engaged in the model design component: (1) completed a detailed model displaying molecular interaction; and (2) developed a critical perspective about models. We discuss the implications of those results for teaching science concepts and metamodeling knowledge.

  7. Is Consciousness Reality or Illusion ? A Non-Dualist Interpretation of Consciousness

    Science.gov (United States)

    Schwarz, Eric

    2004-08-01

    This paper proposes a way to approach the "hard problem" of consciousness. First, we present a typology of the main models developed in the literature to understand consciousness. Most of them adopt a physicalist ontology and a functionalist epistemology. We then present the main features of a metamodel we have elaborated to interpret nonlinear systems evolving toward complexity and autonomy. This systemic metamodel is a general framework that can later be used to make models of specific systems. As an extension of the mechanist paradigm, it is based on three primordial categories: objects, relations, and wholes or systems. In the last part, we apply it to the cases of the logic of life and the nature of consciousness. Both can be interpreted by the metamodel, in particular, by the autopoiesis proposed by Maturana and Varela for life and self-reference for consciousness.

  8. The influence of the surprising decay properties of element 108 on search experiments for new elements

    International Nuclear Information System (INIS)

    Hofmann, S.; Armbruster, P.; Muenzenberg, G.; Reisdorf, W.; Schmidt, K.H.; Burkhard, H.G.; Hessberger, F.P.; Schoett, H.J.; Agarwal, Y.K.; Berthes, G.; Gollerthan, U.; Folger, H.; Hingmann, J.G.; Keller, J.G.; Leino, M.E.; Lemmertz, P.; Montoya, M.; Poppensieker, K.; Quint, B.; Zychor, I.

    1986-01-01

    Results of experiments to synthesize the heaviest elements are reported. Surprising is the high stability against fission not only of the odd and odd-odd nuclei but also of the even isotopes of even elements. Alpha-decay data indicate an increasing stabilization of nuclei by shell effects up to ²⁶⁶109, the heaviest known element. Theoretically, the high stability is explained by an island of nuclei with large quadrupole and hexadecapole deformations around Z=109 and N=162. Future experiments will be planned to prove the island character of these heavy nuclei. (orig.)

  9. Surprises from the resolution of operator mixing in N=4 SYM

    International Nuclear Information System (INIS)

    Bianchi, Massimo; Rossi, Giancarlo; Stanev, Yassen S.

    2004-01-01

    We reexamine the problem of operator mixing in N=4 SYM. Particular attention is paid to the correct definition of composite gauge-invariant local operators, which is necessary for the computation of their anomalous dimensions beyond lowest order. As an application we reconsider the case of operators with naive dimension Δ₀=4, already studied in the literature. Stringent constraints from the resummation of logarithms in power behaviours are exploited, and the role of the generalized N=4 Konishi anomaly in the mixing with operators involving fermions is discussed. A general method for the explicit (numerical) resolution of the operator mixing and the computation of anomalous dimensions is proposed. We then resolve the order-g² mixing for the 15 (purely scalar) singlet operators of naive dimension Δ₀=6. Rather surprisingly, we find one isolated operator which has a vanishing anomalous dimension up to order g⁴, belonging to an apparently long multiplet. We also solve the order-g² mixing for the 26 operators belonging to the representation 20' of SU(4). We find an operator with the same one-loop anomalous dimension as the Konishi multiplet.

  10. Optimization of aerodynamic efficiency for twist morphing MAV wing

    Directory of Open Access Journals (Sweden)

    N.I. Ismail

    2014-06-01

    Twist morphing (TM) is a practical control technique in micro air vehicle (MAV) flight. However, a TM wing has lower aerodynamic efficiency (CL/CD) than membrane and rigid wings, because the massive drag penalty on the TM wing overwhelms the accompanying increase in lift generation. Further (CL/CD)max optimization of the TM wing is therefore needed to find the optimal morphing-wing configuration. In this paper, two-way fluid–structure interaction (FSI) simulation and wind tunnel testing are used to study the basic aerodynamic performance of the (non-optimal) TM, membrane and rigid wings. A multi-fidelity metamodel-based design optimization (MBDO) process is then adopted within the Ansys DesignXplorer framework. In the adaptive MBDO process, a Kriging metamodel is used to construct the final multi-fidelity CL/CD responses by utilizing 23 multi-fidelity sample points from the FSI simulations and experimental data. The optimization results show that the optimal TM wing configuration produces a (CL/CD)max at least 2% higher than the non-optimal TM wings. The flow structure formation reveals that low TV strength on the optimal TM wing induces low CD generation, which in turn improves its overall (CL/CD)max performance.
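
    The Kriging-surrogate workflow this record describes can be sketched compactly. The following is a minimal, self-contained illustration: an RBF-kernel interpolator stands in for the Kriging metamodel, and a toy one-dimensional function stands in for the CL/CD response. The objective, sample count and kernel length-scale are invented for illustration; this is not the paper's Ansys DesignXplorer setup.

    ```python
    import numpy as np

    # Minimal Kriging-style surrogate sketch: interpolate a cheap metamodel
    # through a few "expensive" samples, then optimize on the surrogate.
    # The objective below is a toy stand-in for the CL/CD response.

    def rbf_kernel(a, b, length=0.3):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length) ** 2)

    def expensive_objective(x):          # hypothetical response to be maximized
        return np.sin(3.0 * x) + 0.5 * x

    x_train = np.linspace(0.0, 2.0, 8)   # small design of experiments
    y_train = expensive_objective(x_train)

    K = rbf_kernel(x_train, x_train) + 1e-8 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)  # interpolation weights

    x_grid = np.linspace(0.0, 2.0, 201)
    y_pred = rbf_kernel(x_grid, x_train) @ alpha   # cheap surrogate predictions

    x_opt = x_grid[np.argmax(y_pred)]    # candidate optimum from the surrogate
    ```

    The surrogate is then evaluated thousands of times at negligible cost, which is the whole point of metamodel-based design optimization: the expensive solver is only called at the sample points.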

  11. Nongeneric tool support for model-driven product development; Werkzeugunterstuetzung fuer die modellbasierte Produktentwicklung. Maschinenlesbare Spezifikationen selbst erstellen

    Energy Technology Data Exchange (ETDEWEB)

    Bock, C. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Zuehlke, D. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Deutsches Forschungszentrum fuer Kuenstliche Intelligenz (DFKI), Kaiserslautern (DE). Zentrum fuer Mensch-Maschine-Interaktion (ZMMI)

    2006-07-15

    A well-defined specification process is a central success factor in human-machine interface development. Consequently, in interdisciplinary development teams, specification documents are an important communication instrument. In order to replace today's typically paper-based specifications and to leverage the benefits of their electronic equivalents, developers demand comprehensive and applicable computer-based tool kits. Manufacturers' increasing awareness of appropriate tool support is causing alternative approaches for tool kit creation to emerge. This article therefore introduces meta-modelling as a promising approach to creating nongeneric tool support with justifiable effort. This enables manufacturers to take advantage of electronic specifications in product development processes.

  12. Pseudohalide (SCN(-))-Doped MAPbI3 Perovskites: A Few Surprises.

    Science.gov (United States)

    Halder, Ansuman; Chulliyil, Ramya; Subbiah, Anand S; Khan, Tuhin; Chattoraj, Shyamtanu; Chowdhury, Arindam; Sarkar, Shaibal K

    2015-09-03

    Pseudohalide thiocyanate anion (SCN(-)) has been used as a dopant in a methylammonium lead tri-iodide (MAPbI3) framework, aiming for its use as an absorber layer for photovoltaic applications. The substitution of SCN(-) pseudohalide anion, as verified using Fourier transform infrared (FT-IR) spectroscopy, results in a comprehensive effect on the optical properties of the original material. Photoluminescence measurements at room temperature reveal a significant enhancement in the emission quantum yield of MAPbI3-x(SCN)x as compared to MAPbI3, suggestive of suppression of nonradiative channels. This increased intensity is attributed to a highly edge specific emission from MAPbI3-x(SCN)x microcrystals as revealed by photoluminescence microscopy. Fluorescence lifetime imaging measurements further established contrasting carrier recombination dynamics for grain boundaries and the bulk of the doped material. Spatially resolved emission spectroscopy on individual microcrystals of MAPbI3-x(SCN)x reveals that the optical bandgap and density of states at various (local) nanodomains are also nonuniform. Surprisingly, several (local) emissive regions within MAPbI3-x(SCN)x microcrystals are found to be optically unstable under photoirradiation, and display unambiguous temporal intermittency in emission (blinking), which is extremely unusual and intriguing. We find diverse blinking behaviors for the undoped MAPbI3 crystals as well, which leads us to speculate that blinking may be a common phenomenon for most hybrid perovskite materials.

  13. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those of the intended application, with each experiment testing only part of the physics relevant to the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.

  14. Optimization of an Intelligent Controller for an Unmanned Underwater Vehicle

    Directory of Open Access Journals (Sweden)

    M. Fauzi Nor Shah

    2011-08-01

    The underwater environment poses a difficult challenge for autonomous underwater navigation. A standard problem for underwater vehicles is to maintain their position at a certain depth in order to perform desired operations. An effective controller is required for this purpose, and hence the design of a depth controller for an unmanned underwater vehicle is described in this paper. The control algorithm is simulated using the marine guidance, navigation and control simulator. The project shows that a radial basis function metamodel can be used to tune the scaling factors of a fuzzy logic controller. Using an offline optimization approach, genetic algorithms and metamodeling are compared on minimizing the integral square error between the set point and the measured depth of the underwater vehicle. The results showed that the metamodeling approach can obtain a reasonably good error in a much shorter time than the genetic algorithm approach.
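
    The integral square error criterion minimized in this record is easy to state concretely. Below is a minimal sketch assuming a hypothetical first-order closed-loop depth response with a single tunable gain; the plant, gain range and set point are invented stand-ins, not the paper's UUV model or fuzzy controller.

    ```python
    import numpy as np

    # Integral square error (ISE) between a depth set point and a hypothetical
    # first-order closed-loop step response; a grid search over one gain stands
    # in for the paper's metamodel/GA tuning of the controller scaling factors.

    t = np.linspace(0.0, 30.0, 3001)     # 30 s simulation horizon
    set_point = 10.0                     # 10 m target depth

    def depth_response(gain, t):
        return set_point * (1.0 - np.exp(-gain * t))

    def ise(gain):
        err = set_point - depth_response(gain, t)
        return np.sum(err ** 2) * (t[1] - t[0])   # discrete approximation of the integral

    gains = np.linspace(0.1, 2.0, 20)
    best_gain = gains[np.argmin([ise(g) for g in gains])]
    ```

    For this monotone toy plant the largest gain gives the fastest response and hence the smallest ISE; in the paper the same criterion is minimized over the fuzzy controller's scaling factors, with the metamodel replacing repeated simulator runs.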

  15. Valuation of large variable annuity portfolios: Monte Carlo simulation and synthetic datasets

    Directory of Open Access Journals (Sweden)

    Gan Guojun

    2017-12-01

    Metamodeling techniques have recently been proposed to address the computational issues related to the valuation of large portfolios of variable annuity contracts. However, it is extremely difficult, if not impossible, for researchers to obtain real datasets from insurance companies in order to test their metamodeling techniques on such real datasets and publish the results in academic journals. To facilitate the development and dissemination of research related to the efficient valuation of large variable annuity portfolios, this paper creates a large synthetic portfolio of variable annuity contracts based on the properties of real portfolios of variable annuities and implements a simple Monte Carlo simulation engine for valuing the synthetic portfolio. In addition, this paper presents fair market values and Greeks for the synthetic portfolio of variable annuity contracts, which are important quantities for managing the financial risks associated with variable annuities. The resulting datasets can be used by researchers to test and compare the performance of various metamodeling techniques.
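
    A Monte Carlo valuation engine of the kind described can be sketched in a few lines for a single simplified guarantee. The payoff below is a GMMB-like put on the account value under geometric Brownian motion; all parameters are illustrative assumptions, not the paper's synthetic-portfolio inputs.

    ```python
    import numpy as np

    # Monte Carlo value of one simplified maturity guarantee: pay max(G - S_T, 0)
    # at maturity T, with the account value S following geometric Brownian
    # motion under the risk-neutral measure. All parameters are assumptions.

    rng = np.random.default_rng(42)
    s0, guarantee, r, sigma, t_mat = 100.0, 100.0, 0.02, 0.2, 10.0
    n_paths = 100_000

    z = rng.standard_normal(n_paths)
    s_t = s0 * np.exp((r - 0.5 * sigma**2) * t_mat + sigma * np.sqrt(t_mat) * z)
    payoff = np.maximum(guarantee - s_t, 0.0)
    fair_value = np.exp(-r * t_mat) * payoff.mean()   # discounted expected payoff
    ```

    The computational issue the paper targets is that a real portfolio repeats this simulation for every one of hundreds of thousands of contracts; a metamodel is instead trained on a small set of representative contracts and used to predict values like `fair_value` for the rest.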

  16. Presentation of an umbilical cord cyst with a surprising jet: a case report of a patent urachus [v1; ref status: indexed, http://f1000r.es/xx

    Directory of Open Access Journals (Sweden)

    John Svigos

    2013-02-01

    We report a baby with an unusual true umbilical cord cyst detected at 12 weeks' gestation which, as the pregnancy progressed, became increasingly difficult to distinguish from a pseudocyst of the umbilical cord. Concern about the possibility of cord compression or a cord accident led to an elective caesarean section being performed at 35+ weeks' gestation, with delivery of a healthy female infant weighing 2170 g. At birth the cyst ruptured and the resultant thickened, elongated cord was clamped accordingly. After the cord clamp fell off at 5 days post delivery, an elongated umbilical stump was left behind, from which a stream of urine surprisingly jetted out each time the baby cried. A patent urachus was confirmed on ultrasound, and the umbilical jet of urine resolved at 4 weeks post delivery after treatment of an Escherichia coli urinary tract infection. At 11 weeks post delivery a laparoscopic excision of the urachus was successfully performed. The baby, now 18 months of age, continues to thrive without incident.

  17. Ensuring a successful family business management succession

    OpenAIRE

    Desbois, Joris

    2016-01-01

    Succession is the biggest long-term challenge that most family businesses face. Indeed, leaders' disposition to plan for their succession is frequently the key factor defining whether their family business survives or ceases. The research seeks to find out how to successfully manage the business management succession according to its main principles. This work project aims at researching the key points relevant to almost all family firms, to have a viable succession transition and positioni...

  18. The Future of Basic Science in Academic Surgery: Identifying Barriers to Success for Surgeon-scientists.

    Science.gov (United States)

    Keswani, Sundeep G; Moles, Chad M; Morowitz, Michael; Zeh, Herbert; Kuo, John S; Levine, Matthew H; Cheng, Lily S; Hackam, David J; Ahuja, Nita; Goldstein, Allan M

    2017-06-01

    The aim of this study was to examine the challenges confronting surgeons performing basic science research in today's academic surgery environment. Multiple studies have identified challenges confronting surgeon-scientists and impacting their ability to be successful. Although these threats have been known for decades, the downward trend in the number of successful surgeon-scientists continues. Clinical demands, funding challenges, and other factors play important roles, but a rigorous analysis of academic surgeons and their experiences regarding these issues has not previously been performed. An online survey was distributed to 2504 members of the Association for Academic Surgery and Society of University Surgeons to determine factors impacting success. Survey results were subjected to statistical analyses. We also reviewed publicly available data regarding funding from the National Institutes of Health (NIH). NIH data revealed a 27% decline in the proportion of NIH funding to surgical departments relative to total NIH funding from 2007 to 2014. A total of 1033 (41%) members responded to our survey, making this the largest survey of academic surgeons to date. Surgeons most often cited the following factors as major impediments to pursuing basic investigation: pressure to be clinically productive, excessive administrative responsibilities, difficulty obtaining extramural funding, and desire for work-life balance. Surprisingly, a majority (68%), including departmental leadership, did not believe surgeons can be successful basic scientists in today's environment. We have identified important barriers that confront academic surgeons pursuing basic research and a perception that success in basic science may no longer be achievable. These barriers need to be addressed to ensure the continued development of future surgeon-scientists.

  19. Hillslope, River, and Mountain: Some Surprises in Landscape Evolution (Ralph Alger Bagnold Medal Lecture)

    Science.gov (United States)

    Tucker, G. E.

    2012-04-01

    Geomorphology, like the rest of geoscience, has always had two major themes: a quest to understand the earth's history and 'products' - its landscapes and seascapes - and, in parallel, a quest to understand its formative processes. This dualism is manifest in the remarkable career of R. A. Bagnold, who was inspired by landforms such as dunes, and dedicated to understanding the physical processes that shaped them. His legacy inspires us to emulate two principles at the heart of his contributions: the benefits of rooting geomorphic theory in basic physics, and the importance of understanding geomorphic systems in terms of simple equations framed around energy or force. Today, following Bagnold's footsteps, the earth-surface process community is engaged in a quest to build, test, and refine an ever-improving body of theory to describe our planet's surface and its evolution. In this lecture, I review a small sample of some of the fruits of that quest, emphasizing the value of surprises encountered along the way. The first example involves models of long-term river incision into bedrock. When the community began to grapple with how to represent this process mathematically, several different ideas emerged. Some were based on the assumption that sediment transport is the limiting factor; others assumed that hydraulic stress on rock is the key, while still others treated rivers as first-order 'reactors.' Thanks in part to advances in digital topography and numerical computing, the predictions of these models can be tested using natural-experiment case studies. Examples from the King Range, USA, the Central Apennines, Italy, and the fold-thrust belt of Taiwan, illustrate that independent knowledge of history and/or tectonics makes it possible to quantify how the rivers have responded to external forcing. 
Some interesting surprises emerge, such as: that the relief-uplift relationship can be highly nonlinear in a steady-state landscape because of grain-entrainment thresholds

  20. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    Science.gov (United States)

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
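
    The crisp-set comparison logic behind Qualitative Comparative Analysis can be illustrated with binary-coded cases. The systems, conditions and codings below are invented placeholders, not the 19 European systems from the study; the sketch only shows the mechanical step of screening for a necessary condition (one present in every successful case).

    ```python
    # Minimal crisp-set QCA-style screen: each case codes system
    # characteristics and the outcome as 0/1. A condition is "necessary"
    # for the outcome if it is present in every successful case.
    # All cases and codings here are hypothetical.

    cases = [
        {"nonclinical_source": 1, "automated": 1, "multi_syndrome": 1, "timely": 1},
        {"nonclinical_source": 1, "automated": 0, "multi_syndrome": 1, "timely": 1},
        {"nonclinical_source": 0, "automated": 1, "multi_syndrome": 0, "timely": 0},
        {"nonclinical_source": 1, "automated": 1, "multi_syndrome": 0, "timely": 1},
    ]

    conditions = ["nonclinical_source", "automated", "multi_syndrome"]
    successful = [c for c in cases if c["timely"] == 1]
    necessary = [k for k in conditions if all(c[k] == 1 for c in successful)]
    print(necessary)  # → ['nonclinical_source']
    ```

    Full QCA goes further (truth-table construction, Boolean minimization of sufficient combinations), but the necessity screen above is the first step of the analysis.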

  1. Seismic reliability assessment of RC structures including soil–structure interaction using wavelet weighted least squares support vector machine

    International Nuclear Information System (INIS)

    Khatibinia, Mohsen; Javad Fadaee, Mohammad; Salajegheh, Javad; Salajegheh, Eysa

    2013-01-01

    An efficient metamodeling framework in conjunction with Monte Carlo simulation (MCS) is introduced to reduce the computational cost of seismic reliability assessment of existing RC structures. To achieve this purpose, the metamodel is designed by combining weighted least squares support vector machine (WLS-SVM) and a wavelet kernel function, called wavelet weighted least squares support vector machine (WWLS-SVM). In this study, the seismic reliability assessment of existing RC structures with consideration of soil–structure interaction (SSI) effects is investigated in accordance with Performance-Based Design (PBD). This study aims to incorporate the acceptable performance levels of PBD into reliability theory in order to compare the obtained annual probability of non-performance with the target values for each performance level. The MCS method, as the most reliable method, is utilized to estimate the annual probability of failure associated with a given performance level. In WWLS-SVM-based MCS, the structural seismic responses are accurately predicted by WWLS-SVM to reduce the computational cost. To show the efficiency and robustness of the proposed metamodel, two RC structures are studied. Numerical results demonstrate the efficiency and computational advantages of the proposed metamodel for the seismic reliability assessment of structures. Furthermore, the seismic reliability assessment of existing RC structures including SSI effects is compared with that of the fixed-base model, showing that SSI has a significant influence on the seismic reliability assessment of structures.
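
    The surrogate-assisted Monte Carlo pattern described here can be sketched with a toy limit state. An ordinary least-squares polynomial stands in for the WWLS-SVM metamodel, and a single standard-normal variable stands in for the structural random inputs; both are illustrative assumptions, not the paper's RC structural models.

    ```python
    import numpy as np

    # Surrogate-assisted Monte Carlo reliability sketch: fit a cheap metamodel
    # to a few "expensive" limit-state evaluations, then run the full MCS on
    # the surrogate. Failure is defined as g(x) < 0. The limit state below is
    # a textbook-style stand-in, not a structural response.

    rng = np.random.default_rng(1)

    def limit_state(x):                 # "expensive" response; failure when g < 0
        return 3.0 - x - 0.2 * x**2

    x_doe = np.linspace(-3.0, 3.0, 15)              # small design of experiments
    coeffs = np.polyfit(x_doe, limit_state(x_doe), 2)   # cheap polynomial metamodel

    x_mc = rng.standard_normal(1_000_000)           # full Monte Carlo on surrogate
    pf = np.mean(np.polyval(coeffs, x_mc) < 0.0)    # estimated failure probability
    ```

    The million surrogate evaluations cost almost nothing; only the 15 design points would require the expensive nonlinear dynamic analysis in the setting the paper describes.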

  2. Knowledge-based inspection:modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    The increased level of complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is ever more a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. 
Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  3. Stars Form Surprisingly Close to Milky Way's Black Hole

    Science.gov (United States)

    2005-10-01

    The supermassive black hole at the center of the Milky Way has surprisingly helped spawn a new generation of stars, according to observations from NASA's Chandra X-ray Observatory. This novel mode of star formation may solve several mysteries about the supermassive black holes that reside at the centers of nearly all galaxies. "Massive black holes are usually known for violence and destruction," said Sergei Nayakshin of the University of Leicester, United Kingdom, and coauthor of a paper on this research in an upcoming issue of the Monthly Notices of the Royal Astronomical Society. "So it's remarkable that this black hole helped create new stars, not just destroy them." Black holes have earned their fearsome reputation because any material -- including stars -- that falls within the so-called event horizon is never seen again. However, these new results indicate that the immense disks of gas known to orbit many black holes at a "safe" distance from the event horizon can help nurture the formation of new stars. This conclusion came from new clues that could only be revealed in X-rays. Until the latest Chandra results, astronomers had disagreed about the origin of a mysterious group of massive stars discovered by infrared astronomers to be orbiting less than a light year from the Milky Way's central black hole, a.k.a. Sagittarius A*, or Sgr A*. At such close distances to Sgr A*, the standard model for star formation predicts that gas clouds from which stars form should have been ripped apart by tidal forces from the black hole. Two models to explain this puzzle have been proposed. In the disk model, the gravity of a dense disk of gas around Sgr A* offsets the tidal forces and allows stars to form; in the migration model, the stars formed in a star cluster far away from the black hole and migrated in to form the ring of massive stars. The migration scenario predicts about a

  4. Probing Critical Point Energies of Transition Metal Dichalcogenides: Surprising Indirect Gap of Single Layer WSe 2

    KAUST Repository

    Zhang, Chendong; Chen, Yuxuan; Johnson, Amber; Li, Ming-yang; Li, Lain-Jong; Mende, Patrick C.; Feenstra, Randall M.; Shih, Chih Kang

    2015-01-01

    By using a comprehensive form of scanning tunneling spectroscopy, we have revealed detailed quasi-particle electronic structures in transition metal dichalcogenides, including the quasi-particle gaps, critical point energy locations, and their origins in the Brillouin zones. We show that single-layer WSe2 surprisingly has an indirect quasi-particle gap with the conduction band minimum located at the Q-point (instead of K), albeit the two states are nearly degenerate. We have further observed rich quasi-particle electronic structures of transition metal dichalcogenides as a function of atomic structures and spin-orbit couplings. Such a local probe for detailed electronic structures in conduction and valence bands will be ideal to investigate how electronic structures of transition metal dichalcogenides are influenced by variations of local environment.

  6. Virtual Volatility, an Elementary New Concept with Surprising Stock Market Consequences

    Science.gov (United States)

    Prange, Richard; Silva, A. Christian

    2006-03-01

    Textbook investors start by predicting the future price distribution (PDF) of a candidate stock (or portfolio) at horizon T, e.g. a year hence. A (log)normal PDF with center (= drift = expected return) μT and width (= volatility) σT is often assumed on Central Limit Theorem grounds, i.e. by a random walk of daily (log)price increments δs. The standard deviation of historical (ex post) δs's is usually a fair predictor of the coming year's (ex ante) stdev(δs) = σ_daily, but the historical mean E(δs) at best roughly bounds the true, to-be-predicted drift by μ_true T ≈ μ_hist T ± σ_hist T. Textbooks take a PDF with σ ≈ σ_daily and μ as somehow known, as if accurate predictions of μ were possible. It is elementary and presumably new to argue that an average of PDFs over a range of μ values should be taken, e.g. an average over forecasts by different analysts. We estimate that this leads to a PDF with a 'virtual' volatility σ ≈ 1.3 σ_daily. It is indeed clear that uncertainty in the value of the expected-gain parameter increases the risk of investment in that security by most measures; e.g. Sharpe's ratio μT/σT will be 30% smaller because of this effect. It is significant and surprising that there are investments which benefit from this 30% virtual increase in the volatility.
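
    The 'virtual volatility' claim is a mixture-of-normals statement and is easy to check numerically: if the drift forecast itself has spread s_mu, the mixture's standard deviation is sqrt(sigma^2 + s_mu^2). The spread value below (0.83 sigma) is chosen so that the inflation factor comes out near the quoted 1.3; it is an assumption for illustration, not a figure from the paper.

    ```python
    import numpy as np

    # Numeric check of the "virtual volatility" effect: sampling returns with
    # an uncertain drift mu yields a mixture of normals whose standard
    # deviation is sqrt(sigma**2 + s_mu**2). The drift-spread value is an
    # illustrative assumption chosen to reproduce a factor of about 1.3.

    sigma = 0.20                       # assumed "true" annual volatility
    s_mu = 0.83 * sigma                # assumed spread of analyst drift forecasts

    rng = np.random.default_rng(7)
    mu_draws = rng.normal(0.05, s_mu, 1_000_000)   # uncertain drift per scenario
    returns = rng.normal(mu_draws, sigma)          # mixture of normals

    virtual = returns.std() / sigma    # effective ("virtual") volatility ratio
    ```

    Analytically, sqrt(1 + 0.83**2) ≈ 1.30, so the simulated ratio should land very close to the 1.3 factor cited in the abstract.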

  7. A conceptual geochemical model of the geothermal system at Surprise Valley, CA

    Science.gov (United States)

    Fowler, Andrew P. G.; Ferguson, Colin; Cantwell, Carolyn A.; Zierenberg, Robert A.; McClain, James; Spycher, Nicolas; Dobson, Patrick

    2018-03-01

    Characterizing the geothermal system at Surprise Valley (SV), northeastern California, is important for determining the sustainability of the energy resource, and mitigating hazards associated with hydrothermal eruptions that last occurred in 1951. Previous geochemical studies of the area attempted to reconcile different hot spring compositions on the western and eastern sides of the valley using scenarios of dilution, equilibration at low temperatures, surface evaporation, and differences in rock type along flow paths. These models were primarily supported using classical geothermometry methods, and generally assumed that fluids in the Lake City mud volcano area on the western side of the valley best reflect the composition of a deep geothermal fluid. In this contribution, we address controls on hot spring compositions using a different suite of geochemical tools, including optimized multicomponent geochemistry (GeoT) models, hot spring fluid major and trace element measurements, mineralogical observations, and stable isotope measurements of hot spring fluids and precipitated carbonates. We synthesize the results into a conceptual geochemical model of the Surprise Valley geothermal system, and show that high-temperature (quartz, Na/K, Na/K/Ca) classical geothermometers fail to predict maximum subsurface temperatures because fluids re-equilibrated at progressively lower temperatures during outflow, including in the Lake City area. We propose a model where hot spring fluids originate as a mixture between a deep thermal brine and modern meteoric fluids, with a seasonally variable mixing ratio. The deep brine has deuterium values at least 3 to 4‰ lighter than any known groundwater or high-elevation snow previously measured in and adjacent to SV, suggesting it was recharged during the Pleistocene when meteoric fluids had lower deuterium values. The deuterium values and compositional characteristics of the deep brine have only been identified in thermal springs and

  8. Evaluating Method Engineer Performance: an error classification and preliminary empirical study

    Directory of Open Access Journals (Sweden)

    Steven Kelly

    1998-11-01

    We describe an approach to empirically test the use of metaCASE environments to model methods. Both diagrams and matrices have been proposed as a means of presenting the methods. These different paradigms may have their own effects on how easily and well users can model methods. We extend Batra's classification of errors in data modelling to cover metamodelling, and use it to measure the performance of a group of metamodellers using either diagrams or matrices. The tentative results from this pilot study confirm the usefulness of the classification, and show some interesting differences between the paradigms.

  9. Cerebral metastasis masquerading as cerebritis: A case of misguiding history and radiological surprise!

    Directory of Open Access Journals (Sweden)

    Ashish Kumar

    2013-01-01

    Cerebral metastases usually have a characteristic radiological appearance and can be differentiated rather easily from an infective etiology. Similarly, a positive medical history guides the neurosurgeon towards the probable diagnosis and adds to the diagnostic armamentarium. Occasionally, however, similarities on imaging may be encountered where even the history can lead in the wrong direction and bias the clinician. We report a case of a 40-year-old female with a history of mastoidectomy for otitis media presenting to us with a space-occupying lesion in the right parietal region, which was thought pre-operatively to be an abscess with cerebritis. Surprisingly, histopathology proved it to be a metastatic adenocarcinoma. Hence, while a ring-enhancing lesion may be a high-grade neoplasm, metastasis or abscess, significant gyral enhancement, a feature of cerebritis, is less often linked with a neoplastic etiology. Such overlap may lead to delayed diagnosis, incorrect prognostication and treatment in patients with a coincidental suggestive history of infection. We review the literature and highlight the key points that help differentiate an infective from a neoplastic pathology, which may look similar at times.

  10. Investor Reaction to Market Surprises on the Istanbul Stock Exchange = İstanbul Menkul Kıymetler Borsasında Piyasa Sürprizlerine Yatırımcı Tepkisi

    Directory of Open Access Journals (Sweden)

    Yaman Ömer ERZURUMLU

    2011-08-01

    This paper examines the reaction of investors to the arrival of unexpected information on the Istanbul Stock Exchange. The empirical results suggest that the investor reaction following unexpected news on the ISE100 is consistent with the Overreaction Hypothesis, especially after unfavorable market surprises. Interestingly, such a pattern does not exist for the ISE30 index, which includes more liquid and informationally efficient securities. A possible implication of this study for investors is that employing a semi-contrarian investment strategy of buying losers in the ISE100 may generate superior returns. Moreover, the results support the latest regulation change by the Capital Markets Board of Turkey, which mandates more disclosure regarding the trading of less liquid stocks with lower market capitalization.

  11. College Success Courses: Success for All

    Science.gov (United States)

    Coleman, Sandra Lee; Skidmore, Susan Troncoso; Weller, Carol Thornton

    2018-01-01

    College success courses (CSCs), or orientation courses, are offered by community colleges and universities to facilitate the success of first-time-in-college students. Primarily, these courses are designed to address students' nonacademic deficiencies, such as weak study habits and poor organizational skills, and to familiarize students with…

  12. Success of sky-polarimetric Viking navigation: revealing the chance Viking sailors could reach Greenland from Norway.

    Science.gov (United States)

    Száz, Dénes; Horváth, Gábor

    2018-04-01

    According to a famous hypothesis, Viking sailors could navigate along the latitude between Norway and Greenland by means of sky polarization in cloudy weather using a sun compass and sunstone crystals. Using data measured in earlier atmospheric optical and psychophysical experiments, here we determine the success rate of this sky-polarimetric Viking navigation. Simulating 1000 voyages between Norway and Greenland with varying cloudiness at summer solstice and spring equinox, we revealed the chance with which Viking sailors could reach Greenland under the varying weather conditions of a 3-week-long journey as a function of the navigation periodicity Δ t if they analysed sky polarization with calcite, cordierite or tourmaline sunstones. Examples of voyage routes are also presented. Our results show that the sky-polarimetric navigation is surprisingly successful on both days of the spring equinox and summer solstice even under cloudy conditions if the navigator determined the north direction periodically at least once in every 3 h, independently of the type of sunstone used for the analysis of sky polarization. This explains why the Vikings could rule the Atlantic Ocean for 300 years and could reach North America without a magnetic compass. Our findings suggest that it is not only the navigation periodicity in itself that is important for higher navigation success rates, but also the distribution of times when the navigation procedure carried out is as symmetrical as possible with respect to the time point of real noon.

  14. Anatomy of a Rescue: What Makes Hostage Rescue Operations Successful?

    National Research Council Canada - National Science Library

    Perez, Carlos

    2004-01-01

    ...: surprise, intelligence, operator's skill, and deception. These principles are derived from planning models used in special operations, personal experience, and an analysis of six historical case studies...

  15. Surprisingly high substrate specificities observed in complex biofilms

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Kindaichi, Tomonori; Kragelund, Caroline

    The behavior of microorganisms in natural ecosystems (e.g. biofilms) differs significantly from laboratory studies. In nature microorganisms experience alternating periods of surplus nutrients, nutrient-limitation, and starvation. Literature data suggest that to survive and compete successfully......, microorganisms can regulate their metabolism expressing a wide range of uptake and catabolic systems. However, ecophysiological studies of natural biofilms indicate that bacteria are very specialized in their choice of substrate, so even minor changes in substrate composition can affect the community composition...... by selection for different specialized species. We hypothesized that bacteria growing in a natural environment express strongly conserved substrate specificity which is independent of short-term (few hours) variations in growth conditions. In this study, biofilm from Aalborg wastewater treatment plant was used

  16. The Ultraviolet Surprise. Efficient Soft X-Ray High Harmonic Generation in Multiply-Ionized Plasmas

    International Nuclear Information System (INIS)

    Popmintchev, Dimitar; Hernandez-Garcia, Carlos; Dollar, Franklin; Mancuso, Christopher; Perez-Hernandez, Jose A.; Chen, Ming-Chang; Hankla, Amelia; Gao, Xiaohui; Shim, Bonggu; Gaeta, Alexander L.; Tarazkar, Maryam; Romanov, Dmitri A.; Levis, Robert J.; Gaffney, Jim A.; Foord, Mark; Libby, Stephen B.; Jaron-Becker, Agnieszka; Becker, Andreas; Plaja, Luis; Murnane, Margaret M.; Kapteyn, Henry C.; Popmintchev, Tenio

    2015-01-01

    High-harmonic generation is a universal response of matter to strong femtosecond laser fields, coherently upconverting light to much shorter wavelengths. Optimizing the conversion of laser light into soft x-rays typically demands a trade-off between two competing factors. Reduced quantum diffusion of the radiating electron wave function results in emission from each species which is highest when a short-wavelength ultraviolet driving laser is used. But, phase matching - the constructive addition of x-ray waves from a large number of atoms - favors longer-wavelength mid-infrared lasers. We identified a regime of high-harmonic generation driven by 40-cycle ultraviolet lasers in waveguides that can generate bright beams in the soft x-ray region of the spectrum, up to photon energies of 280 electron volts. Surprisingly, the high ultraviolet refractive indices of both neutral atoms and ions enabled effective phase matching, even in a multiply ionized plasma. We observed harmonics with very narrow linewidths, while calculations show that the x-rays emerge as nearly time-bandwidth-limited pulse trains of ~100 attoseconds

  17. Energy, mining, and the commercial success of the Newcomen "steam" engine

    Science.gov (United States)

    Murphy, John Paul

    This dissertation is about energy; specifically how prime movers changed at the beginning of the Industrial Revolution. These power needs are explored via the history of the Newcomen atmospheric engine, as it was used in the 18th century to drive pumps in flooded mines. This approach examines society as an energy-converting phenomenon, and uses the concept of an energy rent. The dissertation seeks to reach past the 19th century's "high-pressure historiography" of the first engines powered by fire; instead, it traces the actual low-pressure atmospheric technology of the first commercially successful engines, and the surprising, rather than inevitable, transformation they engendered. The costs of fuel are shown to be an essential factor in the success or failure of the first Newcomen engines. Thomas Newcomen's failed first attempts in Cornwall (1710) are contrasted with success in collieries, located in the relatively distant region of the Midlands, only two years later. To test the suggestion that coal is needed for a Newcomen engine to be profitable, two detailed case histories compare 18th century engines, both fired using wood fuel, at iron ore mines. The first was a failed engine at Dannemora, Sweden (1728); the second a successful machine built by the Brown brothers at Cranston, Rhode Island (1783). The Brown engine's case history was based on extensive original archive research, and also provides a detailed history of the Hope Furnace, which used the ore from Cranston. Success for the Browns in Rhode Island is found to have been rooted in their careful planning for fuel needs. The two mines were also found to have significantly different construction of gender roles, suggesting the Rhode Island context had established more thoroughly capitalist relations. 
The work shows that the demand for more extensive power, which led to these engines, was propelled by the ability of the evolving commercial market place to convert energy profitably (16th and 17th centuries

  18. Communication Management and Trust: Their Role in Building Resilience to "Surprises" Such As Natural Disasters, Pandemic Flu, and Terrorism

    Directory of Open Access Journals (Sweden)

    P. H. Longstaff

    2008-06-01

    Full Text Available In times of public danger such as natural disasters and health emergencies, a country's communication systems will be some of its most important assets because access to information will make individuals and groups more resilient. Communication by those charged with dealing with the situation is often critical. We analyzed reports from a wide variety of crisis incidents and found a direct correlation between trust and an organization's preparedness and internal coordination of crisis communication and the effectiveness of its leadership. Thus, trust is one of the most important variables in effective communication management in times of "surprise."

  19. Tracing the origins of success: implications for successful aging.

    Science.gov (United States)

    Peterson, Nora M; Martin, Peter

    2015-02-01

    This paper addresses the debate about the use of the term "successful aging" from a humanistic, rather than behavioral, perspective. It attempts to uncover what success, a term frequently associated with aging, is: how can it be defined and when did it first come into use? In this paper, we draw from a number of humanistic perspectives, including the historical and linguistic, in order to explore the evolution of the term "success." We believe that words and concepts have deep implications for how concepts (such as aging) are culturally and historically perceived. We take a comparative approach, turning to the etymological roots of this term in British, French, and German literature. According to the earliest entries of the term in the Oxford English Dictionary, events can have good or bad success. Another definition marks success as outcome oriented. Often used in the context of war, religion, and medicine, the neutral, but often negative, use of "success" in literature of the Renaissance demonstrates the tensions that surround the word, and suggests that success is something to be approached carefully. Ignoring the ambiguous origins of success erases the fact that aging in earlier centuries echoes much of the same ambivalence with which many people discuss it today. Attending to the origins of success can help gerontologists understand the humanistic tradition behind their inquiry into what successful aging means today. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Educational Attainment: Success to the Successful

    Science.gov (United States)

    Anthony, Peter; Gould, David; Smith, Gina

    2013-01-01

    Systems archetypes are patterns of structure found in systems that are helpful in understanding some of the dynamics within them. The intent of this study was to examine educational attainment data using the success-to-the-successful archetype as a model to see if it helps to explain the inequality observed in the data. Data covering 1990 to 2009…

  1. Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Weirs, V. Gregory; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

    2012-01-01

    Sensitivity analysis is comprised of techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol' method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and under an increasing number of meta-model evaluations. - Highlights: ► Sensitivity analysis techniques for a model shock physics problem are compared. ► The model problem and the sensitivity analysis problem have exact solutions. ► Subtle details of the method for computing sensitivity indices can affect the results.
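The variance-based indices discussed in this record can be estimated with a pick-freeze (Saltelli-style) Monte Carlo scheme. A minimal sketch, assuming a toy linear model y = 4·x1 + 2·x2 with independent uniform inputs in place of the Riemann problem, so the exact first-order indices are 0.8 and 0.2:

```python
import random

def sobol_first_order(f, dim, n=20000, seed=1):
    """Estimate first-order Sobol indices with the pick-freeze scheme:
    S_i = E[f(B) * (f(A with column i from B) - f(A))] / Var(f)."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # A with column i swapped in from B ("pick-freeze" matrix)
        yABi = [f(A[j][:i] + [B[j][i]] + A[j][i + 1:]) for j in range(n)]
        cov = sum(yB[j] * (yABi[j] - yA[j]) for j in range(n)) / n
        indices.append(cov / var)
    return indices

# Linear test model: exact indices are 16/20 = 0.8 and 4/20 = 0.2.
S = sobol_first_order(lambda x: 4 * x[0] + 2 * x[1], dim=2)
```

In practice each `f` evaluation would be a (cheap) metamodel call rather than the full simulation, which is exactly the motivation for the meta-modeling approaches the abstract compares.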

  2. Guideline validation in multiple trauma care through business process modeling.

    Science.gov (United States)

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected across seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools, which check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or specific providers is necessary. Then, clinical guidelines could additionally be used for eLearning, process optimization and workflow management.

  3. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Energy Technology Data Exchange (ETDEWEB)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten [RWTH Aachen University, Chair for Nonlinear Dynamics, Steinbachstr. 15, 52047 Aachen (Germany); Gebhardt, Sascha [RWTH Aachen University, Virtual Reality Group, IT Center, Seffenter Weg 23, 52074 Aachen (Germany); Kuhlen, Torsten [Forschungszentrum Jülich GmbH, Institute for Advanced Simulation (IAS), Jülich Supercomputing Centre (JSC), Wilhelm-Johnen-Straße, 52425 Jülich (Germany); Schulz, Wolfgang [Fraunhofer, ILT Laser Technology, Steinbachstr. 15, 52047 Aachen (Germany)

    2016-06-08

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most influential parameters and quantifying their contribution to the model output, reducing the model complexity, and enhancing the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time-consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model with an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
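The Elementary Effect screening mentioned above can be sketched in a few lines. A hedged example with an invented two-parameter toy model standing in for the laser-drilling metamodel: the mean absolute elementary effect μ*ᵢ ranks parameters by influence.

```python
import random

def morris_mu_star(f, dim, n_points=50, delta=0.01, seed=7):
    """Mean absolute elementary effect mu*_i = E[|(f(x + delta*e_i) - f(x)) / delta|],
    estimated from random base points in the unit hypercube."""
    rng = random.Random(seed)
    totals = [0.0] * dim
    for _ in range(n_points):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(dim)]
        fx = f(x)
        for i in range(dim):
            xp = list(x)
            xp[i] += delta  # perturb one coordinate at a time
            totals[i] += abs((f(xp) - fx) / delta)
    return [t / n_points for t in totals]

# For a linear model the elementary effects equal the coefficients exactly,
# so the screening cleanly flags x[0] as dominant.
mu = morris_mu_star(lambda x: 10 * x[0] + 0.1 * x[1], dim=2)
```

Because each screening point costs only dim + 1 metamodel evaluations, this is typically run before the far more expensive Sobol variance decomposition.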

  4. Semantics-Driven Migration of Java Programs: a Practical Experience

    Directory of Open Access Journals (Sweden)

    Artyom O. Aleksyuk

    2017-01-01

    Full Text Available The purpose of the study is to demonstrate the feasibility of automated code migration to a new set of programming libraries. Code migration is a common task in modern software projects. For example, it may arise when a project should be ported to a more secure or feature-rich library, a new platform or a new version of an already used library. The developed method and tool are based on a formalism, previously created by the authors, for describing library semantics. The formalism specifies library behaviour using a system of extended finite state machines (EFSM). This paper outlines the metamodel designed to specify library descriptions and proposes an easy-to-use domain-specific language (DSL), which can be used to define models for particular libraries. The mentioned metamodel directly forms the code migration procedure. The process of migration is split into five steps, and each step is also described in the paper. The procedure uses an algorithm based on breadth-first search, extended for the needs of the migration task. The models and algorithms were implemented in a prototype of an automated code migration tool. The prototype was tested on both artificial code examples and a real-world open-source project. The article describes the experiments performed, the difficulties that arose in the process of migrating the test samples, and how they are solved in the proposed procedure. The results of the experiments indicate that code migration can be successfully automated.
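The breadth-first search driving such a migration procedure can be illustrated on a tiny model: states of the target library's state machine are nodes, API calls are edges, and the goal is the shortest call sequence reaching the state the migrated code requires. The state and call names below are invented for illustration, not taken from the paper's formalism.

```python
from collections import deque

def shortest_call_sequence(transitions, start, goal):
    """BFS over an EFSM-like transition relation
    {state: [(api_call, next_state), ...]}; returns the shortest
    list of API calls from start to goal, or None if unreachable."""
    queue = deque([(start, [])])
    visited = {start}
    while queue:
        state, calls = queue.popleft()
        if state == goal:
            return calls
        for call, nxt in transitions.get(state, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, calls + [call]))
    return None

# Hypothetical file-like API: to get a handle into the "dirty" state,
# the shortest sequence is open() followed by write().
transitions = {
    "closed": [("open", "open")],
    "open": [("write", "dirty"), ("close", "closed")],
    "dirty": [("flush", "open"), ("close", "closed")],
}
seq = shortest_call_sequence(transitions, "closed", "dirty")
```

The paper's algorithm additionally tracks EFSM variables and guards; this plain-graph BFS only conveys why shortest-path search yields minimal replacement call sequences.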

  5. Evaluation of the BPMN According to the Requirements of the Enterprise Architecture Methodology

    Directory of Open Access Journals (Sweden)

    Václav Řepa

    2012-04-01

    Full Text Available This article evaluates some characteristics of the Business Process Modelling Notation from the perspective of the business system modelling methodology. Firstly the enterprise architecture context of the business process management as well as the importance of standards are discussed. Then the Business System Modelling Methodology is introduced with special attention paid to the Business Process Meta-model as a basis for the evaluation of the BPMN features. Particular basic concepts from the Business Process Meta-model are mapped to the usable constructs of the BPMN and related issues are analysed. Finally the basic conclusions are made and the general context is discussed.

  6. Information Systems Success: An empirical study on the appropriate success criteria and the real value of critical success factors

    OpenAIRE

    Skovly, Jørgen

    2013-01-01

    Success is a complex concept, that people have been trying to understand for some time. Extensive research has been conducted in order to improve our understanding, and thus increase our chances for achieving success. However, as projects still continue to fail, the real value of this research seems unclear. This thesis emphasizes the distinction between variables that may cause success (success factors), and variables that are part of success (success criteria). Success is not a 'black and w...

  7. ‘Surprise’: Outbreak of Campylobacter infection associated with chicken liver pâté at a surprise birthday party, Adelaide, Australia, 2012

    OpenAIRE

    Emma Denehy; Amy Parry; Emily Fearnley

    2012-01-01

    Objective: In July 2012, an outbreak of Campylobacter infection was investigated by the South Australian Communicable Disease Control Branch and Food Policy and Programs Branch. The initial notification identified illness at a surprise birthday party held at a restaurant on 14 July 2012. The objective of the investigation was to identify the potential source of infection and institute appropriate intervention strategies to prevent further illness. Methods: A guest list was obtained and a retro...

  8. Driving Technological Surprise: DARPA’s Mission in a Changing World

    Science.gov (United States)

    2013-04-01

    fundamental ways. Our research, innovation, and entrepreneurial capacity is the envy of the world, but others are building universities, labs, and...through deep engagement with companies, universities, and DoD and other labs. Our success hinges on having a healthy U.S. R&D ecosystem. Within

  9. Surprise: Dwarf Galaxy Harbors Supermassive Black Hole

    Science.gov (United States)

    2011-01-01

    The surprising discovery of a supermassive black hole in a small nearby galaxy has given astronomers a tantalizing look at how black holes and galaxies may have grown in the early history of the Universe. Finding a black hole a million times more massive than the Sun in a star-forming dwarf galaxy is a strong indication that supermassive black holes formed before the buildup of galaxies, the astronomers said. The galaxy, called Henize 2-10, 30 million light-years from Earth, has been studied for years, and is forming stars very rapidly. Irregularly shaped and about 3,000 light-years across (compared to 100,000 for our own Milky Way), it resembles what scientists think were some of the first galaxies to form in the early Universe. "This galaxy gives us important clues about a very early phase of galaxy evolution that has not been observed before," said Amy Reines, a Ph.D. candidate at the University of Virginia. Supermassive black holes lie at the cores of all "full-sized" galaxies. In the nearby Universe, there is a direct relationship -- a constant ratio -- between the masses of the black holes and that of the central "bulges" of the galaxies, leading them to conclude that the black holes and bulges affected each others' growth. Two years ago, an international team of astronomers found that black holes in young galaxies in the early Universe were more massive than this ratio would indicate. This, they said, was strong evidence that black holes developed before their surrounding galaxies. "Now, we have found a dwarf galaxy with no bulge at all, yet it has a supermassive black hole. This greatly strengthens the case for the black holes developing first, before the galaxy's bulge is formed," Reines said. 
Reines, along with Gregory Sivakoff and Kelsey Johnson of the University of Virginia and the National Radio Astronomy Observatory (NRAO), and Crystal Brogan of the NRAO, observed Henize 2-10 with the National Science Foundation's Very Large Array radio telescope and

  10. Postmating-prezygotic isolation between two allopatric populations of Drosophila montana: fertilisation success differs under sperm competition.

    Science.gov (United States)

    Ala-Honkola, Outi; Ritchie, Michael G; Veltsos, Paris

    2016-03-01

    Postmating but prezygotic (PMPZ) interactions are increasingly recognized as a potentially important early-stage barrier in the evolution of reproductive isolation. A recent study described a potential example between populations of the same species: single matings between Drosophila montana populations resulted in differential fertilisation success because of the inability of sperm from one population (Vancouver) to penetrate the eggs of the other population (Colorado). As the natural mating system of D. montana is polyandrous (females remate rapidly), we set up double matings of all possible crosses between the same populations to test whether competitive effects between ejaculates influence this PMPZ isolation. We measured premating isolation in no-choice tests, female fecundity, fertility and egg-to-adult viability after single and double matings as well as second-male paternity success (P2). Surprisingly, we found no PMPZ reproductive isolation between the two populations under a competitive setting, indicating no difficulty of sperm from Vancouver males to fertilize Colorado eggs after double matings. While there were subtle differences in how P2 changed over time, suggesting that Vancouver males' sperm are somewhat less competitive in a first-male role within Colorado females, these effects did not translate into differences in overall P2. Fertilisation success can thus differ dramatically between competitive and noncompetitive conditions, perhaps because the males that mate second produce higher quality ejaculates in response to sperm competition. We suggest that unlike in more divergent species comparisons, where sperm competition typically increases reproductive isolation, ejaculate tailoring can reduce the potential for PMPZ isolation when recently diverged populations interbreed.

  11. Motivational and Adaptational Factors of Successful Women Engineers

    Science.gov (United States)

    Bornsen, Susan Edith

    2012-01-01

    It is no surprise that there is a shortage of women engineers. The reasons for the shortage have been researched and discussed in myriad papers, and suggestions for improvement continue to evolve. However, there are few studies that have specifically identified the positive aspects that attract women to engineering and keep them actively engaged…

  12. Conundrums, paradoxes, and surprises: a brave new world of biodiversity conservation

    Science.gov (United States)

    A.E. Lugo

    2012-01-01

    Anthropogenic activity is altering the global disturbance regime through such processes as urbanization, deforestation, and climate change. These disturbance events alter the environmental conditions under which organisms live and adapt and trigger succession, thus setting the biota in motion in both ecological and evolutionary space. The result is the mixing of...

  13. [Fall from height--surprising autopsy diagnosis in primarily unclear initial situations].

    Science.gov (United States)

    Schyma, Christian; Doberentz, Elke; Madea, Burkhard

    2012-01-01

    External post-mortem examination and first police assessments are often not consistent with subsequent autopsy results. This is all the more surprising the more serious the injuries found at autopsy are. Such discrepancies result especially from an absence of gross external injuries, as demonstrated by four examples. A 42-year-old, externally uninjured male was found at night in a helpless condition in the street and died in spite of resuscitation. Autopsy showed severe polytrauma with traumatic brain injury and lesions of the thoracic and abdominal organs. A jump from the third floor was identified as the cause. At dawn, a twenty-year-old male was found dead on the grounds of the adjacent house. Because of the blood-covered head the police assumed a traumatic head injury from a blunt impact. The external examination revealed only abrasions on the forehead and, to a minor extent, on the back. At autopsy a midfacial fracture, trauma of the thorax and abdomen and fractures of the spine and pelvis were detected. Subsequent investigations showed that the man, intoxicated by alcohol, had fallen from the flat roof of a multistoried house. A 77-year-old man was found unconscious on his terrace in the daytime; a cerebral seizure was assumed. He was transferred to emergency care, where he died. The corpse was externally inconspicuous. Autopsy revealed serious traumatic injuries of the brain, thorax, abdomen and pelvis, which could be explained by a fall from the balcony. A 47-year-old homeless person without any external injuries was found dead in a barn. An alcohol intoxication was assumed. At autopsy severe injuries of the brain and cervical spine were found, which were the result of a fall from a height of 5 m. On the basis of an external post-mortem examination alone, gross blunt force trauma cannot be reliably excluded.

  14. For Catholic Colleges, an Important Goal: Don't Surprise the Bishop

    Science.gov (United States)

    Supiano, Beckie

    2009-01-01

    Every college president's success depends on building good relationships with outside groups, whether donors, alumni, or legislators. Presidents of Roman Catholic colleges have one more party to please: the local bishop. In recent months, the bishop of Scranton, Pennsylvania, asked colleges in his diocese to assure him that they were not providing…

  15. The Surprising Impact of Seat Location on Student Performance

    Science.gov (United States)

    Perkins, Katherine K.; Wieman, Carl E.

    2005-01-01

    Every physics instructor knows that the most engaged and successful students tend to sit at the front of the class and the weakest students tend to sit at the back. However, it is normally assumed that this is merely an indication of the respective seat location preferences of weaker and stronger students. Here we present evidence suggesting that in fact this may be mixing up the cause and effect. It may be that the seat selection itself contributes to whether the student does well or poorly, rather than the other way around. While a number of studies have looked at the effect of seat location on students, the results are often inconclusive, and few, if any, have studied the effects in college classrooms with randomly assigned seats. In this paper, we report on our observations of a large introductory physics course in which we randomly assigned students to particular seat locations at the beginning of the semester. Seat location during the first half of the semester had a noticeable impact on student success in the course, particularly in the top and bottom parts of the grade distribution. Students sitting in the back of the room for the first half of the term were nearly six times as likely to receive an F as students who started in the front of the room. A corresponding but less dramatic reversal was evident in the fractions of students receiving As. These effects were in spite of many unusual efforts to engage students at the back of the class and a front-to-back reversal of seat location halfway through the term. These results suggest there may be inherent detrimental effects of large physics lecture halls that need to be further explored.

  16. The genome of Pelobacter carbinolicus reveals surprising metabolic capabilities and physiological features

    Energy Technology Data Exchange (ETDEWEB)

    Aklujkar, Muktak [University of Massachusetts, Amherst]; Haveman, Shelley [University of Massachusetts, Amherst]; DiDonato Jr., Raymond [University of Massachusetts, Amherst]; Chertkov, Olga [Los Alamos National Laboratory (LANL)]; Han, Cliff [Los Alamos National Laboratory (LANL)]; Land, Miriam L. [ORNL]; Brown, Peter [University of Massachusetts, Amherst]; Lovley, Derek [University of Massachusetts, Amherst]

    2012-01-01

    Background: The bacterium Pelobacter carbinolicus is able to grow by fermentation, syntrophic hydrogen/formate transfer, or electron transfer to sulfur from short-chain alcohols, hydrogen or formate; it does not oxidize acetate and is not known to ferment any sugars or grow autotrophically. The genome of P. carbinolicus was sequenced in order to understand its metabolic capabilities and physiological features in comparison with its relatives, acetate-oxidizing Geobacter species. Results: Pathways were predicted for catabolism of known substrates: 2,3-butanediol, acetoin, glycerol, 1,2-ethanediol, ethanolamine, choline and ethanol. Multiple isozymes of 2,3-butanediol dehydrogenase, ATP synthase and [FeFe]-hydrogenase were differentiated and assigned roles according to their structural properties and genomic contexts. The absence of asparagine synthetase and the presence of a mutant tRNA for asparagine encoded among RNA-active enzymes suggest that P. carbinolicus may make asparaginyl-tRNA in a novel way. Catabolic glutamate dehydrogenases were discovered, implying that the tricarboxylic acid (TCA) cycle can function catabolically. A phosphotransferase system for uptake of sugars was discovered, along with enzymes that function in 2,3-butanediol production. Pyruvate: ferredoxin/flavodoxin oxidoreductase was identified as a potential bottleneck in both the supply of oxaloacetate for oxidation of acetate by the TCA cycle and the connection of glycolysis to production of ethanol. The P. carbinolicus genome was found to encode autotransporters and various appendages, including three proteins with similarity to the geopilin of electroconductive nanowires. Conclusions: Several surprising metabolic capabilities and physiological features were predicted from the genome of P. carbinolicus, suggesting that it is more versatile than anticipated.

  17. Attitudes of Success.

    Science.gov (United States)

    Pendarvis, Faye

    This document investigates the attitudes of successful individuals, citing the achievement of established goals as the criteria for success. After offering various definitions of success, the paper focuses on the importance of self-esteem to success and considers ways by which the self-esteem of students can be improved. Theories of human behavior…

  18. ISTechnology – an Integrated Approach to IS Development and the Benefits of Its Use

    Directory of Open Access Journals (Sweden)

    Janis Iljins

    2011-07-01

    Full Text Available The ISTechnology system and the benefits of its use are analyzed in this paper. ISTechnology provides an integrated approach to business modeling and the development of information systems. The system consists of a meta-model and applications. The meta-model enables the definition of a platform-independent business model of the organization, while the applications provide the definition and interpretation of that business model. Interpretation of the business model supplies the functionality of the information system on the selected platform. The lessons learned confirm that the development and maintenance costs of information systems can be significantly reduced by using ISTechnology. The paper also describes additional benefits of ISTechnology, such as generation of documentation and easy migration of a system to another platform.

  19. Polynomial Chaos decomposition applied to stochastic dosimetry: study of the influence of the magnetic field orientation on the pregnant woman exposure at 50 Hz.

    Science.gov (United States)

    Liorni, I; Parazzini, M; Fiocchi, S; Guadagnin, V; Ravazzani, P

    2014-01-01

    Polynomial Chaos (PC) is a decomposition method used to build a meta-model that approximates the unknown response of a model. In this paper the PC method is applied to stochastic dosimetry to assess the variability of human exposure due to changes in the orientation of the B-field vector with respect to the human body. In detail, the exposure of a pregnant woman at 7 months of gestational age is analyzed in order to build a statistical meta-model of the induced electric field for each fetal tissue and for the fetal whole body, by means of a PC expansion as a function of the B-field orientation, considering uniform exposure at 50 Hz.
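
    The PC construction this abstract describes, projecting a model response onto an orthogonal polynomial basis of the random input and fitting the coefficients, can be sketched in miniature. The toy response, the uniform input on [-1, 1], and the expansion degree below are all illustrative assumptions; they are not the paper's dosimetric model.

```python
import math
import random

# Minimal 1-D sketch of a Polynomial Chaos surrogate. For a uniformly
# distributed input x in [-1, 1], the PC basis is the Legendre
# polynomials; coefficients are fitted here by least squares.

def legendre(n, x):
    """Legendre polynomial P_n(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

def fit_pc(model, degree, n_samples=200, seed=0):
    """Fit PC coefficients by solving the least-squares normal equations."""
    rng = random.Random(seed)
    xs = [rng.uniform(-1.0, 1.0) for _ in range(n_samples)]
    ys = [model(x) for x in xs]
    m = degree + 1
    A = [[sum(legendre(i, x) * legendre(j, x) for x in xs) for j in range(m)]
         for i in range(m)]
    b = [sum(legendre(i, x) * y for x, y in zip(xs, ys)) for i in range(m)]
    # Gaussian elimination with partial pivoting (small dense system).
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = sum(A[r][c] * coeffs[c] for c in range(r + 1, m))
        coeffs[r] = (b[r] - s) / A[r][r]
    return coeffs

def pc_eval(coeffs, x):
    """Evaluate the PC surrogate at a given input value."""
    return sum(c * legendre(i, x) for i, c in enumerate(coeffs))

# Toy "induced field" response standing in for the dosimetric model.
surrogate = fit_pc(lambda x: math.exp(0.5 * x), degree=4)
```

    Once fitted, the surrogate replaces the expensive model in Monte Carlo studies of exposure variability; statistics of the output can then be estimated from the cheap polynomial instead of the full simulation.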

  20. SUCCESSION MANAGEMENT: HUMAN RESOURCE PLANNING EFFORTS TOWARD A SUCCESSFUL CORPORATION

    Directory of Open Access Journals (Sweden)

    Rini Kuswati

    2010-06-01

    create a more flexible and dynamic approach to preparing future executives and to having the necessary leadership ready to meet the business challenges of the remainder of the decade and beyond. Succession management allows corporate leadership to instill a more dynamic process that is easier to integrate with the firm’s strategic initiatives. It better aligns organizational thinking with the external environment, where discontinuities make it possible to anticipate the full spectrum of change that a corporation will confront. It is a leadership and succession philosophy that focuses on developing the creativity and flexibility that allow for a more rapid response to change. Succession management is thus one way to become a successful corporation.

  1. Success in Science, Success in Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Mariann R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-25

    This is a series of four different scientific problems which were resolved through collaborations. They are: "Better flow cytometry through novel focusing technology", "Take Off®: Helping the Agriculture Industry Improve the Viability of Sustainable, Large-Production Crops", "The National Institutes of Health's Models of Infectious Disease Agent Study (MIDAS)", and "Expanding the capabilities of SOLVE/RESOLVE through the PHENIX Consortium." For each one, the problem is listed, the solution, advantages, bottom line, then information about the collaboration including: developing the technology, initial success, and continued success.

  2. The evolution of polyandry: patterns of genotypic variation in female mating frequency, male fertilization success and a test of the sexy-sperm hypothesis.

    Science.gov (United States)

    Simmons, L W

    2003-07-01

    The sexy-sperm hypothesis predicts that females obtain indirect benefits for their offspring via polyandry, in the form of increased fertilization success for their sons. I use a quantitative genetic approach to test the sexy-sperm hypothesis using the field cricket Teleogryllus oceanicus. Previous studies of this species have shown considerable phenotypic variation in fertilization success when two or more males compete. There were high broad-sense heritabilities for both paternity and polyandry. Patterns of genotypic variance were consistent with X-linked inheritance and/or maternal effects on these traits. The genetic architecture therefore precludes the evolution of polyandry via a sexy-sperm process: the positive genetic correlation between paternity in sons and polyandry in daughters predicted by the sexy-sperm hypothesis was absent. There was significant heritable variation in the investment by females in ovaries and by males in the accessory gland. Surprisingly, there was a very strong genetic correlation between these two traits. The significance of this genetic correlation for the coevolution of male seminal products and polyandry is discussed.

  3. A Multi-Year Study on Rice Morphological Parameter Estimation with X-Band Polsar Data

    Directory of Open Access Journals (Sweden)

    Onur Yuzugullu

    2017-06-01

    Full Text Available Rice fields have been monitored with spaceborne Synthetic Aperture Radar (SAR) systems for decades. SAR is an essential source of data and allows for the estimation of plant properties such as canopy height, leaf area index, phenological phase, and yield. However, information on detailed plant morphology at meter-scale resolution is necessary for the development of better management practices. This letter presents the results of a procedure that estimates the stalk height, leaf length and leaf width of rice fields from copolar X-band TerraSAR-X time-series data, given the a priori phenological phase. The methodology relies on a computationally efficient stochastic inversion algorithm applied to a metamodel that mimics a radiative-transfer-theory-driven electromagnetic scattering (EM) model. The EM model and its metamodel are employed to simulate the backscattering intensities from flooded rice fields based on their simplified physical structures. The results of the inversion procedure are found to be accurate for the cultivation seasons from 2013 to 2015, with root mean square errors of less than 13.5 cm for stalk height, 7 cm for leaf length, and 4 mm for leaf width. The results of this research provide new perspectives on the use of EM models and computationally efficient metamodels for agricultural management practices.
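
    The inversion step described above, drawing candidate morphologies and matching metamodel-predicted backscatter against the observation, can be sketched as a simple Monte Carlo search. The linear toy metamodel, the parameter bounds, and the "observed" value below are assumptions for illustration only; the paper's surrogate is trained on a full EM scattering model.

```python
import random

# Illustrative stochastic metamodel inversion: draw candidate rice
# morphologies, push them through a (toy) forward metamodel of the
# backscattering intensity, and keep the best match to the observation.

def toy_metamodel(stalk_h, leaf_len, leaf_w):
    """Stand-in surrogate: backscatter (dB) as a smooth function of
    stalk height (m), leaf length (m) and leaf width (m). The linear
    form and coefficients are assumptions, not the paper's model."""
    return -20.0 + 8.0 * stalk_h - 5.0 * leaf_len + 40.0 * leaf_w

def invert(observed_db, bounds, n_draws=20000, seed=0):
    """Monte Carlo inversion: minimise |metamodel - observation|."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(n_draws):
        cand = tuple(rng.uniform(lo, hi) for lo, hi in bounds)
        err = abs(toy_metamodel(*cand) - observed_db)
        if err < best_err:
            best, best_err = cand, err
    return best, best_err

# Parameter bounds for a mid-season phenological phase (assumed values):
# stalk height (m), leaf length (m), leaf width (m).
bounds = [(0.3, 1.2), (0.1, 0.5), (0.005, 0.02)]
truth = (0.8, 0.3, 0.012)
obs = toy_metamodel(*truth)
estimate, err = invert(obs, bounds)
```

    Note that a single backscatter value under-determines three parameters, which is why the real procedure constrains the search with the phenological phase and multi-date acquisitions.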

  4. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States)]; Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)]

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.
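
    The sampling loop described above can be sketched on a toy problem. A one-dimensional Gaussian naive Bayes stands in for the Bayesian network classifier of the paper, and the objective, design space, and good/bad labeling rule are illustrative assumptions.

```python
import math
import random

# Minimal sketch of classifier-guided sampling (CGS) on a discrete,
# discontinuous toy objective with unique optimum at x = 7.

def objective(x):
    # Discontinuous response over the discrete design space 0..20.
    return -abs(x - 7) - (5 if x % 4 == 0 else 0)

def fit_classifier(points, labels):
    """Per-class mean/variance for a 1-D Gaussian naive Bayes."""
    stats = {}
    for cls in ("good", "bad"):
        vals = [p for p, l in zip(points, labels) if l == cls]
        mu = sum(vals) / len(vals)
        var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-6
        stats[cls] = (mu, var, len(vals) / len(points))

    def is_good(x):
        def score(cls):
            mu, var, prior = stats[cls]
            return (-0.5 * (x - mu) ** 2 / var
                    - 0.5 * math.log(var) + math.log(prior))
        return score("good") > score("bad")

    return is_good

def cgs(n_iter=5, batch=5, seed=1):
    rng = random.Random(seed)
    design_space = list(range(21))
    evaluated = {x: objective(x) for x in rng.sample(design_space, 8)}
    for _ in range(n_iter):
        # Label the top half of the observed designs as "good".
        ranked = sorted(evaluated, key=evaluated.get, reverse=True)
        cut = len(ranked) // 2
        labels = ["good" if i < cut else "bad" for i in range(len(ranked))]
        is_good = fit_classifier(ranked, labels)
        # Evaluate a new batch, preferring candidates classified "good".
        candidates = [x for x in design_space if x not in evaluated]
        rng.shuffle(candidates)
        candidates.sort(key=lambda x: not is_good(x))  # "good" first
        for x in candidates[:batch]:
            evaluated[x] = objective(x)
    return max(evaluated, key=evaluated.get)

best_design = cgs()
```

    The classifier only steers which unevaluated designs get sampled first; because every evaluation is kept, the search degrades gracefully to exhaustive sampling when the classifier is uninformative.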

  5. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    Full Text Available This article presents original probabilistic price forecasting meta-models (PPFMCP models), built by aggregation of competitive predictors, for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be obtained directly from the expected value and variance associated with the ensemble for that hour, using three aggregation strategies of predictor forecasts corresponding to three PPFMCP models. A Reliability Indicator (RI) and a Loss function Indicator (LI) are also introduced to give a measure of the uncertainty of probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results showed that PPFMCP model 2, which aggregates predictors weighted according to their daily ranks, was the best probabilistic meta-model in terms of mean absolute errors as well as of RI and LI. PPFMCP model 1, which averages the predictor forecasts, was the second-best meta-model. PPFMCP models allow risk decisions based on price to be evaluated.
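
    Recovering Beta PDF parameters directly from an ensemble's expected value and variance, as the abstract describes, is standard moment matching: with mean m and variance v on [0, 1], set ν = m(1−m)/v − 1, α = mν, β = (1−m)ν. The sketch below assumes prices rescaled to [0, 1] with hypothetical bounds; the function names and the normalisation are illustrative, not taken from the paper.

```python
def beta_from_moments(mean, var):
    """Method-of-moments Beta(alpha, beta) parameters on [0, 1].

    Requires var < mean * (1 - mean); otherwise no Beta matches.
    """
    if not 0.0 < mean < 1.0:
        raise ValueError("mean must lie strictly inside (0, 1)")
    nu = mean * (1.0 - mean) / var - 1.0
    if nu <= 0.0:
        raise ValueError("variance too large for a Beta distribution")
    return mean * nu, (1.0 - mean) * nu

def beta_from_price_ensemble(prices, lo, hi):
    """Fit a Beta PDF to an ensemble of hourly price forecasts by
    rescaling them onto [0, 1]. The bounds lo/hi are an assumption
    here; the paper does not publish its exact normalisation."""
    scaled = [(p - lo) / (hi - lo) for p in prices]
    m = sum(scaled) / len(scaled)
    v = sum((s - m) ** 2 for s in scaled) / (len(scaled) - 1)
    return beta_from_moments(m, v)
```

    For example, an ensemble with rescaled mean 0.5 and variance 0.05 yields the symmetric Beta(2, 2); tighter ensembles yield larger α and β and hence a narrower predictive density.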

  6. Building Successful Information Systems – a Key for Successful Organization

    Directory of Open Access Journals (Sweden)

    Doina ROSCA

    2010-12-01

    Full Text Available An Information System (IS) can have a major impact on corporate strategy and organizational success. The involvement of managers and decision makers in all aspects of information systems is a major factor for organizational success, including higher profits and lower costs. Some of the benefits business organizations seek to achieve through information systems include: better safety, competitive advantage, fewer errors, greater accuracy, higher quality products, improved communications, increased efficiency and productivity, more efficient administration, and superior financial and managerial decision making.

  7. SPEM: Software Process Engineering Metamodel

    OpenAIRE

    Víctor Hugo Menéndez Domínguez; María Enriqueta Castellanos Bolaños

    2015-01-01

    All organizations involved in software development need to establish, manage, and support the development work. The term "software development process" tends to unify all the activities and practices that cover those needs. Modeling the software process is one way to improve development and the quality of the resulting applications. Among all the existing languages for process modeling, those based on work products are the…

  8. Successive neuron loss in the thalamus and cortex in a mouse model of infantile neuronal ceroid lipofuscinosis.

    Science.gov (United States)

    Kielar, Catherine; Maddox, Lucy; Bible, Ellen; Pontikis, Charlie C; Macauley, Shannon L; Griffey, Megan A; Wong, Michael; Sands, Mark S; Cooper, Jonathan D

    2007-01-01

    Infantile neuronal ceroid lipofuscinosis (INCL) is caused by deficiency of the lysosomal enzyme, palmitoyl protein thioesterase 1 (PPT1). We have investigated the onset and progression of pathological changes in Ppt1 deficient mice (Ppt1-/-) and the development of their seizure phenotype. Surprisingly, cortical atrophy and neuron loss occurred only late in disease progression but were preceded by localized astrocytosis within individual thalamic nuclei and the progressive loss of thalamic neurons that relay different sensory modalities to the cortex. This thalamic neuron loss occurred first within the visual system and only subsequently in auditory and somatosensory relay nuclei or the inhibitory reticular thalamic nucleus. The loss of granule neurons and GABAergic interneurons followed in each corresponding cortical region, before the onset of seizure activity. These findings provide novel evidence for successive neuron loss within the thalamus and cortex in Ppt1-/- mice, revealing the thalamus as an important early focus of INCL pathogenesis.

  9. Effect of Temperature Shock and Inventory Surprises on Natural Gas and Heating Oil Futures Returns

    Science.gov (United States)

    Hu, John Wei-Shan; Lin, Chien-Yu

    2014-01-01

    The aim of this paper is to examine the impact of temperature shocks on both near-month and far-month natural gas and heating oil futures returns by extending the weather and storage models of the previous study. Several notable findings from the empirical studies are presented. First, the expected temperature shock significantly and positively affects both the near-month and far-month natural gas and heating oil futures returns. Next, a significant temperature shock has an effect on both the conditional mean and volatility of natural gas and heating oil prices. The results indicate that expected inventory surprises significantly and negatively affect the far-month natural gas futures returns. Moreover, volatility of natural gas futures returns is higher on Thursdays, and that of near-month heating oil futures returns is higher on Wednesdays, than on other days. Finally, it is found that the storage announcement for natural gas significantly affects near-month and far-month natural gas futures returns. Furthermore, both natural gas and heating oil futures returns are affected more by the weighted average temperature reported by multiple weather reporting stations than by that reported by a single weather reporting station. PMID:25133233

  10. Determining Success Criteria and Success Factors for International Construction Projects for Malaysian Contractors

    Directory of Open Access Journals (Sweden)

    Ali Mohammed Alashwal

    2017-06-01

    Full Text Available The success of international construction projects is fraught with various challenges such as competitiveness, lack of resources, a volatile global economy, and specific conditions in the host country. Malaysian contractors have been venturing into the global construction market since the early 1980s. However, their ventures were not always successful, and the number of international projects awarded to Malaysian contractors has dropped drastically during the past decade. Drawing on this experience, this paper aims to identify the success criteria and success factors of international construction projects. The data was collected from 120 respondents using a questionnaire survey and analysed using principal component analysis and regression analysis. The results revealed three principal criteria of project success, namely Management Success, Functional Success, and Organisation Success. The main components of success factors include Team Power and Skills, Resource Availability, External Environment, Organisation Capability, Project Support, and Project Organisation. Further analysis emphasized the importance of strong financing capacity of contractors, the project's social environment, and the competence of the project manager in achieving project success. The results of this paper can serve as a guideline for contractors and project managers seeking success in this context. Future studies may provide in-depth analysis of success criteria and success factors specific to construction project type and host-country location.

  11. Editorial: Selling a sugar tax: the sweet smell of success?

    Science.gov (United States)

    Lloyd-Williams, Ffion; Capewell, Simon

    2016-09-01

    This editorial briefly considers the increasing epidemic of obesity and Type 2 diabetes, the underlying drivers of junk food and sugary drinks, and the recent scientific and campaigning movements culminating in the UK Chancellor's surprise announcement of a Sugary Drinks Levy. Copyright© 2016 Dennis Barber Ltd.

  12. Toward An Ontology of Mutual Recursion: Models, Mind and Media

    Directory of Open Access Journals (Sweden)

    Mat Wall-Smith

    2008-01-01

    Full Text Available In Parables for the Virtual, Massumi describes 'The Autonomy of Affect' in our ecology of thought (Massumi 2002: 35). The object of Stiegler's Technics and Time is 'technics apprehended as the horizon of all possibility to come and all possibility of a future' (Stiegler 1998: ix). The ecological dynamic described by the recursion between this 'affective autonomy' and a 'technical horizon of possibility' describes a metamodel of the relation between body and world, between perception and expression. I argue that this metamodel allows for the technical architectures that enshrine media processes and models as both the manifestation and modulation of the 'industry' or vitality of mind. I argue that these technical architectures are crucial to the creation and maintenance of dynamic ecologies of living.

  13. Redefining reproductive success in songbirds: Moving beyond the nest success paradigm

    Science.gov (United States)

    Streby, Henry M.; Refsnider, Jeanine M.; Andersen, David E.

    2014-01-01

    One of the most commonly estimated parameters in studies of songbird ecology is reproductive success, as a measure of either individual fitness or population productivity. Traditionally, the “success” in reproductive success refers to whether, or how many, nestlings leave nests. Here, we advocate that “reproductive success” in songbirds be redefined as full-season productivity, or the number of young raised to independence from adult care in a breeding season. A growing body of evidence demonstrates interdependence between nest success and fledgling survival, and emphasizes that data from either life stage alone can produce misleading measures of individual fitness and population productivity. Nest success, therefore, is an insufficient measure of reproductive success, and songbird ecology needs to progress beyond this long-standing paradigm. Full-season productivity, an evolutionarily rational measure of reproductive success, provides the framework for appropriately addressing unresolved questions about the adaptive significance of many breeding behaviors and within which effective breeding-grounds conservation and management can be designed.

  14. Summit surprises.

    Science.gov (United States)

    Myers, N

    1994-01-01

    A New Delhi Population Summit, organized by the Royal Society, the US National Academy of Sciences, the Royal Swedish Academy of Sciences, and the Indian National Science Academy, was convened with representation of 120 (only 10% women) scientists from 50 countries and about 12 disciplines and 43 national scientific academies. Despite the common assumption that scientists never agree, a 3000 word statement was signed by 50 prominent national figures and supported by 25 professional papers on diverse subjects. The statement proclaimed that stable world population and "prodigious planning efforts" are required for dealing with global social, economic, and environmental problems. The target should be zero population growth by the next generation. The statement, although containing many uncompromising assertions, was not as strong as a statement by the Royal Society and the US National Academy of Sciences released last year: that, in the future, science and technology may not be able to prevent "irreversible degradation of the environment and continued poverty," and that the capacity to sustain life on the planet may be permanently jeopardized. The Delhi statement was backed by professional papers highlighting several important issues. Dr Mahmoud Fathalla of the Rockefeller Foundation claimed that the 500,000 annual maternal deaths worldwide, of which perhaps 33% are due to "coathanger" abortions, are given far less attention than a one-day political event of 500 deaths would receive. Although biologically women have been given a greater survival advantage, which is associated with their reproductive capacity, socially disadvantaged females are relegated to low status. There is poorer nutrition and overall health care for females, female infanticide, and female fetuses are increasingly aborted in China, India, and other countries. 
The sex ratio in developed countries is 95-97 males to every 100 females, but in developing Asian countries the ratio is 105 males to 100 females. There are reports of 60-100 million missing females. The human species 12,000 years ago had a population of 6 million, a life expectancy of 20 years, and a doubling time of 8000 years; high birth rates were important for preservation of the species. Profertility attitudes are still prevalent today. Insufficient funds go to contraceptive research.

  15. Surprises from a Deep ASCA Spectrum of the Broad Absorption Line Quasar PHL 5200

    Science.gov (United States)

    Mathur, Smita; Matt, G.; Green, P. J.; Elvis, M.; Singh, K. P.

    2002-01-01

    We present a deep (approx. 85 ks) ASCA observation of the prototype broad absorption line quasar (BALQSO) PHL 5200. This is the best X-ray spectrum of a BALQSO yet. We find the following: (1) The source is not intrinsically X-ray weak. (2) The line-of-sight absorption is very strong, with N(sub H) = 5 x 10(exp 23)/sq cm. (3) The absorber does not cover the source completely; the covering fraction is approx. 90%. This is consistent with the large optical polarization observed in this source, implying multiple lines of sight. The most surprising result of this observation is that (4) the spectrum of this BALQSO is not exactly similar to other radio-quiet quasars. The hard X-ray spectrum of PHL 5200 is steep, with the power-law spectral index alpha approx. 1.5. This is similar to the steepest hard X-ray slopes observed so far. At low redshifts, such steep slopes are observed in narrow-line Seyfert 1 (NLS1) galaxies, believed to be accreting at a high Eddington rate. This observation strengthens the analogy between BALQSOs and NLS1 galaxies and supports the hypothesis that BALQSOs represent an early evolutionary state of quasars. It is well accepted that the orientation to the line of sight determines the appearance of a quasar: age seems to play a significant role as well.

  16. Research into Success

    Directory of Open Access Journals (Sweden)

    Bogomir Novak

    1997-12-01

    Full Text Available As competition is becoming ever more fierce, research into the prerequisites for success is gaining ground. By most people, success is perceived as an external phenomenon, but it is in fact the consequence of a person's readiness to perform in the world (of business). In the paper, Novak distinguishes between internal, external and group success. The essence of internal success, which is the condition for the other two types of success, is assuming responsibility for, and exercising self-control over, one's psychic phenomena. This in fact means that one needs to "reprogramme" old patterns of behaviour and substitute new ones for them, which leads to personality changes based on the understanding and acceptance of the self and others as they are. In realizing personal abilities, motives and goals, mental guiding laws must also be taken into account. Nowadays, the overall success of an organization is an important indicator of the quality of group work. The working patterns of individuals comply with the patterns used by their colleagues. When we do something for ourselves, we do it for others. In certain organizations, through accepted ways of communication all people become successful, and nobody needs to be paid off. Employees wholly identify themselves with their organization, and vice versa. This three-part paradigm (I-Others-Community) is the basis for various models of practical training for success, which are often idealized, but are primarily aimed at abolishing passivity and flaws in the system and its wider environment.

  17. Metaproteomics of cellulose methanisation under thermophilic conditions reveals a surprisingly high proteolytic activity.

    Science.gov (United States)

    Lü, Fan; Bize, Ariane; Guillot, Alain; Monnet, Véronique; Madigou, Céline; Chapleur, Olivier; Mazéas, Laurent; He, Pinjing; Bouchez, Théodore

    2014-01-01

    Cellulose is the most abundant biopolymer on Earth. Optimising energy recovery from this renewable but recalcitrant material is a key issue. The metaproteome expressed by thermophilic communities during cellulose anaerobic digestion was investigated in microcosms. By multiplying the analytical replicates (65 protein fractions analysed by MS/MS) and relying solely on public protein databases, more than 500 non-redundant protein functions were identified. The taxonomic community structure as inferred from the metaproteomic data set was in good overall agreement with 16S rRNA gene tag pyrosequencing and fluorescent in situ hybridisation analyses. Numerous functions related to cellulose and hemicellulose hydrolysis and fermentation catalysed by bacteria related to Caldicellulosiruptor spp. and Clostridium thermocellum were retrieved, indicating their key role in the cellulose-degradation process and also suggesting their complementary action. Despite the abundance of acetate as a major fermentation product, key methanogenesis enzymes from the acetoclastic pathway were not detected. In contrast, enzymes from the hydrogenotrophic pathway affiliated to Methanothermobacter were almost exclusively identified for methanogenesis, suggesting a syntrophic acetate oxidation process coupled to hydrogenotrophic methanogenesis. Isotopic analyses confirmed the high dominance of the hydrogenotrophic methanogenesis. Very surprising was the identification of an abundant proteolytic activity from Coprothermobacter proteolyticus strains, probably acting as scavenger and/or predator performing proteolysis and fermentation. Metaproteomics thus appeared as an efficient tool to unravel and characterise metabolic networks as well as ecological interactions during methanisation bioprocesses. More generally, metaproteomics provides direct functional insights at a limited cost, and its attractiveness should increase in the future as sequence databases are growing exponentially.

  18. Old Star's "Rebirth" Gives Astronomers Surprises

    Science.gov (United States)

    2005-04-01

    Astronomers using the National Science Foundation's Very Large Array (VLA) radio telescope are taking advantage of a once-in-a-lifetime opportunity to watch an old star suddenly stir back into new activity after coming to the end of its normal life. Their surprising results have forced them to change their ideas of how such an old, white dwarf star can re-ignite its nuclear furnace for one final blast of energy. Sakurai's Object Radio/Optical Images of Sakurai's Object: Color image shows nebula ejected thousands of years ago. Contours indicate radio emission. Inset is Hubble Space Telescope image, with contours indicating radio emission; this inset shows just the central part of the region. CREDIT: Hajduk et al., NRAO/AUI/NSF, ESO, StSci, NASA Computer simulations had predicted a series of events that would follow such a re-ignition of fusion reactions, but the star didn't follow the script -- events moved 100 times more quickly than the simulations predicted. "We've now produced a new theoretical model of how this process works, and the VLA observations have provided the first evidence supporting our new model," said Albert Zijlstra, of the University of Manchester in the United Kingdom. Zijlstra and his colleagues presented their findings in the April 8 issue of the journal Science. The astronomers studied a star known as V4334 Sgr, in the constellation Sagittarius. It is better known as "Sakurai's Object," after Japanese amateur astronomer Yukio Sakurai, who discovered it on February 20, 1996, when it suddenly burst into new brightness. At first, astronomers thought the outburst was a common nova explosion, but further study showed that Sakurai's Object was anything but common. The star is an old white dwarf that had run out of hydrogen fuel for nuclear fusion reactions in its core. Astronomers believe that some such stars can undergo a final burst of fusion in a shell of helium that surrounds a core of heavier nuclei such as carbon and oxygen. However, the

  19. Know Your Enemy: Successful Bioinformatic Approaches to Predict Functional RNA Structures in Viral RNAs

    Science.gov (United States)

    Lim, Chun Shen; Brown, Chris M.

    2018-01-01

    Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses, e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power, we have progressed from stem-loop prediction on single sequences to cutting-edge 3D prediction, and from command-line tools to user-friendly web interfaces. Despite these advances, many powerful, user-friendly prediction tools and resources are underutilized by the virology community. PMID:29354101

  20. A successful effort to involve stakeholders in a facility siting decision using LIPS with stakeholder involvement

    International Nuclear Information System (INIS)

    Merkhofer, L.; Conway, R.; Anderson, B.

    1995-01-01

    Local public opposition to federal bureaucratic decisions has resulted in public agencies rethinking the role of stakeholders in decision making. Efforts to include stakeholders directly in the decision-making process are on the increase. Unfortunately, many attempts to involve members of the public in decisions involving complex technical issues have failed. A key problem has been defining a meaningful role for the public in the process of arriving at a technical decision. This paper describes a successful effort by Sandia National Laboratories (SNL) in New Mexico to involve stakeholders in an important technical decision associated with its Environmental Restoration (ER) Project. The decision was where to locate a Corrective Action Management Unit (CAMU), a facility intended to consolidate and store wastes generated from the cleanup of hazardous waste sites. A formal priority-setting process known as the Laboratory Integration Prioritization System (LIPS) was adapted to provide an approach for involving the public. Although rarely applied to stakeholder participation, the LIPS process proved surprisingly effective. It produced a consensus on a selected site and enhanced public trust and understanding of Project activities.

  1. A modified multi-objective particle swarm optimization approach and its application to the design of a deepwater composite riser

    Science.gov (United States)

    Zheng, Y.; Chen, J.

    2017-09-01

    A modified multi-objective particle swarm optimization method is proposed for obtaining Pareto-optimal solutions effectively. Different from traditional multi-objective particle swarm optimization methods, Kriging meta-models and the trapezoid index are introduced and integrated into the traditional method. Kriging meta-models are built to match expensive or black-box functions. By applying Kriging meta-models, the number of function evaluations is decreased and the boundary Pareto-optimal solutions are identified rapidly. For bi-objective optimization problems, the trapezoid index is calculated as the sum of the areas of the trapezoids formed by the Pareto-optimal solutions and one objective axis. It can serve as a measure of whether the Pareto-optimal solutions converge to the Pareto front. Illustrative examples indicate that, to obtain Pareto-optimal solutions, the proposed method needs fewer function evaluations than the traditional multi-objective particle swarm optimization method and the non-dominated sorting genetic algorithm II method, and both accuracy and computational efficiency are improved. The proposed method is also applied to the design of a deepwater composite riser, in which the structural performances are calculated by numerical analysis. The design aim was to enhance tensile strength and minimize cost. Under the buckling constraint, the optimal trade-off of tensile strength and material volume is obtained. The results demonstrate that the proposed method can effectively deal with multi-objective optimization with black-box functions.
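The trapezoid index is not given in closed form in this record; for a bi-objective minimization problem it can plausibly be read as the trapezoidal-rule area between the piecewise-linear Pareto front and the first objective axis. A minimal sketch under that assumption (the function name and sample points are illustrative, not from the paper):

```python
def trapezoid_index(points):
    """Area between a bi-objective Pareto front and the f1 axis.

    points: iterable of (f1, f2) Pareto-optimal solutions for a
    minimization problem. The front is sorted by f1 and the area under
    the piecewise-linear curve f2(f1) is accumulated trapezoid by
    trapezoid; a stabilizing value across iterations suggests the
    front has converged.
    """
    pts = sorted(points)
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += 0.5 * (y0 + y1) * (x1 - x0)
    return area

# Three points on a front spanning f1 = 0 to f1 = 2
index = trapezoid_index([(0.0, 1.0), (1.0, 0.5), (2.0, 0.0)])  # 1.0
```

In this reading, the index complements dominance-based comparisons: it is cheap to track between swarm iterations and needs no reference front.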

  2. Revisiting the Relationship between Marketing Education and Marketing Career Success

    Science.gov (United States)

    Bacon, Donald R.

    2017-01-01

    In a replication of a classic article by Hunt, Chonko, and Wood, regression analysis was conducted using data from a sample of 864 marketing professionals. In contrast to Hunt, Chonko, and Wood, an undergraduate degree in marketing was positively related to income in marketing jobs, but surprisingly, respondents with some nonmarketing majors…

  3. A first formal link between the price equation and an optimization program.

    Science.gov (United States)

    Grafen, Alan

    2002-07-07

    The Darwin unification project is pursued. A meta-model encompassing an important class of population genetic models is formed by adding an abstract model of the number of successful gametes to the Price equation under uncertainty. A class of optimization programs is defined to represent the "individual-as-maximizing-agent analogy" in a general way. It is then shown that for each population genetic model there is a corresponding optimization program with which formal links can be established. These links provide a secure logical foundation for the commonplace biological principle that natural selection leads organisms to act as if maximizing their "fitness", provide a definition of "fitness", and clarify the limitations of that principle. The situations covered do not include frequency dependence or social behaviour, but the approach is capable of extension.
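For reference, the Price equation that the meta-model above builds on, in its standard form (the paper's version adds uncertainty and an abstract model of successful gamete number, which is not reproduced here): the change in the population mean of a trait z is partitioned into a selection term and a transmission term, with w_i the fitness of individual i and w-bar the mean fitness.

```latex
\Delta \bar{z}
  = \underbrace{\frac{\operatorname{Cov}(w_i,\, z_i)}{\bar{w}}}_{\text{selection}}
  \;+\;
  \underbrace{\frac{\operatorname{E}\!\left[ w_i \, \Delta z_i \right]}{\bar{w}}}_{\text{transmission}}
```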

  4. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
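The narrow-versus-broad effect reported above can be illustrated with a simple information-theoretic sketch: under a Gaussian predictive model, the surprisal (negative log-likelihood) of a physically identical outlier is larger when the learned distribution is narrow than when it is broad. The distribution parameters below are illustrative, not the study's stimulus values:

```python
import math

def surprisal(x, mu, sigma):
    """Shannon surprise, -ln p(x), of x under a Normal(mu, sigma**2) model."""
    z = (x - mu) / sigma
    return 0.5 * z * z + math.log(sigma * math.sqrt(2.0 * math.pi))

# The same outlier tone (3 units from the mean) is more surprising
# under a narrow distribution than under a broad one.
narrow = surprisal(3.0, 0.0, 1.0)
broad = surprisal(3.0, 0.0, 3.0)
```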

  5. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable Seismic History is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when statistical significance testing is applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can MISLEAD TO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION. The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a

  6. Successful ageing

    DEFF Research Database (Denmark)

    Bülow, Morten Hillgaard; Söderqvist, Thomas

    2014-01-01

    Since the late 1980s, the concept of ‘successful ageing’ has set the frame for discourse about contemporary ageing research. Through an analysis of the reception to John W. Rowe and Robert L. Kahn's launch of the concept of ‘successful ageing’ in 1987, this article maps out the important themes...... and discussions that have emerged from the interdisciplinary field of ageing research. These include an emphasis on interdisciplinarity; the interaction between biology, psycho-social contexts and lifestyle choices; the experiences of elderly people; life-course perspectives; optimisation and prevention...... strategies; and the importance of individual, societal and scientific conceptualisations and understandings of ageing. By presenting an account of the recent historical uses, interpretations and critiques of the concept, the article unfolds the practical and normative complexities of ‘successful ageing’....

  7. Real-time simulation of biological soft tissues: a PGD approach.

    Science.gov (United States)

    Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F

    2013-05-01

    We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition techniques, a generalization of proper orthogonal decomposition (POD). Proper generalized decomposition techniques can be considered as a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which the results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
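PGD itself builds its separated representation on the fly and is not reproduced here; the a posteriori technique it generalizes, proper orthogonal decomposition (POD), is easier to sketch and shows the same offline/online split. In the toy below (synthetic snapshot data; all names illustrative), an expensive snapshot matrix is compressed into a small basis offline, after which reconstruction from a handful of reduced coordinates is cheap enough for real-time evaluation:

```python
import numpy as np

rng = np.random.default_rng(0)
modes_true = rng.normal(size=(200, 3))   # three underlying spatial modes
coeffs = rng.normal(size=(3, 50))        # their amplitudes over 50 snapshots
snapshots = modes_true @ coeffs          # rank-3 "full model" snapshot matrix

# Offline phase: SVD of the snapshots; keep the leading r modes as a basis.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 3
basis = U[:, :r]

# Online phase: work with r coefficients instead of 200 degrees of freedom.
reduced = basis.T @ snapshots            # project onto the reduced basis
reconstructed = basis @ reduced          # cheap reconstruction

rel_error = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
```

Because the synthetic data has exact rank 3, the reduced model reproduces the snapshots to machine precision; real tissue simulations trade a small truncation error for the speedup.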

  8. An Agent Based Approach to Coordination of Resource Allocation and Process Performance

    DEFF Research Database (Denmark)

    Umair, Aisha

    2018-01-01

    resource allocation and process performance in CPSoCPS. The proposed coordination mechanism constitutes a meta-model of CPSoCPS, intra-constituent optimisation model and inter-constituent negotiation model. The meta-model of CPSoCPS describes how multiple autonomous constituent-CPSs are networked together...... enhanced functionality and performance compared to that of the sum of individual systems. In this regard, the concept of Cyber-Physical Systems (CPSs) has emerged in recent years. CPSs are the systems, which combine computational algorithms and communication with physical processes. The System...... type of SoS where each constituent system constitutes a CPS. An important challenge in this case is to develop seamless collaboration between the constituent-CPSs to coordinate the operations of several autonomous-yet- interacting CPSs. In this thesis, we propose a coordination mechanism to coordinate...

  9. Flexible Software Process Lines in Practice

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2016-01-01

    Process flexibility and adaptability is a frequently discussed topic in literature, and several approaches propose techniques to improve and optimize software processes for a given organization- or project context. A software process line (SPrL) is an instrument to systematically construct...... that can be adapted to the respective context. In this article, we present an approach to construct flexible software process lines and show its practical application in the German V-Modell XT. The presented approach emerges from a 10-year research endeavor and was used to enhance the metamodel of the V......-Modell XT and to allow for improved process variability and lifecycle management. Practical dissemination and complementing empirical research show the suitability of the concept. We therefore contribute a proven approach that is presented as metamodel fragment for reuse and implementation in further...

  10. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    , a logic-based specification language. The drawback of MS DSL Tools is it does not provide a formal and rigorous approach for semantics specifications. In this framework, we use Microsoft DSL Tools to define the metamodel and graphical notations of DSLs, and an extended version of ForSpec as a formal......In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec...... language to define their semantics. Integrating these technologies under the umbrella of Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages....

  11. Dual-mode nested search method for categorical uncertain multi-objective optimization

    Science.gov (United States)

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.
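The record does not spell out the standard regular simplex mapping (SRSM); one minimal construction in the same spirit maps k categorical levels to the k standard basis vectors of R^k, which form the vertices of a regular (k-1)-simplex with all pairwise distances equal to sqrt(2), so no spurious ordering is imposed on the categories seen by the kriging correlation function. The function name and category labels below are illustrative assumptions, not the paper's notation:

```python
import itertools
import math

def simplex_coordinates(categories):
    """Map k categorical levels to vertices of a regular (k-1)-simplex.

    Uses the standard basis vectors of R^k, whose pairwise distances
    are all sqrt(2), so every pair of categories is equally distant.
    """
    k = len(categories)
    return {c: tuple(1.0 if i == j else 0.0 for j in range(k))
            for i, c in enumerate(categories)}

coords = simplex_coordinates(["steel", "titanium", "composite"])
pair_dists = [math.dist(coords[a], coords[b])
              for a, b in itertools.combinations(coords, 2)]
```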

  12. The Project of Success

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    more complicated matter than meeting targets. While success may ultimately be justified in terms of a correspondence between aims and achievements, the understanding of both aspects is highly dependent on the project process. An example of a successful project that did not meet the original performance...... targets will serve to show that success is at matter of perspective as much as it is a matter of achievement. Other types of research, e.g. social psychology, have addressed the issue of success more explicitly. I draw on such literature to conceptualize project success anew and to reestablish...

  13. Paving the Road to Success: A Framework for Implementing the Success Tutoring Approach

    Directory of Open Access Journals (Sweden)

    Spark Linda

    2017-12-01

    Full Text Available The exponential growth of higher education enrolment in South Africa has resulted in increased diversity of the student body, leading to a proliferation of factors that affect student performance and success. Various initiatives have been adopted by tertiary institutions to mitigate the negative impact these factors may have on student success, and it is suggested that interventions that include aspects of social integration are the most successful. This paper outlines an approach called Success Tutoring (a non-academic tutorial approach used as part of a student success and support programme in the Faculty of Commerce, Law, and Management at the University of the Witwatersrand), which is underscored by empirical evidence drawn from evaluation data collected during Success Tutor symposia. The authors draw conclusions and make recommendations based on a thematic analysis of the dataset, and ultimately provide readers with a framework for implementing Success Tutoring at their tertiary institutions.

  14. Citation Success

    DEFF Research Database (Denmark)

    Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis

    2012-01-01

    This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history...... find similar patterns when assessing the same authors' citation success in economics journals. As a novel feature, we demonstrate that the diffusion of research — publication of working papers, as well as conference and workshop presentations — has a first-order positive impact on the citation rate........ Consistent with our expectations, we find that full professors, authors appointed at economics and history departments, and authors working in Anglo-Saxon and German countries are more likely to receive citations than other scholars. Long and co-authored articles are also a factor for citation success. We...

  15. Succession and dynamics of Pristionchus nematodes and their microbiome during decomposition of Oryctes borbonicus on La Réunion Island.

    Science.gov (United States)

    Meyer, Jan M; Baskaran, Praveen; Quast, Christian; Susoy, Vladislav; Rödelsperger, Christian; Glöckner, Frank O; Sommer, Ralf J

    2017-04-01

    Insects and nematodes represent the most species-rich animal taxa, and they occur together in a variety of associations. Necromenic nematodes of the genus Pristionchus are found on scarab beetles, with more than 30 species known from worldwide samplings. However, little is known about the dynamics and succession of nematodes and bacteria during the decomposition of beetle carcasses. Here, we study nematode and bacterial succession of the decomposing rhinoceros beetle Oryctes borbonicus on La Réunion Island. We show that Pristionchus pacificus exits the arrested dauer stage seven days after the beetles' deaths. Surprisingly, new dauers are seen after 11 days, suggesting that some worms return to the dauer stage after one reproductive cycle. We used high-throughput sequencing of the 16S rRNA genes of decaying beetles, beetle guts and nematodes to study bacterial communities in comparison to soil. We find that soil environments have the most diverse bacterial communities. The bacterial communities of living and decaying beetles are more stable, but one single bacterial family dominates the microbiome of decaying beetles. In contrast, the microbiome of nematodes is relatively similar even across different families. This study represents the first characterization of the dynamics of nematode-bacterial interactions during the decomposition of insects. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  16. Predictors of successful external cephalic version and assessment of success for vaginal delivery.

    Science.gov (United States)

    Salzer, Liat; Nagar, Ran; Melamed, Nir; Wiznitzer, Arnon; Peled, Yoav; Yogev, Yariv

    2015-01-01

    To identify predictors of successful external cephalic version (ECV) and to compare delivery outcome between women who had a successful ECV and women with spontaneous vertex presentation. A retrospective cohort study of all women who underwent ECV in a single tertiary medical center between 2007 and 2011. Delivery outcome of women who underwent a trial of vaginal delivery following successful ECV was compared with that of a control group in a 2:1 ratio. Multivariate analysis was used to identify predictors of successful ECV. Overall, 287 women were eligible for the study group. Of these, 130 (45.3%) had a successful ECV. Polyhydramnios was the strongest factor associated with successful ECV (OR=3.1, 95%-CI 1.4-7.2), followed by transverse lie (versus breech presentation, OR=2.6, 95%-CI 1.2-6.7) and a posterior placenta (OR=1.7, 95%-CI 1.1-3.9), while nulliparity was associated with a lower likelihood of successful ECV (OR=0.4, 95%-CI 0.2-0.6). Women who had a successful ECV and underwent a trial of labor were more likely to deliver by operative vaginal delivery (OVD) (OR=1.8, 95%-CI 1.2-3.6), mainly due to a higher rate of prolonged second stage of labor, but were not at an increased risk for CS (OR=0.9, 95%-CI 0.4-2.4). Counselling of women prior to ECV should address the likelihood of success based on the predicting factors described above, as well as the increased risk for OVD in the case of successful ECV.

  17. What Does Successful Aging Mean? Lay Perception of Successful Aging Among Elderly Singaporeans.

    Science.gov (United States)

    Feng, Qiushi; Straughan, Paulin Tay

    2017-03-01

    We explore the culturally specific meaning of successful aging in Singapore, an ethnically diverse city-state in Asia. We aim to investigate lay perceptions of successful aging among the elderly individuals in Singapore and further examine variations of these perceptions. We applied a mixed-method research design. Firstly, we conducted qualitative interviews with 49 elderly respondents, generating 12 main subjective components of successful aging. Next, we did a national survey with a sample of 1,540 local residents aged 50 to 69 years, in which respondents were asked to evaluate the importance of each subjective component of successful aging. We used the regression models and latent class analysis to analyze the correlatives of successful aging and to classify the elderly individuals by perception types. Among 12 components of successful aging, those related to self-sufficiency received the highest acknowledgment among the elderly individuals in Singapore. At least half of them simultaneously highlighted independence from family and dependence on family. Malays and Indians in Singapore valued more of the roles of spouse and children in successful aging, as compared with Chinese. The latent class analysis classified four groups of the elderly individuals according to their lay views on successful aging. As compared with the western model of successful aging, the elderly individuals in Singapore perceived successful aging with a strong focus on familism. These lay perceptions also significantly varied among these elderly individuals. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. The trials, tribulations, and relative success of the ongoing clinical merger of two large academic hospital systems.

    Science.gov (United States)

    Cohen, J R; Dowling, M; Gallagher, J S

    2001-07-01

    The North Shore Health System and the Long Island Jewish Medical Center merged in 1997 and now form the third largest not-for-profit academic health care system in the United States. The authors analyze the specific factors responsible for the relative success of the clinical merger, review their merger's initial failures and how they crafted a more pragmatic and appropriate set of guiding principles to continue the merger, and discuss the future of their institution's clinical integration strategy. In 2000, clinical integration of the 19 clinical departments at the two merged institutions was surveyed across five broad areas: conferences, residency programs, common faculty and support staff, finances, and research. Extents of clinical integration ranged from 20% to 72%. Six departments had more than 50% clinical integration, and overall clinical integration was 42%. Not surprisingly, clinical integration had occurred most frequently with conferences (50%) and least with finances (25%). The single-chairperson model for department leadership has been most successful in achieving significant clinical integration of the formerly separate departments. The relative success of the clinical merger has been guided by the principle that no clinical service should be integrated simply for the sake of merging, but rather that integration should be encouraged where and when it makes sense to achieve specific program goals. In addition, the merger would not have proceeded without constant communication among the leadership and staff, flexibility in building leadership models, patience in having events progress over a time course that developed trust among the senior leaders, and the presence of a senior executive structure whose authority to make decisions is accepted. The most important factor for achieving a reasonable level of clinical integration is the ability of the clinical leaders to collaborate and lead the change process.

  19. Beyond the Rose-Colored Binoculars: How to Launch a Successful Physics Career in the 21st Century

    Science.gov (United States)

    Bailey, Crystal

    Physics degree holders are among the most employable in the world, often doing everything from managing a research lab at a multi-million dollar corporation, to developing solutions to global problems in their own small startups. Employers know that with physics training, a potential hire has acquired a broad problem-solving skill set that translates to almost any environment, as well as an ability to be self-guided and self-motivated so that they can learn whatever skills are needed to successfully achieve their goals. Therefore it's no surprise that the majority of physics graduates find employment in private-sector, industrial settings. Yet at the same time, only about 25% of graduating PhDs will take a permanent faculty position, even though academic careers are usually the only track to which students are exposed while earning their degrees. In this talk, I will explore less-familiar (but more common!) career paths for physics graduates, and provide resources to help faculty mentors give their students better information and training for a broader scope of career possibilities.

  20. Studies of key success factors of product development success: A reinterpretation of results

    DEFF Research Database (Denmark)

    Plichta, Kirsten; Harmsen, Hanne

    In this paper, the general validity of the research area of key factors of success in product development is discussed. To be more specific, we argue that validity hinges on the causal relation between success and success factors - a relation that is unaccounted for in the empirical studies..... The theoretical tradition of the resource-based perspective provides - at least to some extent - an account of this causality. An important point in the paper is that the key factors of success in the empirical studies are not factors causally related to success, but at most a number of valuable resources and thus......, but in the studies, problems concerning implementation are not discussed. When the listed factors of success are interpreted as valuable resources and capabilities, we show that some implications concerning implementability might be deduced from the resource-based perspective....

  1. Business Intelligence Success Factors

    DEFF Research Database (Denmark)

    Gaardboe, Rikke; Jonasen, Tanja Svarre

    2018-01-01

    Business intelligence (BI) is a strategically important practice in many organizations. Several studies have investigated the factors that contribute to BI success; however, an overview of the critical success factors (CSFs) involved is lacking in the extant literature. We have integrated...... 34 CSFs related to BI success. The distinct CSFs identified in the extant literature relate to project management skills (13 papers), management support (20 papers), and user involvement (11 papers). In the articles with operationalized BI success, we found several distinct factors: system quality...

  2. Explanatory models of health and disease: surprises from within the former Soviet Union

    Directory of Open Access Journals (Sweden)

    Tatiana I Andreeva

    2013-06-01

    Full Text Available Extract The review of anthropological theories as applied to public health by Jennifer J. Carroll (Carroll, 2013) published in this issue of TCPHEE made me recollect my first and most surprising discoveries of how differently the same things can be understood in different parts of the world. Probably less unexpectedly, these impressions concern substance abuse and addiction behaviors, similarly to many examples deployed by Jennifer J. Carroll. The first of these events happened soon after the break-up of the Soviet Union, when some of the most active people from the West rushed to discover what was going on behind the opening iron curtain. A director of an addiction clinic, who had just come into contact with a Dutch counterpart, invited me to join the collaboration and the innovation process he planned to launch. Being a participant of the exchange program started within this collaboration, I had an opportunity to discover how addictive behaviors were understood and explained in books (English, 1961; Kooyman, 1992; Viorst, 1986) recommended by the colleagues in the Netherlands and, as I could observe with my own eyes, addressed in everyday practice. This was a jaw-dropping contrast to what I learnt at the Soviet medical university and some post-graduate courses, where all the diseases related to alcohol, tobacco, or drug abuse were considered predominantly a result of the substance intake. In the Soviet discourse, the intake itself was understood as 'willful and deliberate' or immoral behavior which, in some cases, was to be rectified in prison-like treatment facilities. In the West, quite the opposite, substance abuse was seen rather as a consequence of a constellation of life-course adversities thoroughly considered by developmental psychology. This approach was obviously deeply ingrained in how practitioners diagnosed and treated their patients.

  3. A surprisingly simple correlation between the classical and quantum structural networks in liquid water

    Science.gov (United States)

    Hamm, Peter; Fanourgakis, George S.; Xantheas, Sotiris S.

    2017-08-01

    Nuclear quantum effects in liquid water have profound implications for several of its macroscopic properties related to the structure, dynamics, spectroscopy, and transport. Although several of water's macroscopic properties can be reproduced by classical descriptions of the nuclei using interaction potentials effectively parameterized for a narrow range of its phase diagram, a proper account of the nuclear quantum effects is required to ensure that the underlying molecular interactions are transferable across a wide temperature range covering different regions of that diagram. When performing an analysis of the hydrogen-bonded structural networks in liquid water resulting from the classical (class) and quantum (qm) descriptions of the nuclei with two interaction potentials that are at the two opposite ends of the range in describing quantum effects, namely the flexible, pair-wise additive q-TIP4P/F, and the flexible, polarizable TTM3-F, we found that the (class) and (qm) results can be superimposed over the temperature range T = 250-350 K using a surprisingly simple, linear scaling of the two temperatures according to T(qm) = α T(class) + ΔT, where α = 0.99 and ΔT = -6 K for q-TIP4P/F and α = 1.24 and ΔT = -64 K for TTM3-F. This simple relationship suggests that the structural networks resulting from the quantum and classical treatment of the nuclei with those two very different interaction potentials are essentially similar to each other over this extended temperature range once a model-dependent linear temperature scaling law is applied.
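The linear temperature scaling law reported above can be sketched directly; the α and ΔT constants are those quoted in the abstract, while the sample temperatures are simply illustrative points within the 250-350 K range studied:

```python
# Linear scaling law from the abstract: T_qm = alpha * T_class + delta_T.
# alpha and delta_T are the model-dependent constants reported for the
# q-TIP4P/F and TTM3-F interaction potentials.
SCALING = {
    "q-TIP4P/F": (0.99, -6.0),   # (alpha, delta_T in K)
    "TTM3-F":    (1.24, -64.0),
}

def quantum_equivalent_temperature(t_class, model):
    """Map a classical simulation temperature to its quantum-equivalent one."""
    alpha, delta_t = SCALING[model]
    return alpha * t_class + delta_t

for model in SCALING:
    for t in (250.0, 300.0, 350.0):
        t_qm = quantum_equivalent_temperature(t, model)
        print(f"{model}: T_class = {t:.0f} K -> T_qm = {t_qm:.1f} K")
```

At 300 K, for example, the scaling maps the classical temperature to 291 K for q-TIP4P/F but to 308 K for TTM3-F, reflecting how differently the two potentials account for nuclear quantum effects.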

  4. Successful project management

    CERN Document Server

    Young, Trevor L

    2016-01-01

Successful Project Management, 5th edition, is an essential guide for anyone who wants to improve the success rate of their projects. It will help managers to maintain a balance between the demands of the customer, the project, the team and the organization. Covering the more technical aspects of a project from start to completion, it contains practised and tested techniques, covering project conception and start-up, how to manage stakeholders, effective risk management, project planning, launch and execution. Also including a brand new glossary of key terms, it provides help with evaluating your project as well as practical checklists and templates to ensure success for any ambitious project manager. With over one million copies sold, the hugely popular Creating Success series covers a wide variety of topics, with the latest editions including new chapters such as Tough Conversations and Treating People Right. This indispensable business skills collection is suited to a variety of roles, from someone look...

  5. Landscape structure and management alter the outcome of a pesticide ERA: evaluating impacts of endocrine disruption using the ALMaSS European Brown Hare model

    DEFF Research Database (Denmark)

    Topping, Christopher John; Dalby, Lars; Skov, Flemming

    2016-01-01

from data collected primarily for EU agricultural subsidy support and GIS map data. Ten different Danish landscapes were generated and the ERA carried out for each landscape using two different assumed toxicities. The results showed negative impacts in all cases, but the extent and form in terms of impacts on abundance or occupancy differed greatly between landscapes. A meta-model was created, predicting impact from landscape and farming characteristics. Scenarios based on all combinations of farming and landscape for five landscapes representing extreme and middle impacts were created. The meta-models developed from the 10 real landscapes failed to predict impacts for these 25 scenarios. Landscape, farming, and the emergent density of hares all influenced the results of the risk assessment considerably. The study indicates that prediction of a reasonable worst case scenario is difficult from structural...

  6. Carbon Dioxide: Surprising Effects on Decision Making and Neurocognitive Performance

    Science.gov (United States)

    James, John T.

    2013-01-01

The occupants of modern submarines and the International Space Station (ISS) have much in common as far as their air quality is concerned. Air is polluted by materials offgassing, use of utility compounds, leaks of systems chemicals, and anthropogenic sources. The primary anthropogenic compound of concern to submariners and astronauts has been carbon dioxide (CO2). NASA and the US Navy rely on the National Research Council Committee on Toxicology (NRC-COT) to help formulate exposure levels to CO2 that are thought to be safe for exposures of 3-6 months. NASA calls its limits Spacecraft Maximum Allowable Concentrations (SMACs). Years of experience aboard the ISS and a recent publication on deficits in decision making in ground-based subjects exposed briefly to 0.25% CO2 suggest that exposure levels that have been presumed acceptable to preserve health and performance need to be reevaluated. The current CO2 exposure limits for 3-6 months set by NASA and the UK Navy are 0.7%, and the limit for US submariners is 0.5%, although the NRC-COT recommended a 90-day level of 0.8% as safe a few years ago. NASA has set a 1000-day SMAC at 0.5% for exploration-class missions. Anecdotal experience with ISS operations approaching the current 180-day SMAC of 0.7% suggests that this limit is too high. Temporarily, NASA has limited exposures to 0.5% until further peer-reviewed data become available. In the meantime, a study published last year in the journal Environmental Health Perspectives (Satish U, et al. 2012) demonstrated that complex decision-making performance is somewhat affected at 0.1% CO2 and becomes "dysfunctional" for at least half of the 9 indices of performance at concentrations approaching 0.25% CO2. The investigators used the Strategic Management Simulation (SMS) method of testing for decision-making ability, and the results were so surprising to the investigators that they declared that their findings need to be independently confirmed. NASA has responded to the

  7. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial
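The classic setting described above, a two-level fractional-factorial design feeding a first-order polynomial metamodel, can be sketched as follows; the black-box response and its coefficients are hypothetical stand-ins for a real simulation model:

```python
# Sketch: estimating first-order polynomial metamodel coefficients from a
# resolution-III two-level fractional-factorial design (2^(3-1): 4 runs,
# generator x3 = x1 * x2). The "simulation" is a hypothetical black box.

def simulation(x1, x2, x3):
    # Hypothetical black-box response (first-order in the coded inputs).
    return 10.0 + 2.0 * x1 - 3.0 * x2 + 0.5 * x3

# Design matrix in coded units (-1/+1); column 3 is aliased with x1*x2.
design = [(-1, -1, +1), (+1, -1, -1), (-1, +1, -1), (+1, +1, +1)]

responses = [simulation(*run) for run in design]
n = len(design)

# For an orthogonal two-level design, the least-squares estimates reduce
# to simple averages of the signed responses.
beta0 = sum(responses) / n
betas = [sum(run[j] * y for run, y in zip(design, responses)) / n
         for j in range(3)]

print("intercept:", beta0)        # recovers 10.0
print("main effects:", betas)     # recovers [2.0, -3.0, 0.5]
```

Because the hypothetical response contains no interactions, the four-run design recovers the first-order coefficients exactly; with a real simulation, the aliased interaction terms would bias these estimates, which is the usual trade-off of resolution-III designs.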

8. Post-phacoemulsification refractive surprise in a posterior amorphous corneal dystrophy patient

    Directory of Open Access Journals (Sweden)

    Giuliano de Oliveira Freitas

    2010-02-01

Full Text Available One case of post-phacoemulsification refractive surprise in a posterior amorphous corneal dystrophy patient is reported herein. Its likely causative factor, as well as our approach once it was recognized, are discussed in this report.

  9. Mergers: Success versus failure

    International Nuclear Information System (INIS)

    Carley, G. R.

    1997-01-01

Successful mergers, in the context of long-term value creation as measured by the return realized on investor-provided capital, were discussed. In essence, a successful merger is characterized by a sound business reason and strategy for the merger, a reasonable price and sound execution. The acquiror's pre-merger success in managing a company is a good indicator of future success. Poorly managed companies that acquire other companies generally continue to be poorly managed, with no significant increase in shareholder value. Prior to the acquisition, identification of the potential target, assessment of the people involved on both sides of the transaction, thorough knowledge of the target's potential for value creation, financial implications (debt, equity, terms and demand, tax implications, the potential effect of the proposed acquisition on the acquiror's business plan) and, finally, the execution of the process itself are the important determinants of successful mergers.

  10. Integrating socio-economic and biophysical data to support water allocations within river basins: an example from the Inkomati Water Management Area in South Africa

    CSIR Research Space (South Africa)

    De Lange, Willem J

    2010-01-01

    Full Text Available ; based on a meta-modelling approach using Geographical Information Systems, the geo-spatial analysis platform, and an application of a water-use simulation model. The method is developed and applied to the irrigation agriculture sector in the Inkomati...

  11. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-01

    the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial
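A minimal sketch of the noise-free setting described above, fitting a polynomial metamodel by discrete least squares on random evaluation points, assuming an illustrative parameter-to-solution map u(y) (the map, sample size, and degree are not taken from the paper):

```python
# Sketch: discrete least squares on a polynomial space from random,
# noise-free evaluations of a hypothetical parameter-to-solution map u(y).
import random

def u(y):
    # Hypothetical smooth map; cubic, so a degree-3 fit reproduces it exactly.
    return 1.0 + 2.0 * y - 0.5 * y**3

random.seed(0)
ys = [random.uniform(-1.0, 1.0) for _ in range(50)]   # random evaluation points
obs = [u(y) for y in ys]                              # noise-free observations

degree = 3
m = degree + 1
A = [[y**k for k in range(m)] for y in ys]            # monomial basis {1, y, y^2, y^3}

# Solve the normal equations (A^T A) c = A^T b by Gaussian elimination
# with partial pivoting (no external dependencies).
ata = [[sum(A[i][r] * A[i][c] for i in range(len(ys))) for c in range(m)]
       for r in range(m)]
atb = [sum(A[i][r] * obs[i] for i in range(len(ys))) for r in range(m)]
for col in range(m):                      # forward elimination
    piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
    ata[col], ata[piv] = ata[piv], ata[col]
    atb[col], atb[piv] = atb[piv], atb[col]
    for r in range(col + 1, m):
        f = ata[r][col] / ata[col][col]
        for c in range(col, m):
            ata[r][c] -= f * ata[col][c]
        atb[r] -= f * atb[col]
coeffs = [0.0] * m                        # back substitution
for r in range(m - 1, -1, -1):
    s = atb[r] - sum(ata[r][c] * coeffs[c] for c in range(r + 1, m))
    coeffs[r] = s / ata[r][r]

print("recovered coefficients:", [round(c, 6) for c in coeffs])
```

Since the illustrative map lies in the polynomial space, the least-squares fit recovers its coefficients (1, 2, 0, -0.5) up to floating-point error; in the metamodel-construction use mentioned above, u(y) would instead be evaluated by an expensive PDE solve at each sample point.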

  12. Editorial - Special Issue on Model-driven Service-oriented architectures

    NARCIS (Netherlands)

    Andrade Almeida, João; Ferreira Pires, Luis; van Sinderen, Marten J.; Steen, M.W.A.

    2009-01-01

    Model-driven approaches to software development have proliferated in recent years owing to the availability of techniques based on metamodelling and model transformations, such as the meta-object facility (MOF) and the query view transformation (QVT) standards. During the same period,

  13. Technological monitoring radar: a weak signals interpretation tool for the identification of strategic surprises

    Directory of Open Access Journals (Sweden)

    Adalton Ozaki

    2011-07-01

Full Text Available In the current competitive scenario, marked by rapid and constant change, it is vital that companies actively monitor the business environment in search of signs that might anticipate change. This study proposes and discusses a tool called the Technological Monitoring Radar, which addresses the following query: “How can a company systematically monitor the environment and capture signs that anticipate opportunities and threats concerning a particular technology?”. The literature review covers Competitive Intelligence, Technological Intelligence, Environmental Analysis and Anticipative Monitoring. Based on a critical analysis of the literature, a tool called the Technological Monitoring Radar is proposed, comprising five environments to be monitored (political, economic, technological, social and competition), each with key topics for analysis. To exemplify the use of the tool, it is applied to the smartphone segment in an exclusively reflexive manner, without the participation of a specific company. One of the suggestions for future research is precisely the application of the proposed methodology in an actual company. Despite the limitation of this being a theoretical study, the example demonstrated the tool's applicability. The radar proved to be very useful for a company that needs to monitor the environment in search of signs of change. This study's main contribution is to relate different fields of study (technological intelligence, environmental analysis and anticipative monitoring) and different approaches to provide a practical tool that allows a manager to identify and better visualize opportunities and threats, thus avoiding strategic surprises in the technological arena. Key words: Technological monitoring. Technological intelligence. Competitive intelligence. Weak signals.

  14. Human Resource Outsourcing Success

    Directory of Open Access Journals (Sweden)

    Hasliza Abdul-Halim

    2014-07-01

Full Text Available The existing literature on partnership seems to take the relationship between partnership quality and outsourcing success for granted. Therefore, this article aims at examining the role of service quality in strengthening the relationship between partnership quality and human resource (HR) outsourcing success. The samples were obtained from 96 manufacturing organizations in Penang, Malaysia. The results showed that partnership quality variables such as trust, business understanding, and communication have a significant positive impact on HR outsourcing success, whereas in general, service quality was found to partially moderate these relationships. Therefore, comprehending the HR outsourcing relationship in the context of service quality may assist organizations to accomplish HR outsourcing success by identifying areas of expected benefits and improvements.

  15. Target cell availability and the successful suppression of HIV by hydroxyurea and didanosine

    NARCIS (Netherlands)

    Boer, R.J. de; Boucher, C.A.B.; Perelson, A.S.

    1998-01-01

    Surprisingly, immunosuppressive treatment can enhance the efficacy of conventional HIV-1 antiretroviral treatment, and can be beneficial for HIV-1- infected patients. This argues for a role of target cell availability in limiting the HIV-1 infection, and is in agreement with mathematical models

16. Modern Sedimentation along the SE Bangladesh Coast Reveals Surprisingly Low Accumulation Rates

    Science.gov (United States)

    McHugh, C.; Mustaque, S.; Mondal, D. R.; Akhter, S. H.; Iqbal, M.

    2016-12-01

Recent sediments recovered along the SE coast of Bangladesh, from Teknaf to Cox's Bazar, and drainage basin analyses reveal sediment sources and very low sedimentation rates of 1 mm/year. These rates are surprisingly low given that this coast is adjacent to the Ganges-Brahmaputra Delta, with a yearly discharge of 1 GT. The Teknaf anticline (elevation 200 m), part of the western Burma fold-thrust belt, dominates the topography, extending across and along the Teknaf peninsula. It is thought to have been evolving since the Miocene (Alam et al. 2003 & Allen et al. 2008). Presently the anticline foothills on the west are flanked by uplifted terraces, the youngest linked to coseismic displacement during the 1762 earthquake (Mondal et al. 2015), and a narrow beach 60-200 m in width. Petrography, semi-quantitative bulk mineralogy and SEM/EDX analyses were conducted on sediments recovered along the west coast from 1-4 m deep trenches and three 4-8 m deep drill holes. GIS mapping of drainage basins and quartz-feldspar-lithic (QFL) ternary plots based on grain counting show mixing of sediments from multiple sources: Himalayan provenance of metamorphic and igneous origin (garnet, mostly almandine; tourmaline; rutile; kyanite; zircon; sillimanite; and clinopyroxene), similar to Uddin et al. (2007); Brahmaputra provenance of igneous and metamorphic origin (amphibole, epidote, plagioclase of 40% Na and 60% Ca, apatite, ilmenite, magnetite, Cr-spinel and garnet, mostly grossular), as indicated by Garzanti et al. (2010) & Rahman et al. (2016); and Burmese sources (cassiterite and wolframite) (Zaw 1990 & Searle et al. 2007). The low sedimentation rates are the result of two main factors: 1. Strong longshore currents from the south-east that interact with high tidal ranges, as evidenced by the morphology of sand waves and ridge and runnel landforms along the beach. 2. Streams draining the Teknaf anticline are dry during the winter, and during summer monsoon rains the sediments bypass the narrow

  17. Success-Breeds-Success in Collective Political Behavior: Evidence from a Field Experiment

    NARCIS (Netherlands)

    Van De Rijt, Arnout; Akin, Idil; Willer, Robb; Feinberg, Matthew

    2016-01-01

Scholars have proposed that the emergence of political movements is highly path-dependent, such that early mobilization successes may lead to disproportionately greater eventual success. This article replicates a unique field experiment testing for positive feedback in internet petition signing (van

  18. Group cohesion and starting status in successful and less successful elite volleyball teams.

    Science.gov (United States)

    Spink, K S

    1992-08-01

    The main purpose of this study was to examine the relationship between members' perceptions of group cohesion and starting status in elite volleyball teams. The results of the study revealed that the form of the cohesion-starting status relationship was moderated by the variable of success. The results for less successful teams revealed that differences did emerge between specific measures of cohesion endorsed by starters and non-starters. No such differences in cohesion emerged when the starters and non-starters on successful teams were compared. These results provide initial support for the suggestion that the most successful teams are the ones where the perceptions of cohesiveness by starters and non-starters are similar. A secondary purpose of the study was to determine whether those teams that were the most successful and similar in their members' perceptions of cohesiveness, were also the teams whose members have the most positive outcome expectancy. The results supported this prediction.

  19. Successful Enterprise System Re-Implementation

    DEFF Research Database (Denmark)

    Svejvig, Per

    2017-01-01

Achieving success in enterprise systems (ES) implementations is challenging. The success rate is not high in view of the sums invested by many organizations in these companywide systems. The literature is charged with reasons for unsuccessful implementations, such as a lack of top management support and insufficient change management. Contrary to this research, empirical data from an ES re-implementation in a Scandinavian high-tech company shows successful implementation despite many problematic shifts in outsourcing partners. Therefore, it is natural to ask: why was the re-implementation of the ES at SCANDI successful despite the major troubles encountered during the project? Building an analysis based on ten Critical Success Factors (CSFs) combined with an investigation into the institutional structures at play, we present several reasons for the successful implementation. The CSF analysis...

  20. The surprisingly small but increasing role of international agricultural trade on the European Union’s dependence on mineral phosphorus fertiliser

    Science.gov (United States)

    Nesme, Thomas; Roques, Solène; Metson, Geneviève S.; Bennett, Elena M.

    2016-02-01

    Phosphorus (P) is subject to global management challenges due to its importance to both food security and water quality. The European Union (EU) has promoted policies to limit fertiliser over-application and protect water quality for more than 20 years, helping to reduce European P use. Over this time period, the EU has, however, become more reliant on imported agricultural products. These imported products require fertiliser to be used in distant countries to grow crops that will ultimately feed European people and livestock. As such, these imports represent a displacement of European P demand, possibly allowing Europe to decrease its apparent P footprint by moving P use to locations outside the EU. We investigated the effect of EU imports on the European P fertiliser footprint to better understand whether the EU’s decrease in fertiliser use over time resulted from P demand being ‘outsourced’ to other countries or whether it truly represented a decline in P demand. To do this, we quantified the ‘virtual P flow’ defined as the amount of mineral P fertiliser applied to agricultural soils in non-EU countries to support agricultural product imports to the EU. We found that the EU imported a virtual P flow of 0.55 Tg P/yr in 1995 that, surprisingly, decreased to 0.50 Tg P/yr in 2009. These results were contrary to our hypothesis that trade increases would be used to help the EU reduce its domestic P fertiliser use by outsourcing its P footprint abroad. Still, the contribution of virtual P flows to the total P footprint of the EU has increased by 40% from 1995 to 2009 due to a dramatic decrease in domestic P fertiliser use in Europe: in 1995, virtual P was equivalent to 32% of the P used as fertiliser domestically to support domestic consumption but jumped to 53% in 2009. Soybean and palm tree products from South America and South East Asia contributed most to the virtual P flow. These results demonstrate that, although policies in the EU have successfully

  1. A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs

    Directory of Open Access Journals (Sweden)

    Brian R. Sturtevant

    2007-12-01

Full Text Available To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between-scale) and horizontal (i.e., within-scale) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups

  2. Examining Productive Failure, Productive Success, Unproductive Failure, and Unproductive Success in Learning

    Science.gov (United States)

    Kapur, Manu

    2016-01-01

    Learning and performance are not always commensurable. Conditions that maximize performance in the initial learning may not maximize learning in the longer term. I exploit this incommensurability to theoretically and empirically interrogate four possibilities for design: productive success, productive failure, unproductive success, and…

  3. Auditing Marketing Strategy Implementation Success

    OpenAIRE

    Herhausen, Dennis; Egger, Thomas; Oral, Cansu

    2014-01-01

    What makes a marketing strategy implementation successful and how can managers measure this success? To answer these questions, we developed a two-step audit approach. First, managers should measure the implementation success regarding effectiveness, efficiency, performance outcomes, and strategic embeddedness. Second, they should explore the reasons that have led to success or failure by regarding managerial, leadership, and environmental traps. Doing so will also provide corrective action p...

  4. Is international junior success a reliable predictor for international senior success in elite combat sports?

    Science.gov (United States)

    Li, Pingwei; De Bosscher, Veerle; Pion, Johan; Weissensteiner, Juanita R; Vertonghen, Jikkemien

    2018-05-01

Currently in the literature, there is a dearth of empirical research confirming whether international junior success is a reliable predictor of future international senior success. Despite the uncertainty of the junior-senior relationship, federations and coaches still tend to use junior success as a predictor of long-term senior success. A range of former investigations utilising a retrospective lens have merely focused on success that athletes attained at junior-level competitions. Success that was achieved at senior-level competitions but at a junior age was relatively ignored. This study explored to what extent international senior success can be predicted based on success that athletes achieved in either international junior-level competitions (i.e. junior medalists) or senior competitions at a junior age (i.e. early achievers). The sample contains 4011 international male and female athletes from three combat sports (taekwondo, wrestling and boxing), who were born between 1974 and 1990 and participated in both international junior and senior-level competitions between 1990 and 2016. Gender and sport differences were compared. The results revealed that 61.4% of the junior medalists and 90.4% of the early achievers went on to win international medals at a senior age. Among the early achievers, 92.2% of the taekwondo athletes, 68.4% of the wrestling athletes and 37.9% of the boxing athletes could be reliably "predicted" to win international senior medals. The findings demonstrate that, specific to the three combat sports examined, international junior success appears to be an important predictor of long-term international senior success.

  5. Modeling the Interaction Between Semantic Agents and Semantic Web Services Using MDA Approach

    NARCIS (Netherlands)

    Kardas, Geylani; Göknil, Arda; Dikenelli, Oguz; Topaloglu, N. Yasemin

    2007-01-01

    In this paper, we present our metamodeling approach for integrating semantic web services and semantic web enabled agents under Model Driven Architecture (MDA) view which defines a conceptual framework to realize model driven development. We believe that agents must have well designed environment

  6. Rule-based modularization in model transformation languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    2007-01-01

This paper studies ways of modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of relations between source and target metamodels and on the basis of generic transformation

  7. Experimental sources of variation in avian energetics: estimated basal metabolic rate decreases with successive measurements.

    Science.gov (United States)

    Jacobs, Paul J; McKechnie, Andrew E

    2014-01-01

    Basal metabolic rate (BMR) is one of the most widely used metabolic variables in endotherm ecological and evolutionary physiology. Surprisingly few studies have investigated how BMR is influenced by experimental and analytical variables over and above the standardized conditions required for minimum normothermic resting metabolism. We tested whether avian BMR is affected by habituation to the conditions experienced during laboratory gas exchange measurements by measuring BMR five times in succession in budgerigars (Melopsittacus undulatus) housed under constant temperature and photoperiod. Both the magnitude and the variability of BMR decreased significantly with repeated measurements, from 0.410 ± 0.092 W (n = 9) during the first measurement to 0.285 ± 0.042 W (n = 9) during the fifth measurement. Thus, estimated BMR decreased by ∼30% within individuals solely on account of the number of times they had previously experienced the experimental conditions. The most likely explanation for these results is an attenuation with repeated exposure of the acute stress response induced by birds being handled and placed in respirometry chambers. Our data suggest that habituation to experimental conditions is potentially an important determinant of observed BMR, and this source of variation needs to be taken into account in future studies of metabolic variation among individuals, populations, and species.

  8. Model Oriented Application Generation for Industrial Control Systems

    CERN Document Server

    Copy, B; Blanco Vinuela, E; Fernandez Adiego, B; Nogueira Ferandes, R; Prieto Barreiro, I

    2011-01-01

The CERN Unified Industrial Control Systems framework (UNICOS) is a software generation methodology and a collection of development tools that standardizes the design of industrial control applications [1]. A Software Factory, named the UNICOS Application Builder (UAB) [2], was introduced to ease extensibility and maintenance of the framework, introducing a stable metamodel, a set of platform-independent models and platform-specific configurations against which code generation plugins and configuration generation plugins can be written. Such plugins currently target PLC programming environments (Schneider and SIEMENS PLCs) as well as SIEMENS WinCC Open Architecture SCADA (previously known as ETM PVSS), but are being expanded to cover more and more aspects of process control systems. We present what constitutes the UNICOS metamodel and the models in use, how these models can be used to capture knowledge about industrial control systems and how this knowledge can be leveraged to generate both code and configuratio...

  9. Between the processes of strengthening and weakening of the Family Health Strategy

    Directory of Open Access Journals (Sweden)

    Regina Stella Spagnuolo

    2013-06-01

    Full Text Available This was a qualitative study with the purpose of designing a meta-model for the work process of the Family Health Strategy (FHS team. It was based on the experience of six sample groups, composed of their members (physicians, professional nurses, dentists, dental assistants, licensed technical nurses and community health agents in a city in São Paulo state, Brazil, totaling 54 subjects. Six theoretical models emerged from non-directive interviews. These were analyzed according to Grounded Theory and submitted to the meta-synthesis strategy, which produced the meta-model "between the processes of strengthening and weakening of the FHS model: professional-team-community reciprocity as an intervening component". When analyzed in light of the Theory of Complexity (TC, it showed to be a work with a vertical and authoritarian tendency, which is largely hegemonic in the tradition of public health care policies.

  10. Success-slope effects on the illusion of control and on remembered success-frequency

    Directory of Open Access Journals (Sweden)

    Anastasia Ejova

    2013-07-01

Full Text Available The illusion of control refers to the inference of action-outcome contingency in situations where outcomes are in fact random. The strength of this illusion has been found to be affected by whether the frequency of successes increases or decreases over repeated trials, in what can be termed a "success-slope" effect. Previous studies have generated inconsistent findings regarding the nature of this effect. In this paper we present an experiment (N = 334) that overcomes several methodological limitations within this literature, employing a wider range of dependent measures: measures of two different types of illusory control, primary (by self) and secondary (by luck), as well as measures of remembered success-frequency. Results indicate that different dependent measures lead to different effects. On measures of primary (but not secondary) control over the task, scores were highest when the rate of success increased over time. Meanwhile, estimates of success-frequency in the task did not vary across conditions and showed trends consistent with the broader literature on human memory.

  11. Widening and Wandering the Short Road to Success: The Louisiana Transfer Degree Guarantee

    Science.gov (United States)

    Cope, Kevin L.

    2012-01-01

    "Surprise" is a word and concept seldom associated with higher education policy or with one of its most demanding projects, the development of a statewide articulation and transfer program. Legislators who mandate articulation enterprises and officials who care for these behemoths present themselves as thoughtful managers who cherish…

  12. A post-genomic surprise. The molecular reinscription of race in science, law and medicine.

    Science.gov (United States)

    Duster, Troy

    2015-03-01

    The completion of the first draft of the Human Genome Map in 2000 was widely heralded as the promise and future of genetics-based medicines and therapies - so much so that pundits began referring to the new century as 'The Century of Genetics'. Moreover, definitive assertions about the overwhelming similarities of all humans' DNA (99.9 per cent) by the leaders of the Human Genome Project were trumpeted as the end of racial thinking about racial taxonomies of human genetic differences. But the first decade of the new century brought unwelcomed surprises. First, gene therapies turned out to be far more complicated than any had anticipated - and instead the pharmaceutical industry turned to a focus on drugs that might be 'related' to population differences based upon genetic markers. While the language of 'personalized medicine' dominated this frame, research on racially and ethnically designated populations differential responsiveness to drugs dominated the empirical work in the field. Ancestry testing and 'admixture research' would play an important role in a new kind of molecular reification of racial categories. Moreover, the capacity of the super-computer to map differences reverberated into personal identification that would affect both the criminal justice system and forensic science, and generate new levels of concern about personal privacy. Social scientists in general, and sociologists in particular, have been caught short by these developments - relying mainly on assertions that racial categories are socially constructed, regionally and historically contingent, and politically arbitrary. While these assertions are true, the imprimatur of scientific legitimacy has shifted the burden, since now 'admixture research' can claim that its results get at the 'reality' of human differentiation, not the admittedly flawed social constructions of racial categories. 
Yet what was missing from this framing of the problem: 'admixture research' is itself based upon socially

  13. Surprising quantum bounces

    CERN Document Server

    Nesvizhevsky, Valery

    2015-01-01

    This unique book demonstrates the undivided unity and infinite diversity of quantum mechanics using a single phenomenon: quantum bounces of ultra-cold particles. Various examples of such "quantum bounces" are: gravitational quantum states of ultra-cold neutrons (the first observed quantum states of matter in a gravitational field), the neutron whispering gallery (an observed matter-wave analog of the whispering gallery effect well known in acoustics and for electromagnetic waves), and gravitational and whispering gallery states for anti-matter atoms that remain to be observed. These quantum states are an invaluable tool in the search for additional fundamental short-range forces, for exploring the gravitational interaction and quantum effects of gravity, for probing physics beyond the standard model, and for furthering studies into the foundations of quantum mechanics, quantum optics, and surface science.

  14. More Supernova Surprises

    Science.gov (United States)

    2010-09-24

    originated in South America. Everyone appreciates the beauty of daisies, chrysanthemums, and sunflowers, and many of us enjoy eating lettuce ...few fossils. On page 1621 of this issue, Barreda et al. ( 1) describe an unusually well-preserved new fossil that sheds light on the history of

  15. Social-philosophical practices of success

    Directory of Open Access Journals (Sweden)

    S. R. Karpenko

    2017-01-01

    Social-philosophical practices of success form a complex system of worldview, speech, and mental factors and of life events within various professional, age, and subcultural groups, which produce assessments from different angles and from the standpoint of various social attitudes and identities; it is in this that the social philosophy of success finds expression. In the course of the formation of social practices (in both everyday and institutional discourse), theoretical conceptions of success are also shaped: instrumental, social-philosophical, social-psychological, worldview-related, historical-cultural, and so on, thereby characterizing different systems of social discourse. Examination of the social-philosophical practices of success reveals the real complexity and ambiguity of this phenomenon. Beyond the typology presented here, which is constructed as a most general abstract scheme, typological models can in each individual case be built on an ad hoc basis. This seems entirely justified, given that ideas about success and the successful person are constantly being transformed and acquiring new characteristics. The effectiveness of further studies of the discourse and practices of success will depend on the adoption of new heuristic approaches capable of accounting for the multidimensionality and ambiguity of this phenomenon.

  16. The characteristics of successful entrepreneurs

    Directory of Open Access Journals (Sweden)

    Pokrajčić Dragana M.

    2004-01-01

    Full Text Available This paper examines the economic, psychological and social-behavioral theories of the entrepreneur in order to determine the characteristics of a successful entrepreneur. The major contribution of economic theories of the entrepreneur is a better understanding of the entrepreneur and his/her role in economic development. The psychological characteristic theory of the entrepreneur argues that successful entrepreneurs possess certain personality traits that mark them out as special, and tries to determine and to evaluate these special traits. The social-behavioral theories stress the influence of experience, knowledge, social environment and ability to learn on the entrepreneur’s success as well as his/her personality traits. None of the examined theories of the entrepreneur gives a satisfactory explanation of the entrepreneur’s success, but taken as a whole, they can explain the key factors of entrepreneur’s success. The entrepreneur’s success comes about as a result of his/her personality traits, ability to learn from experience and ability to adjust to his/her environment.

  17. Successful removable partial dentures.

    Science.gov (United States)

    Lynch, Christopher D

    2012-03-01

    Removable partial dentures (RPDs) remain a mainstay of prosthodontic care for partially dentate patients. Appropriately designed, they can restore masticatory efficiency, improve aesthetics and speech, and help secure overall oral health. However, challenges remain in providing such treatments, including maintaining adequate plaque control, achieving adequate retention, and facilitating patient tolerance. The aim of this paper is to review the successful provision of RPDs. Removable partial dentures are a successful form of treatment for replacing missing teeth, and can be successfully provided with appropriate design and fabrication concepts in mind.

  18. A foundation of ecology rediscovered: 100 years of succession on the William S. Cooper plots in Glacier Bay, Alaska.

    Science.gov (United States)

    Buma, Brian; Bisbing, Sarah; Krapek, John; Wright, Glenn

    2017-06-01

    Understanding plant community succession is one of the original pursuits of ecology, forming some of the earliest theoretical frameworks in the field. Much of this was built on the long-term research of William S. Cooper, who established a permanent plot network in Glacier Bay, Alaska, in 1916. This study now represents the longest-running primary succession plot network in the world. Permanent plots are useful for their ability to follow mechanistic change through time without the assumptions inherent in space-for-time (chronosequence) designs. After 100 yr, these plots show surprising variety in species composition, soil characteristics (carbon, nitrogen, depth), and percent cover, attributable to variation in initial vegetation establishment first noted by Cooper in the 1916-1923 time period, partially driven by dispersal limitations. There has been almost a complete replacement of community composition over the century and a general increase in species richness, but the effective number of species has declined significantly due to the dominance of Salix species which established 100 yr prior (the only remaining species from the original cohort). Where Salix dominates, there is no establishment of "later" successional species like Picea. Plots nearer the entrance to Glacier Bay, and thus closer to potential seed sources after the most recent glaciation, have had consistently higher species richness for 100 yr. Age of plots is the best predictor of soil N content and C:N ratio, though plots still dominated by Salix had lower overall N; soil accumulation was more associated with dominant species. This highlights the importance of contingency and dispersal in community development. The 100-yr record of these plots, including species composition, spatial relationships, cover, and observed interactions between species, provides a powerful view of long-term primary succession. © 2017 by the Ecological Society of America.

  19. Behind the Exporters’ Success: Analysis of Successful Hungarian Exporter Companies From a Strategic Perspective

    Directory of Open Access Journals (Sweden)

    Annamaria Kazai Onodi

    2014-09-01

    Full Text Available The purpose of the study is to provide an overview of export success from a strategic management perspective. The paper empirically tested the relationships between the firm’s export performance, strategic thinking, adaptation to the changing environment and companies’ capabilities. The research is based on the Hungarian Competitiveness Research database of 2013 that consists of 300 firms. Cluster analysis differentiated successful export-oriented and stagnant companies. Both of them had high export intensity (higher than 75%), but significant differences could be observed in export volume and profitability. More than 90% of total export revenue belonged to the successful export-oriented cluster. Successful export-oriented companies proved to be more proactive and innovative than stagnant ones, and thus they were capable of adapting to the changing environment better. The study highlighted that appropriate strategic thinking could play a significant role in improving export success. The implication of the study is that stagnant companies need to develop their forecast abilities, flexibility to adapt to the changing environment and operational efficiency. Stagnant companies lagged behind successful exporters concerning industry forecast, production level, number of innovations, competitive prices and employee qualifications.

  20. North Country Successes: Case Studies of Successful Entrepreneurs in the ANCA Region.

    Science.gov (United States)

    Chugh, Ram L.; Gandhi, Prem P.

    This study identifies the characteristics of both successful small businesses and their entrepreneurial owners in a 14-county area of the Adirondack North Country Association (ANCA). Of the 100 survey respondents representing successful small businesses, 50% had been in business for less than 14 years; 38% were in manufacturing; 48% employed more…

  1. Real-world problem-based learning: a case study evaluated | de ...

    African Journals Online (AJOL)

    The Hexa-C Metamodel is used as inquiry tool to evaluate the student project according to contemporary learning and instructional theories, using the criteria of Creativity, Collaborative learning, Customization, Components, Cognitive learning, and Constructivism. Background to the particular course is provided followed by ...

  2. A case study on the transformation of context-aware domain data onto XML schemas

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis; Hammoudi, S.

    2007-01-01

    In order to accelerate the development of context-aware applications, it would be convenient to have a smooth path between the context models and the automated services that support these models. This paper discusses how MDA technology (metamodelling and the QVT standard) can support the

  3. Evaluation of Rule-based Modularization in Model Transformation Languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    This paper studies ways for modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the base of the relations between source and target metamodels and on the base of generic transformation

  4. Conditional simulation for efficient global optimization

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Mehdad, E.; Pasupathy, R.; Kim, S.-H.; Tolk, A.; Hill, R.; Kuhl, M.E.

    2013-01-01

    A classic Kriging or Gaussian process (GP) metamodel estimates the variance of its predictor by plugging-in the estimated GP (hyper)parameters; namely, the mean, variance, and covariances. The problem is that this predictor variance is biased. To solve this problem for deterministic simulations, we
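
    The plug-in variance the abstract describes can be made concrete with a minimal sketch. The following is an illustrative ordinary-Kriging/GP predictor (the RBF kernel and the names `theta` and `sigma2` are assumptions for illustration, not details from the paper); in practice the hyperparameters are estimated from the data, and plugging those estimates into the variance formula below is what produces the bias the authors address.

    ```python
    import numpy as np

    # Illustrative GP/Kriging predictor with a plug-in variance.
    # theta and sigma2 are treated as known here; estimating them from data
    # and plugging the estimates in below is the source of the bias.

    def rbf(a, b, theta=1.0):
        # squared-exponential correlation between two 1-D point sets
        return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

    def gp_predict(x_train, y_train, x_new, theta=1.0, sigma2=1.0):
        K = sigma2 * rbf(x_train, x_train, theta) + 1e-10 * np.eye(len(x_train))
        k = sigma2 * rbf(x_train, x_new, theta)
        K_inv = np.linalg.inv(K)
        mean = k.T @ K_inv @ y_train
        # plug-in predictor variance: ignores the uncertainty in the
        # (in practice estimated) hyperparameters, hence biased
        var = sigma2 - np.einsum('ij,ik,kj->j', k, K_inv, k)
        return mean, var

    x_train = np.array([0.0, 1.0, 2.0, 3.0])
    y_train = np.sin(x_train)
    mean, var = gp_predict(x_train, y_train, np.array([1.5]))
    ```

    Near the training points the plug-in variance shrinks toward zero and far from them it approaches `sigma2`; conditional simulation, as in the paper, replaces this closed-form plug-in estimate with variance estimated from simulated GP sample paths.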

  5. Public project success as seen in a broad perspective.: Lessons from a meta-evaluation of 20 infrastructure projects in Norway.

    Science.gov (United States)

    Volden, Gro Holst

    2018-08-01

    Infrastructure projects in developed countries are rarely evaluated ex-post. Despite their number and scope, our knowledge about their various impacts is surprisingly limited. The paper argues that such projects must be assessed in a broad perspective that includes operational, tactical and strategic aspects, and unintended as well as intended effects. A generic six-criteria evaluation framework is suggested, inspired by a framework frequently used to evaluate development assistance projects. It is tested on 20 Norwegian projects from various sectors (transport, defence, ICT, buildings). The results indicate that the majority of projects were successful, especially in operational terms, possibly because they underwent external quality assurance up-front. It is argued that applying this type of standardized framework provides a good basis for comparison and learning across sectors. It is suggested that evaluations should be conducted with the aim of promoting accountability, building knowledge about infrastructure projects, and continuously improving the tools, methods and governance arrangements used in the front-end of project development. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  6. Combining site occupancy, breeding population sizes and reproductive success to calculate time-averaged reproductive output of different habitat types: an application to Tricolored Blackbirds.

    Directory of Open Access Journals (Sweden)

    Marcel Holyoak

    Full Text Available In metapopulations in which habitat patches vary in quality and occupancy it can be complicated to calculate the net time-averaged contribution to reproduction of particular populations. Surprisingly, few indices have been proposed for this purpose. We combined occupancy, abundance, frequency of occurrence, and reproductive success to determine the net value of different sites through time and applied this method to a bird of conservation concern. The Tricolored Blackbird (Agelaius tricolor) has experienced large population declines, is the most colonial songbird in North America, is largely confined to California, and breeds itinerantly in multiple habitat types. It has had chronically low reproductive success in recent years. Although young produced per nest have previously been compared across habitats, no study has simultaneously considered site occupancy and reproductive success. Combining occupancy, abundance, frequency of occurrence, reproductive success and nest failure rate, we found that large colonies in grain fields fail frequently because of nest destruction due to harvest prior to fledging. Consequently, net time-averaged reproductive output is low compared to colonies in non-native Himalayan blackberry or thistles, and native stinging nettles. Cattail marshes have intermediate reproductive output, but their reproductive output might be improved by active management. Harvest of grain-field colonies necessitates either promoting delay of harvest or creating alternative, more secure nesting habitats. Stinging nettle and marsh colonies offer the main potential sources for restoration or native habitat creation. From 2005-2011, breeding site occupancy declined 3x faster than new breeding colonies were formed, indicating a rapid decline in occupancy. Total abundance showed a similar decline. Causes of variation in the value for reproduction of nesting substrates and factors behind continuing population declines merit urgent
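
    The kind of combination the abstract describes can be illustrated with a toy calculation (all numbers below are invented for illustration and are not from the study): net time-averaged reproductive output as the product of site occupancy, mean breeding abundance, young produced per nest, and the probability that a colony escapes destruction before fledging.

    ```python
    # Hypothetical illustration; none of these values come from the study.
    # occupancy:      fraction of seasons/sites occupied
    # abundance:      mean number of breeding nests when occupied
    # young_per_nest: young fledged per successful nest
    # failure_rate:   probability a colony is destroyed before fledging

    habitats = {
        "grain field": (0.9, 5000, 1.2, 0.85),  # large colonies, frequent harvest failure
        "blackberry":  (0.6, 1500, 1.5, 0.10),  # smaller but more secure colonies
    }

    def net_output(occupancy, abundance, young_per_nest, failure_rate):
        # time-averaged young produced per season, all factors multiplied
        return occupancy * abundance * young_per_nest * (1.0 - failure_rate)

    for name, params in habitats.items():
        print(f"{name}: {net_output(*params):.0f} young per season")
    ```

    With these invented numbers, the large grain-field colonies contribute less time-averaged output than the smaller blackberry colonies, which is the qualitative pattern the study reports.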

  7. Analysis of successive data sets

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Breeuwer, Marcel; Haselhoff, Eltjo Hans

    2008-01-01

    The invention relates to the analysis of successive data sets. A local intensity variation is formed from such successive data sets, that is, from data values in successive data sets at corresponding positions in each of the data sets. A region of interest is localized in the individual data sets on

  8. Analysis of successive data sets

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Breeuwer, Marcel; Haselhoff, Eltjo Hans

    2002-01-01

    The invention relates to the analysis of successive data sets. A local intensity variation is formed from such successive data sets, that is, from data values in successive data sets at corresponding positions in each of the data sets. A region of interest is localized in the individual data sets on

  9. The seven S's for successful management.

    Science.gov (United States)

    Davidhizar, R

    1995-03-01

    Becoming a successful manager in a health care agency is, for most new managers, an awesome goal. Successful management is more than knowledge of leadership roles and management functions that can be learned in school or educational workshops. Successful management involves effective use of both the manager's affective and cognitive domains. Mentoring and apprenticeship with a successful nurse leader is for many novice managers a highly valuable way to learn management skills since this allows for techniques with a successful nurse manager to be visualized and then modeled. "Seven S's" that provide a framework for managerial success are discussed.

  10. Ingredients for successful partnerships

    NARCIS (Netherlands)

    S.M. Pfisterer (Stella)

    2011-01-01

    textabstractFor the development of new cross-sector partnerships it is required to know what the essence of successful partnership projects is. Which factors influence success or failure of partnerships is highly related to the specific context where partnerships operate. The literature on critical

  11. Examining Management Success Potential.

    Science.gov (United States)

    Quatrano, Louis A.

    The derivation of a model of management success potential in hospitals or health services administration is described. A questionnaire developed to assess management success potential in health administration students was voluntarily completed by approximately 700 incoming graduate students in 35 university health services administration programs…

  12. Successful ageing for psychiatrists.

    Science.gov (United States)

    Peisah, Carmelle

    2016-04-01

    This paper aims to explore the concept and determinants of successful ageing as they apply to psychiatrists as a group, and as they can be applied specifically to individuals. Successful ageing is a heterogeneous, inclusive concept that is subjectively defined. No longer constrained by the notion of "super-ageing", successful ageing can still be achieved in the face of physical and/or mental illness. Accordingly, it remains within the reach of most of us. It can, and should be, person-specific and individually defined, specific to one's bio-psycho-social and occupational circumstances, and importantly, reserves. Successful professional ageing is predicated upon insight into signature strengths, with selection of realistic goal setting and substitution of new goals, given the dynamic nature of these constructs as we age. Other essential elements are generativity and self-care. Given that insight is key, taking a regular stock or inventory of our reserves across bio-psycho-social domains might be helpful. Importantly, for successful ageing, this needs to be suitably matched to the professional task and load. This lends itself to a renewable personal ageing plan, which should be systemically adopted with routine expectations of self-care and professional responsibility. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  13. Post-Mergers and Acquisitions: The Motives, Success Factors and Key Success Indicators

    Directory of Open Access Journals (Sweden)

    Hatem El Zuhairy

    2015-07-01

    Full Text Available There is a wide body of evidence showing a significant increase in the adoption of mergers and acquisitions (M&A worldwide. Moreover, research confirms that the integration and implementation stage (post-M&A has a major impact on the success or failure of a merger or acquisition. Therefore it has become increasingly important to explore the post-M&A phase further in order to support the management teams of organizations pursuing a merger or acquisition in meeting all their desired objectives. This paper proposes a framework to help in the successful execution of M&A. The framework contains three main elements: the motives, success factors and key success indicators (KSI. A qualitative research approach using the multiple case study methodology was conducted to test the framework. Ten case studies were selected from the industrial sector in Egypt and used to validate the research. The final version of the M&A framework was provided after applying the research results. Considering the practical implications of the M&A framework, a tool was proposed for its application in light of the balanced scorecard (BSC methodology. The proposed M&A scorecard tool should be used in the strategic planning and execution of M&A. Both the proposed M&A framework and the M&A scorecard tool should be used to guide the implementation of M&A in order to increase the success rate enjoyed by organizations.

  14. Priority setting: what constitutes success? A conceptual framework for successful priority setting.

    Science.gov (United States)

    Sibbald, Shannon L; Singer, Peter A; Upshur, Ross; Martin, Douglas K

    2009-03-05

    The sustainability of healthcare systems worldwide is threatened by a growing demand for services and expensive innovative technologies. Decision makers struggle in this environment to set priorities appropriately, particularly because they lack consensus about which values should guide their decisions. One way to approach this problem is to determine what all relevant stakeholders understand successful priority setting to mean. The goal of this research was to develop a conceptual framework for successful priority setting. Three separate empirical studies were completed using qualitative data collection methods (one-on-one interviews with healthcare decision makers from across Canada; focus groups with representation of patients, caregivers and policy makers; and Delphi study including scholars and decision makers from five countries). This paper synthesizes the findings from three studies into a framework of ten separate but interconnected elements germane to successful priority setting: stakeholder understanding, shifted priorities/reallocation of resources, decision making quality, stakeholder acceptance and satisfaction, positive externalities, stakeholder engagement, use of explicit process, information management, consideration of values and context, and revision or appeals mechanism. The ten elements specify both quantitative and qualitative dimensions of priority setting and relate to both process and outcome components. To our knowledge, this is the first framework that describes successful priority setting. The ten elements identified in this research provide guidance for decision makers and a common language to discuss priority setting success and work toward improving priority setting efforts.

  15. Biosphere reserves: Attributes for success.

    Science.gov (United States)

    Van Cuong, Chu; Dart, Peter; Hockings, Marc

    2017-03-01

    Biosphere reserves established under the UNESCO Man and the Biosphere Program aim to harmonise biodiversity conservation and sustainable development. Concerns over the extent to which the reserve network was living up to this ideal led to the development of a new strategy in 1995 (the Seville Strategy) to enhance the operation of the network of reserves. An evaluation of effectiveness of management of the biosphere reserve network was called for as part of this strategy. Expert opinion was assembled through a Delphi Process to identify successful and less successful reserves and investigate common factors influencing success or failure. Ninety biosphere reserves including sixty successful and thirty less successful reserves in 42 countries across all five Man and the Biosphere Program regions were identified. Most successful sites are from the post-Seville generation, while the majority of unsuccessful sites are pre-Seville designations that are managed as national parks and have not been amended to conform to the characteristics that are meant to define a biosphere reserve. Stakeholder participation and collaboration, governance, finance and resources, management, and awareness and communication are the most influential factors in the success or failure of the biosphere reserves. For success, the biosphere reserve concept needs to be clearly understood and applied through landscape zoning. Designated reserves then need a management system with inclusive good governance, strong participation and collaboration, adequate finance and human resource allocation, and stable and responsible management and implementation. All rather obvious, but difficult to achieve without commitment to the biosphere reserve concept by the governance authorities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. High-Impact Succession Management. Executive Summary

    Science.gov (United States)

    Lamoureux, Kim; Campbell, Michael; Smith, Roland

    2009-01-01

    Most companies have an opportunity to improve their succession management programs. The number one challenge for succession management (as identified by both HR leaders and executives) is developing a succession planning strategy. This comprehensive industry study sets out to determine how succession management (when done well) helps improve…

  17. Research on Adult Learning and Memory: Retrospect and Prospect.

    Science.gov (United States)

    Hultsch, David F.; Pentz, C. A.

    1980-01-01

    Descriptions of cognitive development are determined by the metamodel on which theories and data are based. The associative and information processing approaches have generated much of the research on adult learning and memory. A contextual approach, emphasizing perceiving, comprehending, and remembering, is emerging in the present historical…

  18. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and reliable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  19. The Evaluation of Preprocessing Choices in Single-Subject BOLD fMRI Using NPAIRS Performance Metrics

    DEFF Research Database (Denmark)

    Stephen, LaConte; Rottenberg, David; Strother, Stephen

    2003-01-01

    to obtain cross-validation-based model performance estimates of prediction accuracy and global reproducibility for various degrees of model complexity. We rely on the concept of an analysis chain meta-model in which all parameters of the preprocessing steps along with the final statistical model are treated...

  20. Models, More Models, and Then a Lot More

    NARCIS (Netherlands)

    Babur, O.; Cleophas, L.; Brand, van den M.; Tekinerdogan, B.; Aksit, M.

    2018-01-01

    With increased adoption of Model-Driven Engineering, the number of related artefacts in use, such as models, metamodels and transformations, greatly increases. To confirm this, we present quantitative evidence from both academia — in terms of repositories and datasets — and industry — in terms of

  1. Efficient approximation of black-box functions and Pareto sets

    NARCIS (Netherlands)

    Rennen, G.

    2009-01-01

    In the case of time-consuming simulation models or other so-called black-box functions, we determine a metamodel which approximates the relation between the input- and output-variables of the simulation model. To solve multi-objective optimization problems, we approximate the Pareto set, i.e. the
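
    The Pareto-set approximation mentioned here ultimately reduces to keeping the non-dominated points among the metamodel-predicted objective vectors. A minimal sketch of that filtering step (the name `pareto_filter` and the brute-force pairwise comparison are illustrative, not the thesis's method):

    ```python
    import numpy as np

    def pareto_filter(costs):
        """Return the non-dominated rows of `costs` (minimization in all
        objectives): a point is dropped if some other point is at least as
        good in every objective and strictly better in at least one."""
        n = len(costs)
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            for j in range(n):
                if i != j and np.all(costs[j] <= costs[i]) and np.any(costs[j] < costs[i]):
                    keep[i] = False
                    break
        return costs[keep]

    points = np.array([[1.0, 2.0], [2.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
    front = pareto_filter(points)  # only [1, 2] and [2, 1] survive
    ```

    In a metamodel-assisted setting, `costs` would hold objective values predicted by the cheap surrogate rather than by the time-consuming simulation itself.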

  2. Modelling bark beetle disturbances in a large scale forest scenario model to assess climate change impacts and evaluate adaptive management strategies

    NARCIS (Netherlands)

    Seidl, R.; Schelhaas, M.J.; Lindner, M.; Lexer, M.J.

    2009-01-01

    To study potential consequences of climate-induced changes in the biotic disturbance regime at regional to national scale we integrated a model of Ips typographus (L. Scol. Col.) damages into the large-scale forest scenario model EFISCEN. A two-stage multivariate statistical meta-model was used to

  3. Strategic Success Of SRI Lankan Government Against LTTE Remains Tentative Despite Military Success

    Science.gov (United States)

    2016-02-13

    approach. While the SLG achieved a military success, it has had difficulty translating the triumph into a strategic success through the application of...2004, based upon the tragedy and impact of a tsunami, the Indonesian government and insurgent forces negotiated a peace settlement for the greater

  4. Motivational and adaptational factors of successful women engineers

    Science.gov (United States)

    Bornsen, Susan Edith

    It is no surprise that there is a shortage of women engineers. The reasons for the shortage have been researched and discussed in myriad papers, and suggestions for improvement continue to evolve. However, there are few studies that have specifically identified the positive aspects that attract women to engineering and keep them actively engaged in the field. This paper examines how women engineers view their education, their work, and their motivation to remain in the field. A qualitative research design was used to understand the motivation and adaptability factors women use to support their decision to major in engineering and stay in the engineering profession. Women engineers were interviewed using broad questions about motivation and adaptability. Interviews were transcribed and coded, looking for common threads of factors that suggest not only why women engineers persist in the field, but also how they thrive. Findings focus on the experiences, insights, and meaning of the women interviewed. A grounded theory approach was used to describe the success factors found in practicing women engineers. The study found categories of attraction to the field, learning environment, motivation and adaptability. Sub-categories of motivation are intrinsic motivational factors such as the desire to make a difference, as well as extrinsic factors such as having an income that allows the kind of lifestyle that supports the family. Women engineers are comfortable with and enjoy working with male peers, and when barriers arise, women learn to adapt in the male-dominated field. Adaptability was indicated in areas of gender, culture, and communication. Women found strength in the ability to 'read' their clients, and provide insight to their teams.
Sufficient knowledge from the field advances theory and offers strategies to engineering-school programs, administrators and faculty, as well as to engineering firms, with an interest in the recruitment and retention of female students.

  5. The concept of the deceased's habitual residence in the European succession regulation / El concepto de residencia habitual del causante en el Reglamento Sucesorio europeo

    Directory of Open Access Journals (Sweden)

    Javier Carrascosa González

    2015-10-01

Full Text Available International successions have often raised controversies in Private International Law. This paper deals with the general ground of jurisdiction of the deceased’s last habitual residence. In this field, the flexible, fluid and changing concept of the last “habitual residence” of the deceased needs an appropriate interpretation both for academics and for practitioners of Private International Law. However, this essay holds that the fluidity of the concept of the deceased’s “habitual residence” may be an advantage, as it confers international jurisdiction on the courts best placed to rule on the merits of the case. Moreover, this paper argues that a careful and holistic interpretation of the text of the Regulation and a proper analysis of the function of this ground of international jurisdiction lead to a surprising conclusion: the concept of “habitual residence” is not as complex and difficult to specify as it might appear at first glance.

  6. Model oriented application generation for industrial control systems

    International Nuclear Information System (INIS)

    Copy, B.; Barillere, R.; Blanco, E.; Fernandez Adiego, B.; Nogueira Fernandes, R.; Prieto Barreiro, I.

    2012-01-01

    The CERN Unified Industrial Control Systems framework (UNICOS) is a software generation methodology and a collection of development tools that standardizes the design of industrial control applications. A Software Factory, named the UNICOS Application Builder (UAB), was introduced to ease extensibility and maintenance of the framework, introducing a stable meta-model, a set of platform-independent models and platform-specific configurations against which code generation plug-ins and configuration generation plug-ins can be written. Such plug-ins currently target PLC programming environments (Schneider and SIEMENS PLCs) as well as SIEMENS WinCC Open Architecture SCADA (previously known as ETM PVSS) but are being expanded to cover more and more aspects of process control systems. We present what constitutes the UNICOS meta-model and the models in use, how these models can be used to capture knowledge about industrial control systems and how this knowledge can be used to generate both code and configuration for a variety of target usages. (authors)
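    The UAB itself is CERN-internal, but the core idea it embodies, generating platform-specific artifacts from a single platform-independent model via per-target plug-ins, can be sketched in a few lines. All names below (device kinds, output syntax, generator functions) are hypothetical illustrations, not the actual UNICOS meta-model or UAB API:

```python
# Minimal sketch of model-driven generation: one platform-independent
# device model, rendered into platform-specific text by pluggable
# generators. All names are hypothetical, not the real UAB plug-ins.

from dataclasses import dataclass

@dataclass
class Device:                      # platform-independent model element
    name: str
    kind: str                      # e.g. "valve", "pump"
    address: int

def plc_generator(dev: Device) -> str:
    # platform-specific rendering for a (hypothetical) PLC target
    return f"DB_{dev.name} : {dev.kind.upper()}_BLOCK AT %DB{dev.address};"

def scada_generator(dev: Device) -> str:
    # platform-specific rendering for a (hypothetical) SCADA target
    return f"datapoint {dev.name} type={dev.kind} addr={dev.address}"

model = [Device("V001", "valve", 10), Device("P001", "pump", 11)]
for gen in (plc_generator, scada_generator):   # one plug-in per target
    for dev in model:
        print(gen(dev))
```

    Adding a new target platform then amounts to writing one more generator function against the same model, which is what makes such a factory easy to extend and maintain.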

  7. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    Science.gov (United States)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

Enacting a supply-chain process involves various partners and different IT systems. REST is receiving increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Based on this resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri-net. Thereafter, XML-net, a high-level Petri-net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Thus, a unified resource representation and RESTful service descriptions enable cross-system integration in a more effective way. A case study is given to illustrate the approach, and its desirable features are discussed.
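    As a minimal illustration of the Petri-net machinery underlying such a framework (places, tokens and the transition firing rule only; the XML-net data handling and RESTful service derivation go beyond a sketch), assuming a toy two-step supply-chain process:

```python
# Minimal Petri-net sketch: places hold tokens, and a transition fires
# when all of its input places are marked, moving tokens to its output
# places. The process steps are invented for illustration.

marking = {"order_received": 1, "stock_checked": 0, "shipped": 0}

transitions = {
    "check_stock": (["order_received"], ["stock_checked"]),
    "ship":        (["stock_checked"], ["shipped"]),
}

def enabled(t):
    inputs, _ = transitions[t]
    return all(marking[p] > 0 for p in inputs)

def fire(t):
    assert enabled(t), f"{t} is not enabled"
    inputs, outputs = transitions[t]
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] += 1

fire("check_stock")
fire("ship")
print(marking)   # {'order_received': 0, 'stock_checked': 0, 'shipped': 1}
```

    A high-level net such as XML-net additionally attaches structured documents to the tokens, so that firing a transition also transforms process data, not just the marking.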

  8. Assessing performance of flaw characterization methods through uncertainty propagation

    Science.gov (United States)

    Miorelli, R.; Le Bourdais, F.; Artusi, X.

    2018-04-01

In this work, we assess inversion performance in terms of crack characterization and localization based on synthetic signals associated with ultrasonic and eddy current physics. More precisely, two different standard iterative inversion algorithms are used to minimize the discrepancy between measurements (i.e., the tested data) and simulations. Furthermore, in order to speed up the computation and avoid the computational burden often associated with iterative inversion algorithms, we replace the standard forward solver with a suitable metamodel fitted to a database built offline. In a second step, we assess the inversion performance by adding uncertainties to a subset of the database parameters and then, through the metamodel, propagating these uncertainties within the inversion procedure. The fast propagation of uncertainties enables efficient evaluation of the impact of the lack of knowledge of some parameters used to describe the inspection scenario, a situation commonly encountered in the industrial NDE context.
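    The workflow described (replace the expensive forward solver with a metamodel built offline, then invert and propagate measurement uncertainty cheaply) can be sketched as follows. The "physics" here is a toy one-parameter function standing in for the ultrasonic/eddy-current solver, and the surrogate is plain piecewise-linear interpolation, not necessarily the metamodel the authors used:

```python
# Sketch of surrogate-assisted inversion with uncertainty propagation.
# expensive_forward() is an invented stand-in for the costly solver.

import random
random.seed(0)

def expensive_forward(depth):          # pretend this call is costly
    return 2.0 * depth + 0.1 * depth ** 2

# Offline: sample the solver once to build a cheap lookup surrogate.
grid = [i * 0.1 for i in range(51)]            # depths 0.0 .. 5.0
table = [(d, expensive_forward(d)) for d in grid]

def surrogate(depth):                  # piecewise-linear interpolation
    for (d0, y0), (d1, y1) in zip(table, table[1:]):
        if d0 <= depth <= d1:
            w = (depth - d0) / (d1 - d0)
            return y0 + w * (y1 - y0)
    raise ValueError("depth outside surrogate range")

def invert(measurement):               # minimise |measurement - surrogate|
    return min(grid, key=lambda d: abs(measurement - surrogate(d)))

# Online: invert many noisy measurements to propagate the uncertainty.
true_depth = 2.5
clean = expensive_forward(true_depth)
estimates = [invert(clean + random.gauss(0, 0.1)) for _ in range(1000)]
mean = sum(estimates) / len(estimates)
print(f"estimated depth ~ {mean:.2f} (true {true_depth})")
```

    The point of the construction is that the thousand inversions above cost only surrogate evaluations; with the real solver in the loop they would be prohibitively slow.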

  9. PROACTIVE APPROACH TO THE INCIDENT AND PROBLEM MANAGEMENT IN COMMUNICATION NETWORKS

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2007-06-01

Full Text Available A proactive approach to communication network maintenance can enhance the integrity and reliability of communication networks, as well as reduce maintenance costs and the overall number of incidents. This paper presents approaches to problem and incident prevention with the help of root-cause analysis, aligned with the goal of foreseeing software performance. Implementing a proactive approach requires recognition of the enterprise's current level of maintenance, better insight into available approaches and tools, as well as their comparison, interoperability, integration and further development. The approach we propose and elaborate in this paper rests on the construction of a metamodel of IT problem management, particularly proactive problem management. The metamodel is derived from the original ITIL specification and presented in an object-oriented fashion using structure (class) diagrams conforming to UML notation. Based on current research, appropriate metrics based on the concept of Key Performance Indicators are suggested.

  10. Capability-Driven Design of Business Service Ecosystem to Support Risk Governance in Regulatory Ecosystems

    Directory of Open Access Journals (Sweden)

    Christophe Feltus

    2017-04-01

Full Text Available Risk-based regulation and risk governance are gaining momentum in most sectorial ecosystems, be it finance, healthcare or telecommunications. Although there is a profusion of tools to address this issue at the corporate level, it is worth noting that no solution yet fulfils this function at the ecosystem level. Therefore, in this article, the Business Service Ecosystem (BSE) metamodel is semantically extended, considering the Capability as a Service (CaaS) theory, in order to raise enterprise risk management from the enterprise level up to the ecosystem level. This extension allows defining a concrete ecosystem metamodel which is afterwards mapped to an information system risk management model to support risk governance at the ecosystem level. This mapping is illustrated and validated on the basis of an application case for the Luxembourgish financial sector, applied to the most important concepts from the BSE: capability, resource, service and goal.

  11. Multicriteria shape design of an aerosol can

    Directory of Open Access Journals (Sweden)

    Benki Aalae

    2015-07-01

Full Text Available One of the current challenges in multicriteria shape optimization is to reduce the calculation time required by conventional methods. The high computational cost is due to the high number of simulation or function calls these methods require. Recently, several studies have been conducted to overcome this problem by integrating a metamodel into the overall optimization loop. In this paper, we couple the Normal Boundary Intersection – NBI – algorithm with a Radial Basis Function – RBF – metamodel in order to obtain a simple tool with a reasonable calculation time for solving multicriteria optimization problems. First, we apply our approach to academic test cases. Then, we validate our method against an industrial case, namely, shape optimization of the bottom of an aerosol can undergoing nonlinear elasto-plastic deformation. Finally, in order to select solutions among the Pareto efficient ones, we use the same surrogate approach to implement a method to compute Nash and Kalai–Smorodinsky equilibria.
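    A Gaussian RBF metamodel of the kind coupled with NBI here can be fitted with a single small linear solve. The sketch below uses a toy 1-D objective in place of the elasto-plastic simulation; the basis width eps and the sampling plan are arbitrary choices for illustration, not the paper's settings:

```python
# Sketch of a radial-basis-function (RBF) surrogate: fit an interpolant
# to a handful of expensive samples, then evaluate it cheaply anywhere.
# The objective is a toy stand-in for the real simulation.

import numpy as np

def expensive(x):                       # pretend each call is costly
    return np.sin(x) + 0.5 * x

centers = np.linspace(0.0, 5.0, 8)      # design-of-experiments samples
y = expensive(centers)

def phi(r, eps=1.0):                    # Gaussian radial basis
    return np.exp(-(eps * r) ** 2)

# Solve Phi @ w = y for the RBF weights.
Phi = phi(np.abs(centers[:, None] - centers[None, :]))
w = np.linalg.solve(Phi, y)

def rbf_surrogate(x):
    return phi(np.abs(x - centers)) @ w

# The surrogate reproduces the samples and approximates between them.
print(expensive(2.3), rbf_surrogate(2.3))
```

    In the optimization loop, NBI then queries `rbf_surrogate` thousands of times at negligible cost instead of calling the finite-element solver.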

  12. Potential of a precrash lateral occupant movement in side collisions of (electric) minicars.

    Science.gov (United States)

    Hierlinger, T; Lienkamp, M; Unger, J; Unselt, T

    2015-01-01

    In minicars, the survival space between the side structure and occupant is smaller than in conventional cars. This is an issue in side collisions. Therefore, in this article a solution is studied in which a lateral seat movement is imposed in the precrash phase. It generates a pre-acceleration and an initial velocity of the occupant, thus reducing the loads due to the side impact. The assessment of the potential is done by numerical simulations and a full-vehicle crash test. The optimal parameters of the restraint system including the precrash movement, time-to-fire of head and side airbag, etc., are found using metamodel-based optimization methods by minimizing occupant loads according to European New Car Assessment Programme (Euro NCAP). The metamodel-based optimization approach is able to tune the restraint system parameters. The numerical simulations show a significant averaged reduction of 22.3% in occupant loads. The results show that the lateral precrash occupant movement offers better occupant protection in side collisions.
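    The metamodel-based tuning step can be illustrated schematically: a fitted response surface stands in for the crash simulations, and the restraint parameters are chosen to minimize the predicted occupant load. All coefficients and parameter ranges below are invented for illustration, not values from the study:

```python
# Sketch of metamodel-based restraint tuning: a toy quadratic response
# surface replaces the crash simulations, and a grid search picks the
# airbag time-to-fire and precrash seat travel minimising the predicted
# occupant-load score (all numbers hypothetical).

def load_metamodel(ttf_ms, precrash_mm):
    # invented surrogate: occupant load vs. time-to-fire and seat travel
    return 1.0 + 0.002 * (ttf_ms - 12) ** 2 + 0.0001 * (precrash_mm - 60) ** 2

candidates = [(ttf, mm) for ttf in range(5, 25) for mm in range(0, 101, 5)]
best = min(candidates, key=lambda p: load_metamodel(*p))
print(f"best time-to-fire {best[0]} ms, seat travel {best[1]} mm")
```

    In practice the surrogate is fitted to a modest number of full crash simulations, and a proper optimizer replaces the grid search, but the division of labour is the same.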

  13. Promoting positive human development and social justice: Integrating theory, research and application in contemporary developmental science.

    Science.gov (United States)

    Lerner, Richard M

    2015-06-01

    The bold claim that developmental science can contribute to both enhancing positive development among diverse individuals across the life span and promoting social justice in their communities, nations and regions is supported by decades of theoretical, methodological and research contributions. To explain the basis of this claim, I describe the relational developmental systems (RDS) metamodel that frames contemporary developmental science, and I present an example of a programme of research within the adolescent portion of the life span that is associated with this metamodel and is pertinent to promoting positive human development. I then discuss methodological issues associated with using RDS-based models as frames for research and application. Finally, I explain how the theoretical and methodological ideas associated with RDS thinking may provide the scholarly tools needed by developmental scientists seeking to contribute to human thriving and to advance social justice in the Global South. © 2015 International Union of Psychological Science.

  14. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrial feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiencies of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping-up production, but can also be applied to enhance established manufacturing.
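    Propagating production-line parameter scatter through a fitted metamodel with Monte Carlo, and attributing the output variance to individual inputs, can be sketched as follows. The linear response surface and the input distributions are invented placeholders, not the paper's PERC process model:

```python
# Sketch of Monte Carlo propagation through a cheap metamodel, with a
# crude variance-based sensitivity measure: how much output variance
# vanishes when one input is held fixed? All coefficients hypothetical.

import random, statistics
random.seed(1)

def efficiency_metamodel(r_contact, tau_bulk):
    # invented fitted response surface: cell efficiency in percent
    return 18.5 - 0.8 * r_contact + 0.4 * tau_bulk

def sample(n, fix_r=None, fix_tau=None):
    out = []
    for _ in range(n):
        r = fix_r if fix_r is not None else random.gauss(1.0, 0.2)
        t = fix_tau if fix_tau is not None else random.gauss(1.0, 0.3)
        out.append(efficiency_metamodel(r, t))
    return out

full = sample(20000)
total_var = statistics.variance(full)

var_without_r = statistics.variance(sample(20000, fix_r=1.0))
var_without_tau = statistics.variance(sample(20000, fix_tau=1.0))
print(f"mean eff {statistics.mean(full):.2f}%, var {total_var:.4f}")
print(f"share due to r_contact ~ {1 - var_without_r / total_var:.2f}")
print(f"share due to tau_bulk  ~ {1 - var_without_tau / total_var:.2f}")
```

    The input whose fixing removes the largest share of output variance is the process step to control most tightly, which is the decision the variance-based analysis in the paper supports.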

  15. The International Successful School Principalship Project: Success Sustained?

    Science.gov (United States)

    Moos, Lejf; Johansson, Olof

    2009-01-01

    Purpose: The purpose of this paper is to synthesize the findings of the follow-up studies of successful school principals in six countries: Australia, Denmark, England, Norway, Sweden, and the USA. Design/methodology/approach: Data were categorized according to stakeholder expectations, the concept and practice of leadership, and the…

  16. A development process meta-model for Web based expert systems: The Web engineering point of view

    DEFF Research Database (Denmark)

    Dokas, I.M.; Alapetite, Alexandre

    2006-01-01

raised their complexity. Unfortunately, there is so far no clear answer to the question: How may the methods and experience of Web engineering and expert systems be combined and applied in order to develop effective and successful Web based expert systems? In an attempt to answer this question...... on Web based expert systems – will be presented. The idea behind the presentation of the accessibility evaluation and its conclusions is to show to Web based expert system developers, who typically have little Web engineering background, that Web engineering issues must be considered when developing Web...... Similar to many legacy computer systems, expert systems can be accessed via the Web, forming a set of Web applications known as Web based expert systems. The tough Web competition, the way people and organizations rely on Web applications and the increasing user requirements for better services have

  17. Human Resource Outsourcing Success

    OpenAIRE

    Hasliza Abdul-Halim; Elaine Ee; T. Ramayah; Noor Hazlina Ahmad

    2014-01-01

    The existing literature on partnership seems to take the relationship between partnership quality and outsourcing success for granted. Therefore, this article aims at examining the role of service quality in strengthening the relationship between partnership quality and human resource (HR) outsourcing success. The samples were obtained from 96 manufacturing organizations in Penang, Malaysia. The results showed that par...

  18. Planning for College Success

    Science.gov (United States)

    PEPNet, 2009

    2009-01-01

    "Planning for College Success" (PCS) is a curriculum model designed by Sharon Downs, M.S., for a course intended to assist deaf and hard of hearing students during their initial introduction to college life. This program allows students to work one-on-one with a counselor to plan for their college success. The program includes short-term goals and…

  19. The Military Assistance Command-Vietnam Studies and Observations Group-A Case Study in Special Operations Campaigning

    Science.gov (United States)

    2016-06-10

realistically rehearsed, and executed with surprise, speed and purpose.” ...objective of special operations rather than the techniques used. For SOF to continue to achieve operational surprise, and to innovate in a constantly ...achieve success in an ever changing environment, they will need to constantly

  20. A top-down approach to construct execution views of a large software-intensive system

    NARCIS (Netherlands)

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    This paper presents an approach to construct execution views, which are views that describe what the software of a software-intensive system does at runtime and how it does it. The approach represents an architecture reconstruction solution based on a metamodel, a set of viewpoints, and a dynamic

  1. The Most Distant Mature Galaxy Cluster - Young, but surprisingly grown-up

    Science.gov (United States)

    2011-03-01

Astronomers have used an armada of telescopes on the ground and in space, including the Very Large Telescope at ESO's Paranal Observatory in Chile, to discover and measure the distance to the most remote mature cluster of galaxies yet found. Although this cluster is seen when the Universe was less than one quarter of its current age, it looks surprisingly similar to galaxy clusters in the current Universe. "We have measured the distance to the most distant mature cluster of galaxies ever found", says the lead author of the study in which the observations from ESO's VLT have been used, Raphael Gobat (CEA, Paris). "The surprising thing is that when we look closely at this galaxy cluster it doesn't look young - many of the galaxies have settled down and don't resemble the usual star-forming galaxies seen in the early Universe." Clusters of galaxies are the largest structures in the Universe that are held together by gravity. Astronomers expect these clusters to grow through time and hence that massive clusters would be rare in the early Universe. Although even more distant clusters have been seen, they appear to be young clusters in the process of formation and are not settled mature systems. The international team of astronomers used the powerful VIMOS and FORS2 instruments on ESO's Very Large Telescope (VLT) to measure the distances to some of the blobs in a curious patch of very faint red objects first observed with the Spitzer Space Telescope. This grouping, named CL J1449+0856 [1], had all the hallmarks of being a very remote cluster of galaxies [2]. The results showed that we are indeed seeing a galaxy cluster as it was when the Universe was about three billion years old - less than one quarter of its current age [3]. Once the team knew the distance to this very rare object, they looked carefully at the component galaxies using both the NASA/ESA Hubble Space Telescope and ground-based telescopes, including the VLT. They found evidence suggesting that most of the

  2. Leadership succession patterns in the apostolic church as a template for critique of contemporary charismatic leadership succession patterns

    Directory of Open Access Journals (Sweden)

    Cephas Tushima

    2016-06-01

Full Text Available The pattern of leadership succession observed globally in most contemporary Pentecostal movements and churches can be characterised as dynastic succession. Yet historic modern Pentecostalism (in the Azusa tradition) prided itself on being biblical. This article explores the biblical sources, examining first the leadership structure and then the leadership succession patterns in the apostolic church as well as the extra-biblical sources of the apostolic patristic era. Our findings from this New Testament (and patristic sources) survey of leadership succession in the apostolic church and post-apostolic churches furnish a template for critical evaluation of the prevalent succession approaches of contemporary Pentecostal groups. Critical elements of apostolic leadership structure and succession patterns are highlighted, and needed inferences are drawn for the re-shaping of leadership and its succession in contemporary Christian ministries and churches. Keywords: Azusa; Apostolic Leadership; Leadership Succession; Pentecostalism; Evangelicals; Leadership Patterns

  3. Self-Presentation Strategies, Fear of Success and Anticipation of Future Success among University and High School Students

    Science.gov (United States)

    Kosakowska-Berezecka, Natasza; Jurek, Paweł; Besta, Tomasz; Badowska, Sylwia

    2017-01-01

    The backlash avoidance model (BAM) suggests women insufficiently self-promote because they fear backlash for behavior which is incongruent with traditional gender roles. Avoiding self-promoting behavior is also potentially related to associating success with negative consequences. In two studies we tested whether self-promotion and fear of success will be predictors of lower salaries and anticipation of lower chances of success in an exam. In study 1, prior to the exam they were about to take, we asked 234 students about their predictions concerning exam results and their future earnings. They also filled scales measuring their associations with success (fear of success) and tendency for self-promotion. The tested model proved that in comparison to men, women expect lower salaries in the future, anticipate lower test performance and associate success with more negative consequences. Both tendency for self-promotion and fear of success are related to anticipation of success in test performance and expectations concerning future earnings. In study 2 we repeated the procedure on a sample of younger female and male high school pupils (N = 100) to verify whether associating success with negative consequences and differences in self-promotion strategies are observable in a younger demographic. Our results show that girls and boys in high school do not differ with regard to fear of success, self-promotion or agency levels. Girls and boys anticipated to obtain similar results in math exam results, but girls expected to have higher results in language exams. Nevertheless, school pupils also differed regarding their future earnings but only in the short term. Fear of success and agency self-ratings were significant predictors of expectations concerning future earnings, but only among high school boys and with regard to earnings expected just after graduation. PMID:29163271

  4. Self-Presentation Strategies, Fear of Success and Anticipation of Future Success among University and High School Students.

    Science.gov (United States)

    Kosakowska-Berezecka, Natasza; Jurek, Paweł; Besta, Tomasz; Badowska, Sylwia

    2017-01-01

    The backlash avoidance model (BAM) suggests women insufficiently self-promote because they fear backlash for behavior which is incongruent with traditional gender roles. Avoiding self-promoting behavior is also potentially related to associating success with negative consequences. In two studies we tested whether self-promotion and fear of success will be predictors of lower salaries and anticipation of lower chances of success in an exam. In study 1, prior to the exam they were about to take, we asked 234 students about their predictions concerning exam results and their future earnings. They also filled scales measuring their associations with success (fear of success) and tendency for self-promotion. The tested model proved that in comparison to men, women expect lower salaries in the future, anticipate lower test performance and associate success with more negative consequences. Both tendency for self-promotion and fear of success are related to anticipation of success in test performance and expectations concerning future earnings. In study 2 we repeated the procedure on a sample of younger female and male high school pupils ( N = 100) to verify whether associating success with negative consequences and differences in self-promotion strategies are observable in a younger demographic. Our results show that girls and boys in high school do not differ with regard to fear of success, self-promotion or agency levels. Girls and boys anticipated to obtain similar results in math exam results, but girls expected to have higher results in language exams. Nevertheless, school pupils also differed regarding their future earnings but only in the short term. Fear of success and agency self-ratings were significant predictors of expectations concerning future earnings, but only among high school boys and with regard to earnings expected just after graduation.

  5. Self-Presentation Strategies, Fear of Success and Anticipation of Future Success among University and High School Students

    Directory of Open Access Journals (Sweden)

    Natasza Kosakowska-Berezecka

    2017-10-01

Full Text Available The backlash avoidance model (BAM) suggests women insufficiently self-promote because they fear backlash for behavior which is incongruent with traditional gender roles. Avoiding self-promoting behavior is also potentially related to associating success with negative consequences. In two studies we tested whether self-promotion and fear of success will be predictors of lower salaries and anticipation of lower chances of success in an exam. In study 1, prior to the exam they were about to take, we asked 234 students about their predictions concerning exam results and their future earnings. They also filled scales measuring their associations with success (fear of success) and tendency for self-promotion. The tested model proved that in comparison to men, women expect lower salaries in the future, anticipate lower test performance and associate success with more negative consequences. Both tendency for self-promotion and fear of success are related to anticipation of success in test performance and expectations concerning future earnings. In study 2 we repeated the procedure on a sample of younger female and male high school pupils (N = 100) to verify whether associating success with negative consequences and differences in self-promotion strategies are observable in a younger demographic. Our results show that girls and boys in high school do not differ with regard to fear of success, self-promotion or agency levels. Girls and boys anticipated to obtain similar results in math exam results, but girls expected to have higher results in language exams. Nevertheless, school pupils also differed regarding their future earnings but only in the short term. Fear of success and agency self-ratings were significant predictors of expectations concerning future earnings, but only among high school boys and with regard to earnings expected just after graduation.

  6. Healthcare succession planning: an integrative review.

    Science.gov (United States)

    Carriere, Brian K; Muise, Melanie; Cummings, Greta; Newburn-Cook, Chris

    2009-12-01

    Succession planning is a business strategy that has recently gained attention in the healthcare literature, primarily because of nursing shortage concerns and the demand for retaining knowledgeable personnel to meet organizational needs. Little research has been conducted in healthcare settings that clearly defines best practices for succession planning frameworks. To effectively carry out such organizational strategies during these challenging times, an integrative review of succession planning in healthcare was performed to identify consistencies in theoretical approaches and strategies for chief nursing officers and healthcare managers to initiate. Selected articles were compared with business succession planning to determine whether healthcare strategies were similar to best practices already established in business contexts. The results of this integrative review will aid leaders and managers to use succession planning as a tool in their recruitment, retention, mentoring, and administration activities and also provide insights for future development of healthcare succession planning frameworks.

  7. Critical success factors of Indian Entrepreneurs

    Directory of Open Access Journals (Sweden)

    Alex Antonites

    2013-12-01

Full Text Available This research seeks to explore the critical success factors that influence the success of Indian small business owners in the largest metropolitan area in South Africa. To achieve this, the objective of the study was to confirm whether there are significant differences between a successful and a less successful group of business owners in terms of general management skills, personal characteristics, entrepreneurial orientation and financing of the business. Through analysing secondary evidence and empirical results, it was possible to facilitate a better understanding of how Indian entrepreneurs operating in small and medium enterprises sustain success, thus contributing to the body of knowledge relating to entrepreneurship development. From the literature it became clear that cultural dimensions have an impact on the entrepreneurial process. The arrival of Indians in South Africa has contributed to a unique Indian culture. The characteristics that describe ethnic entrepreneurs and the success factors attributed to their success are described. Small and medium enterprises (SMEs) are crucial for the development of any country as they offer the benefits of economic growth and employment generation. The success factors that sustain SMEs are also described. The findings of the study indicate that there are no significant differences between the comparable groups in relation to management skills and finance factors. There are, however, significant differences relating to personal factors, such as level of education, family support and experience. Finally, an important finding is that the Indian entrepreneurs in this study are similar to the ethnic entrepreneurs reviewed in the literature. The study was conducted in Tshwane, the largest metropolitan area in South Africa, and amongst the largest in the world.
Keywords: Culture, ethnic entrepreneurship, Indian entrepreneurship, critical success factors, small and medium enterprises

  8. Exotic B=2 states in the SU(2) Skyrme model and other recent results in the B=1 sector

    International Nuclear Information System (INIS)

    Schwesinger, B.

    1986-01-01

Effective theories with surprising phenomenological success immediately prompt the suspicion that they are intimately connected to a more fundamental theory. In the case of the Skyrme model things went the other way round: first there was the finding that the large-N_c limit of QCD results in an effective theory of free mesons in which baryons emerge as solitons of the meson fields. Subsequently the long-forgotten Skyrme model was unearthed by Witten as a possible candidate for such a theory. Examined in the light of its phenomenological capabilities, the Skyrme model led to the surprising success it enjoys to this day. (orig./BBOE)

  9. Critical success factors in ERP implementation

    Directory of Open Access Journals (Sweden)

    Blerta Abazi Chaushi

    2016-11-01

Full Text Available This study conducts a state-of-the-art literature review of critical success factors for enterprise resource planning (ERP) systems implementation success. Since research on critical success factors for ERP implementation success is rare and fragmented, this study provides a more comprehensive list of ten factors that companies struggling with an adopted ERP implementation, as well as companies considering implementation of an ERP system, can easily adopt and follow. The main contribution of this paper is that these ten new critical success factors are identified through a thorough analysis of 22 selected research papers, making the list more comprehensive and straightforward to employ.

  10. Bangladesh becomes "success story".

    Science.gov (United States)

    1999-01-01

    The State Minister for Health and Family of Bangladesh, Dr. Mohammed Amanullah, highlighted some of the successes being achieved by his country in lowering fertility and improving the lives of the people since the 1994 International Conference on Population and Development. Some of these successes include practical measures to eliminate violence against women; introduction of a quota for women in public sector employment; and launching of the Health and Population Sector Program to provide a one-stop, full range of essential reproductive health, family planning and child health services through an integrated delivery mechanism. Moreover, the Minister informed the Forum participants that their success is attributable to many factors which include support from the government, from non-governmental organizations, civil society, mass media, religious and other community leaders, intersectoral collaboration, microcredit and income-generation activities.

  11. Citation Success

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis

    affects citations. In regard to author-specific characteristics, male authors, full professors, authors working in economics or history departments, and authors employed in Anglo-Saxon countries are more likely to get cited than others. As a 'shortcut' to citation success, we find that research diffusion...

  12. Successful ageing

    DEFF Research Database (Denmark)

    Kusumastuti, Sasmita; Derks, Marloes G. M.; Tellier, Siri

    2016-01-01

    BACKGROUND: Ageing is accompanied by an increased risk of disease and a loss of functioning on several bodily and mental domains and some argue that maintaining health and functioning is essential for a successful old age. Paradoxically, studies have shown that overall wellbeing follows a curvili...

  13. Grammar Maturity Model

    NARCIS (Netherlands)

    Zaytsev, V.; Pierantonio, A.; Schätz, B.; Tamzalit, D.

    2014-01-01

    The evolution of a software language (whether modelled by a grammar or a schema or a metamodel) is not limited to development of new versions and dialects. An important dimension of a software language evolution is maturing in the sense of improving the quality of its definition. In this paper, we

  14. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    Science.gov (United States)

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…

  15. Teaching Assistants, Neuro-Linguistic Programming (NLP) and Special Educational Needs: "Reframing" the Learning Experience for Students with Mild SEN

    Science.gov (United States)

    Kudliskis, Voldis

    2014-01-01

    This study examines how an understanding of two NLP concepts, the meta-model of language and the implementation of reframing, could be used to help teaching assistants enhance class-based interactions with students with mild SEN. Participants (students) completed a pre-intervention and a post-intervention "Beliefs About my Learning…

  16. A meta-model perspective on business models

    NARCIS (Netherlands)

    Alberts, Berend Thomas; Meertens, Lucas Onno; Iacob, Maria Eugenia; Nieuwenhuis, Lambertus Johannes Maria; Shishkov, Boris

    2013-01-01

    The business model field of research is a young and emerging discipline that finds itself confronted with the need for a common language, lack of conceptual consolidation, and without adequate theoretical development. This not only slows down research, but also undermines business model’s usefulness

  17. A Metamodeling Approach for Reasoning about Requirements

    NARCIS (Netherlands)

    Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; Schieferdecker, I.; Hartman, A.

    In requirements engineering, there are several approaches for requirements modeling such as goal-oriented, aspect-driven, and system requirements modeling. In practice, companies often customize a given approach to their specific needs. Thus, we seek a solution that allows customization in a

  18. Metamodel-based robust simulation-optimization : An overview

    NARCIS (Netherlands)

    Dellino, G.; Meloni, C.; Kleijnen, J.P.C.; Dellino, Gabriella; Meloni, Carlo

    2015-01-01

    Optimization of simulated systems is the goal of many methods, but most methods assume known environments. We, however, develop a "robust" methodology that accounts for uncertain environments. Our methodology uses Taguchi's view of the uncertain world but replaces his statistical techniques by

  19. Successful modeling?

    Science.gov (United States)

    Lomnitz, Cinna

    Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.

  20. Human capital and career success

    DEFF Research Database (Denmark)

    Frederiksen, Anders; Kato, Takao

    capital formally through schooling for career success, as well as the gender gap in career success rates. Second, broadening the scope of human capital by experiencing various occupations (becoming a generalist) is found to be advantageous for career success. Third, initial human capital earned through......Denmark’s registry data provide accurate and complete career history data along with detailed personal characteristics (e.g., education, gender, work experience, tenure and others) for the population of Danish workers longitudinally. By using such data from 1992 to 2002, we provide rigorous...... formal schooling and subsequent human capital obtained informally on the job are found to be complements in the production of career success. Fourth, though there is a large body of the literature on the relationship between firm-specific human capital and wages, the relative value of firm-specific human...

  1. A Model of Successful School Leadership from the International Successful School Principalship Project

    Directory of Open Access Journals (Sweden)

    David Gurr

    2015-03-01

    Full Text Available The International Successful School Principalship Project (ISSPP has been actively conducting research about the work of successful principals since 2001. Findings from four project books and eight models derived from this project are synthesised into a model of successful school leadership. Building on Gurr, Drysdale and Mulford’s earlier model, the work of school leaders is described as engaging within the school context to influence student and school outcomes through interventions in teaching and learning, school capacity building, and the wider context. The qualities a leader brings to their role, a portfolio approach to using leadership ideas, constructing networks, collaborations and partnerships, and utilising accountability and evaluation for evidence-informed improvement, are important additional elements. The model is applicable to all in leadership roles in schools.

  2. FE Model Updating on an In-Service Self-Anchored Suspension Bridge with Extra-Width Using Hybrid Method

    Directory of Open Access Journals (Sweden)

    Zhiyuan Xia

    2017-02-01

    Full Text Available Nowadays, many more bridges with extra width are needed for vehicle throughput. In order to obtain a precise finite element (FE) model of such complex bridge structures, a practical hybrid updating method integrating Gaussian mutation particle swarm optimization (GMPSO), a Kriging meta-model and Latin hypercube sampling (LHS) was proposed. After demonstrating the efficiency and accuracy of the hybrid method through the model updating of a damaged simply supported beam, the proposed method was applied to the model updating of a self-anchored suspension bridge with extra width, the necessity of which was shown by the results of an ambient vibration test. The results of the bridge model updating showed that both the mode frequencies and the mode shapes agreed closely between the updated model and the experimental structure. The successful model updating of this bridge fills in the blanks of model updating of complex self-anchored suspension bridges. Moreover, the updating process enables other model updating issues for complex bridge structures
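    The Latin hypercube sampling step used in hybrid updating schemes like the one above can be illustrated in a few lines. This is a generic sketch on the unit cube, not code from the paper; the function name and the sample/parameter counts are invented for illustration.

```python
# Illustrative sketch only: a generic Latin hypercube sampling (LHS)
# routine on the unit cube; names and sizes are not from the paper.
import random

def latin_hypercube(n_samples, n_dims, rng=random.Random(42)):
    # Each dimension is split into n_samples equal strata; every stratum
    # contributes exactly one point, and strata are shuffled across rows.
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        points = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(points)
        for i in range(n_samples):
            samples[i][d] = points[i]
    return samples

# e.g. 5 candidate parameter sets for 3 uncertain FE-model parameters
design = latin_hypercube(5, 3)
```

    The stratification is what makes LHS attractive for training a Kriging meta-model: even a small design covers every marginal range of each parameter.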

  3. A Solution to the Flowgraphs Case Study using Triple Graph Grammars and eMoflon

    Directory of Open Access Journals (Sweden)

    Anthony Anjorin

    2013-11-01

    Full Text Available After 20 years of Triple Graph Grammars (TGGs) and numerous actively maintained implementations, there is now a need for challenging examples and success stories to show that TGGs can be used for real-world bidirectional model transformations. Our primary goal in recent years has been to increase the expressiveness of TGGs by providing a set of pragmatic features that allow a controlled fallback to programmed graph transformations and Java. Based on the Flowgraphs case study of the Transformation Tool Contest (TTC 2013), we present (i) attribute constraints used to express complex bidirectional attribute manipulation, (ii) binding expressions for specifying arbitrary context relationships, and (iii) post-processing methods as a black box extension for TGG rules. In each case, we discuss the enabled trade-off between guaranteed formal properties and expressiveness. Our solution, implemented with our metamodelling and model transformation tool eMoflon (www.emoflon.org), is available as a virtual machine hosted on Share.

  4. Successful aging: considering non-biomedical constructs

    Directory of Open Access Journals (Sweden)

    Carver LF

    2016-11-01

    Full Text Available Lisa F Carver,1 Diane Buchanan2 1Department of Sociology, Queen’s University Kingston, ON, Canada; 2School of Nursing, Queen’s University Kingston, ON, Canada Objectives: Successful aging continues to be applied in a variety of contexts and is defined using a number of different constructs. Although previous reviews highlight the multidimensionality of successful aging, a few have focused exclusively on non-biomedical factors, as was done here. Methods: This scoping review searched Ovid Medline database for peer-reviewed English-language articles published between 2006 and 2015, offering a model of successful aging and involving research with older adults. Results: Seventy-two articles were reviewed. Thirty-five articles met the inclusion criteria. Common non-biomedical constructs associated with successful aging included engagement, optimism and/or positive attitude, resilience, spirituality and/or religiosity, self-efficacy and/or self-esteem, and gerotranscendence. Discussion: Successful aging is a complex process best described using a multidimensional model. Given that the majority of elders will experience illness and/or disease during the life course, public health initiatives that promote successful aging need to employ non-biomedical constructs, facilitating the inclusion of elders living with disease and/or disability. Keywords: successful aging, resilience, gerotranscendence, engagement, optimism

  5. Success tree method of resources evaluation

    International Nuclear Information System (INIS)

    Chen Qinglan; Sun Wenpeng

    1994-01-01

    By applying reliability theory from systems engineering, the success tree method is used to transfer experts' recognition of metallogenetic regularities into the form of a success tree. The aim of resources evaluation is achieved by calculating the metallogenetic probability or favorability of the top event of the success tree. This article introduces in detail the source and principle of the success tree method and three kinds of calculation methods, and expounds concretely how to establish the success tree of comprehensive uranium metallogenesis as well as the procedure by which the resources evaluation is performed. Because this method places no restrictions on the number of known deposits or the size of the calculated area, it is applicable to resources evaluation for different mineral species, types and scales and possesses good prospects of development
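    The top-event calculation described above can be sketched with the standard gate formulas, assuming independent basic events. The factor names and probabilities below are invented for illustration and are not from the article.

```python
# Illustrative sketch only: evaluating the top event of a success tree
# under the usual independence assumption; the factor names and
# probabilities below are invented for illustration.

def gate_and(*probs):
    # AND gate: succeeds only if every child event succeeds.
    p = 1.0
    for x in probs:
        p *= x
    return p

def gate_or(*probs):
    # OR gate: succeeds if at least one child event succeeds.
    q = 1.0
    for x in probs:
        q *= 1.0 - x
    return 1.0 - q

source    = 0.8                  # source rocks present (hypothetical)
transport = 0.6                  # transport/enrichment conditions met
trap      = gate_or(0.5, 0.3)    # either of two trapping mechanisms
top_event = gate_and(source, transport, trap)
print(round(top_event, 3))  # 0.312
```

    The same two gate functions compose to arbitrary tree depth, which is why the method scales to different mineral types and areas.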

  6. Success and Women's Career Adjustment.

    Science.gov (United States)

    Russell, Joyce E. A.; Burgess, Jennifer R. D.

    1998-01-01

    Women still face barriers to career success and satisfaction: stereotypes, assumptions, organizational culture, human resource practices, and lack of opportunities. Despite individual and organizational strategies, many women leave to become entrepreneurs. There is a need to investigate how women define career success. (SK)

  7. Four malaria success stories: how malaria burden was successfully reduced in Brazil, Eritrea, India, and Vietnam.

    Science.gov (United States)

    Barat, Lawrence M

    2006-01-01

    While many countries struggle to control malaria, four countries, Brazil, Eritrea, India, and Vietnam, have successfully reduced malaria burden. To determine what led these countries to achieve impact, published and unpublished reports were reviewed and selected program and partner staff were interviewed to identify common factors that contributed to these successes. Common success factors included conducive country conditions, a targeted technical approach using a package of effective tools, data-driven decision-making, active leadership at all levels of government, involvement of communities, decentralized implementation and control of finances, skilled technical and managerial capacity at national and sub-national levels, hands-on technical and programmatic support from partner agencies, and sufficient and flexible financing. All these factors were essential in achieving success. If the goals of Roll Back Malaria are to be achieved, governments and their partners must take the lessons learned from these program successes and apply them in other affected countries.

  8. MODEL OF TRAINING OF SUCCESS IN LIFE

    Directory of Open Access Journals (Sweden)

    Екатерина Александровна Лежнева

    2014-04-01

    Full Text Available The article explains the importance of developing the motive to achieve success in adolescence and determines the value of this motive for the further development of the teenager: an effective achievement motive mobilizes internal forces for successful activity and ensures the active involvement of teenagers in social and interpersonal relationships. Training is considered the primary means of developing the achievement motive. The author provides a definition of "training for success in life," creates a model of such training, describes its units (targeted, informative, technological, productive), and reveals the life-strategy development techniques used during the training (self-presentation, targets, incentives, subject-orientation). The author draws attention to the need for a future psychologist to develop teenagers' motive to achieve success through mastery of competence in constructing a model of training for success in life and implementing it in the course of professional activities. The main means of preparing students of psychology to use training for success in life are identified as additional educational programs and a psychological section. DOI: http://dx.doi.org/10.12731/2218-7405-2013-9-77

  9. Organizational Climate for Successful Aging

    Science.gov (United States)

    Zacher, Hannes; Yang, Jie

    2016-01-01

    Research on successful aging at work has neglected contextual resources such as organizational climate, which refers to employees’ shared perceptions of their work environment. We introduce the construct of organizational climate for successful aging (OCSA) and examine it as a buffer of the negative relationship between employee age and focus on opportunities (i.e., beliefs about future goals and possibilities at work). Moreover, we expected that focus on opportunities, in turn, positively predicts job satisfaction, organizational commitment, and motivation to continue working after official retirement age. Data came from 649 employees working in 120 companies (M age = 44 years, SD = 13). We controlled for organizational tenure, psychological climate for successful aging (i.e., individuals’ perceptions), and psychological and organizational age discrimination climate. Results of multilevel analyses supported our hypotheses. Overall, our findings suggest that OCSA is an important contextual resource for successful aging at work. PMID:27458405

  11. Nurse manager succession planning: a concept analysis.

    Science.gov (United States)

    Titzer, Jennifer L; Shirey, Maria R

    2013-01-01

    The current nursing leadership pipeline is inadequate and demands strategic succession planning methods. This article provides concept clarification regarding nurse manager succession planning. Attributes common to succession planning include organizational commitment and resource allocation, proactive and visionary leadership approach, and a mentoring and coaching environment. Strategic planning, current and future leadership analysis, high-potential identification, and leadership development are succession planning antecedents. Consequences of succession planning are improved leadership and organizational culture continuity, and increased leadership bench strength. Health care has failed to strategically plan for future leadership. Developing a strong nursing leadership pipeline requires deliberate and strategic succession planning. © 2013 Wiley Periodicals, Inc.

  12. Status and Mating Success Amongst Visual Artists

    Science.gov (United States)

    Clegg, Helen; Nettle, Daniel; Miell, Dorothy

    2011-01-01

    Geoffrey Miller has hypothesized that producing artwork functions as a mating display. Here we investigate the relationship between mating success and artistic success in a sample of 236 visual artists. Initially, we derived a measure of artistic success that covered a broad range of artistic behaviors and beliefs. As predicted by Miller’s evolutionary theory, more successful male artists had more sexual partners than less successful artists but this did not hold for female artists. Also, male artists with greater artistic success had a mating strategy based on longer term relationships. Overall the results provide partial support for the sexual selection hypothesis for the function of visual art. PMID:22059085

  13. Endoscopic Third Ventriculostomy: Success and Failure.

    Science.gov (United States)

    Deopujari, Chandrashekhar E; Karmarkar, Vikram S; Shaikh, Salman T

    2017-05-01

    Endoscopic third ventriculostomy (ETV) has now become an accepted mode of hydrocephalus treatment in children. Varying degrees of success for the procedure have been reported depending on the type and etiology of hydrocephalus, age of the patient and certain technical parameters. Review of these factors for predictability of success, complications and validation of success score is presented.

  14. Succession Planning in Australian Farming

    Directory of Open Access Journals (Sweden)

    John Hicks

    2012-11-01

    Full Text Available The theme of this paper is that succession planning in Australian farming is under-developed. It may be linked to economic and social change which suggests that farmers need to adapt to generational change but this is being resisted or ignored. The implications of this are the slow decline of family farming, a poor transfer of skills and knowledge to subsequent generations of farmers in some parts of the agricultural sector and the potential for an extension of the financial services industry to develop a more effective raft of succession planning measures to mitigate the effects of a traditional approach to succession in agriculture.

  15. Qualitative Characteristic of Sociability in Groups of Successful and Less Successful Learners of Foreign Languages

    Directory of Open Access Journals (Sweden)

    G V Zarembo

    2011-06-01

    Full Text Available The article deals with a qualitative characteristic of sociability in groups of successful and less successful learners of foreign languages. A statistical estimate of the differences between the average figures on sociability and effectiveness in second language learning is also given.

  16. Investigating critical success factors in tile industry

    Directory of Open Access Journals (Sweden)

    Davood Salmani

    2014-04-01

    Full Text Available This paper presents an empirical investigation to determine critical success factors influencing the success of the tile industry in Iran. The study designs a Likert-scale questionnaire and distributes it among experts in the tile industry. Using the Pearson correlation test, the study detected a positive and meaningful relationship between marketing planning and the success of the tile industry (r = 0.312, Sig. = 0.001). However, there was no meaningful relationship between low-cost production and the success of the tile industry (r = 0.13, Sig. = 0.12), while there was a positive and meaningful relationship between organizational capabilities and the success of the tile industry (r = 0.635, Sig. = 0.000). Our investigation further states that technology and distribution systems also positively influence the success of the tile industry. Finally, the study used five regression analyses in which the success of the tile industry was the dependent variable and marketing planning, low-cost production and organizational capabilities were independent variables; the results confirmed positive and meaningful relationships between the success of the tile industry and all independent variables.
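    The Pearson correlation coefficient reported in studies like this one is straightforward to compute. The sketch below is a generic illustration; the Likert-scale scores are invented and are not the study's data.

```python
# Illustrative sketch only: the sample Pearson correlation coefficient;
# the Likert-scale data below are invented, not from the study.
import math

def pearson_r(xs, ys):
    # r = cov(x, y) / (sd(x) * sd(y)), computed from deviations.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

marketing = [3, 4, 2, 5, 4, 3, 5]   # hypothetical questionnaire scores
success   = [2, 4, 2, 5, 3, 3, 4]
r = pearson_r(marketing, success)
```

    A significance value (the "Sig." reported above) would additionally require a t-test on r, which statistical packages provide alongside the coefficient.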

  17. Succession Planning for Library Instruction

    Science.gov (United States)

    Sobel, Karen; Drewry, Josiah

    2015-01-01

    Detailed succession planning helps libraries pass information from one employee to the next. This is crucial in preparing for hiring, turnover, retirements, training of graduate teaching assistants in academic libraries, and other common situations. The authors of this article discuss succession planning for instruction programs in academic…

  18. Surprising Ripple Effects: How Changing the SAT Score-Sending Policy for Low-Income Students Impacts College Access and Success

    Science.gov (United States)

    Hurwitz, Michael; Mbekeani, Preeya P.; Nipson, Margaret M.; Page, Lindsay C.

    2017-01-01

    Subtle policy adjustments can induce relatively large "ripple effects." We evaluate a College Board initiative that increased the number of free SAT score reports available to low-income students and changed the time horizon for using these score reports. Using a difference-in-differences analytic strategy, we estimate that targeted…

  19. Higher survival drives the success of nitrogen-fixing trees through succession in Costa Rican rainforests.

    Science.gov (United States)

    Menge, Duncan N L; Chazdon, Robin L

    2016-02-01

    Trees capable of symbiotic nitrogen (N) fixation ('N fixers') are abundant in many tropical forests. In temperate forests, it is well known that N fixers specialize in early-successional niches, but in tropical forests, successional trends of N-fixing species are poorly understood. We used a long-term census study (1997-2013) of regenerating lowland wet tropical forests in Costa Rica to document successional patterns of N fixers vs non-fixers, and used an individual-based model to determine the demographic drivers of these trends. N fixers increased in relative basal area during succession. In the youngest forests, N fixers grew 2.5 times faster, recruited at a similar rate and were 15 times less likely to die as non-fixers. As succession proceeded, the growth and survival disparities decreased, whereas N fixer recruitment decreased relative to non-fixers. According to our individual-based model, high survival was the dominant driver of the increase in basal area of N fixers. Our data suggest that N fixers are successful throughout secondary succession in tropical rainforests of north-east Costa Rica, and that attempts to understand this success should focus on tree survival. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
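    An individual-based model of the kind used above can be sketched very compactly: two guilds with equal recruitment but different annual survival. All rates below are invented for illustration and are not the study's fitted parameters.

```python
# Illustrative sketch only: a toy individual-based model showing how a
# survival advantage alone can raise the relative abundance of N fixers;
# every rate here is invented, not fitted from the census data.
import random

def simulate(p_survive_fixer, p_survive_other, years=30, n0=500, seed=1):
    rng = random.Random(seed)
    fixers, others = n0, n0
    fractions = []
    for _ in range(years):
        # Annual Bernoulli survival draw for every individual per guild.
        fixers = sum(rng.random() < p_survive_fixer for _ in range(fixers))
        others = sum(rng.random() < p_survive_other for _ in range(others))
        fixers += 20   # equal recruitment for both guilds
        others += 20
        fractions.append(fixers / (fixers + others))
    return fractions

trend = simulate(0.97, 0.90)   # N fixers die less often
```

    With identical recruitment, the guild with higher survival drifts toward a larger equilibrium share, mirroring the paper's conclusion that survival, not growth or recruitment, drives the basal-area increase.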

  20. Biofilm community succession: a neutral perspective.

    Science.gov (United States)

    Woodcock, Stephen; Sloan, William T

    2017-05-22

    Although biofilms represent one of the dominant forms of life in aqueous environments, our understanding of the assembly and development of their microbial communities remains relatively poor. In recent years, several studies have addressed this and have extended the concepts of succession theory in classical ecology into microbial systems. From these datasets, niche-based conceptual models have been developed explaining observed biodiversity patterns and their dynamics. These models have not, however, been formulated mathematically and so remain untested. Here, we further develop spatially resolved neutral community models and demonstrate that these can also explain these patterns and offer alternative explanations of microbial succession. The success of neutral models suggests that stochastic effects alone may have a much greater influence on microbial community succession than previously acknowledged. Furthermore, such models are much more readily parameterised and can be used as the foundation of more complex and realistic models of microbial community succession.
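    A minimal neutral community model of the sort discussed above can be written in a few lines: individuals die at random and are replaced either by a local birth or by an immigrant, with no species differences. The pool size, community size, and immigration rate below are invented for illustration.

```python
# Illustrative sketch only: a minimal neutral community model (death,
# then replacement by a local birth or an immigrant); the parameters
# and species pool are invented.
import random

rng = random.Random(0)
SOURCE_SPECIES = list(range(10))   # regional species pool

def neutral_step(community, immigration=0.05):
    # One individual dies and is replaced at random, ignoring all
    # species differences -- the defining neutral assumption.
    i = rng.randrange(len(community))
    if rng.random() < immigration:
        community[i] = rng.choice(SOURCE_SPECIES)
    else:
        community[i] = rng.choice(community)

community = [0] * 200          # young biofilm: a single coloniser
for _ in range(20000):
    neutral_step(community)
richness = len(set(community))
```

    Running such a model forward from a monodominant start produces richness trajectories that can be compared directly against observed biofilm succession data, which is how neutral and niche-based explanations are contrasted.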

  1. Preparing the Dutch delta for future droughts: model based support in the national Delta Programme

    Science.gov (United States)

    ter Maat, Judith; Haasnoot, Marjolijn; van der Vat, Marnix; Hunink, Joachim; Prinsen, Geert; Visser, Martijn

    2014-05-01

    Keywords: uncertainty, policymaking, adaptive policies, fresh water management, droughts, Netherlands, Dutch Deltaprogramme, physically-based complex model, theory-motivated meta-model. To prepare the Dutch Delta for future droughts and water scarcity, a nation-wide 4-year project, called Delta Programme, is established to assess impacts of climate scenarios and socio-economic developments and to explore policy options. The results should contribute to a national adaptive plan that is able to adapt to future uncertain conditions, if necessary. For this purpose, we followed a model-based step-wise approach, wherein both physically-based complex models and theory-motivated meta-models were used. First step (2010-2011) was to make a quantitative problem description. This involved a sensitivity analysis of the water system for drought situations under current and future conditions. The comprehensive Dutch national hydrological instrument was used for this purpose and further developed. Secondly (2011-2012) our main focus was on making an inventory of potential actions together with stakeholders. We assessed efficacy, sell-by date of actions, and reassessed vulnerabilities and opportunities for the future water supply system if actions were (not) taken. A rapid assessment meta-model was made based on the complex model. The effects of all potential measures were included in the tool. Thirdly (2012-2013), with support of the rapid assessment model, we assessed the efficacy of policy actions over time for an ensemble of possible futures including sea level rise and climate and land use change. Last step (2013-2014) involves the selection of preferred actions from a set of promising actions that meet the defined objectives. These actions are all modeled and evaluated using the complex model. The outcome of the process will be an adaptive management plan. The adaptive plan describes a set of preferred policy pathways - sequences of policy actions - to achieve targets under

  2. Landscape structure and management alter the outcome of a pesticide ERA: Evaluating impacts of endocrine disruption using the ALMaSS European Brown Hare model.

    Science.gov (United States)

    Topping, Chris J; Dalby, Lars; Skov, Flemming

    2016-01-15

    There is a gradual change towards explicitly considering landscapes in regulatory risk assessment. To realise the objective of developing representative scenarios for risk assessment it is necessary to know how detailed a landscape representation is needed to generate a realistic risk assessment, and indeed how to generate such landscapes. This paper evaluates the contribution of landscape and farming components to a model based risk assessment of a fictitious endocrine disruptor on hares. In addition, we present methods and code examples for generation of landscape structures and farming simulation from data collected primarily for EU agricultural subsidy support and GIS map data. Ten different Danish landscapes were generated and the ERA carried out for each landscape using two different assumed toxicities. The results showed negative impacts in all cases, but the extent and form in terms of impacts on abundance or occupancy differed greatly between landscapes. A meta-model was created, predicting impact from landscape and farming characteristics. Scenarios based on all combinations of farming and landscape for five landscapes representing extreme and middle impacts were created. The meta-models developed from the 10 real landscapes failed to predict impacts for these 25 scenarios. Landscape, farming, and the emergent density of hares all influenced the results of the risk assessment considerably. The study indicates that prediction of a reasonable worst case scenario is difficult from structural, farming or population metrics; rather the emergent properties generated from interactions between landscape, management and ecology are needed. Meta-modelling may also fail to predict impacts, even when restricting inputs to combinations of those used to create the model. Future ERA may therefore need to make use of multiple scenarios representing a wide range of conditions to avoid locally unacceptable risks. This approach could now be feasible Europe wide given the

  3. Comparing fire spread algorithms using equivalence testing and neutral landscape models

    Science.gov (United States)

    Brian R. Miranda; Brian R. Sturtevant; Jian Yang; Eric J. Gustafson

    2009-01-01

    We demonstrate a method to evaluate the degree to which a meta-model approximates spatial disturbance processes represented by a more detailed model across a range of landscape conditions, using neutral landscapes and equivalence testing. We illustrate this approach by comparing burn patterns produced by a relatively simple fire spread algorithm with those generated by...
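    Equivalence testing as used above inverts the usual null hypothesis: two models are declared equivalent only if the confidence interval for their mean difference falls entirely inside a pre-chosen margin. The sketch below uses the normal-approximation CI shortcut for the two one-sided tests (TOST); the per-landscape differences and the margin are invented numbers, not the study's data.

```python
# Illustrative sketch only: the confidence-interval shortcut for
# equivalence testing (TOST); the differences and margin are invented.
import math

def tost_equivalent(diffs, margin, z=1.6449):
    # Equivalent at alpha = 0.05 if the 90% normal-approximation CI for
    # the mean difference lies entirely inside (-margin, +margin).
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    se = sd / math.sqrt(n)
    lo, hi = mean - z * se, mean + z * se
    return -margin < lo and hi < margin

diffs = [0.4, -0.2, 0.1, 0.3, -0.1, 0.0, 0.2, -0.3]  # burn-pattern gaps
equivalent = tost_equivalent(diffs, margin=1.0)
```

    For small samples a t-quantile should replace the fixed z value; the key design choice is the margin, which encodes how large a disagreement between the meta-model and the detailed model is still practically negligible.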

  4. Can North Korean Airborne Special Purpose Forces Successfully Conduct Military Operations Against the United States and South Korea?

    National Research Council Canada - National Science Library

    Allmond, Samuel

    2003-01-01

    ...; and secondly by producing the world's largest SPF. North Korea (NK) has built the world's largest SPF, with more than 100,000 men, to support surprise attacks to disrupt CFC combat buildup and operations...

  5. Does Happiness Promote Career Success?

    Science.gov (United States)

    Boehm, Julia K.; Lyubomirsky, Sonja

    2008-01-01

    Past research has demonstrated a relationship between happiness and workplace success. For example, compared with their less happy peers, happy people earn more money, display superior performance, and perform more helpful acts. Researchers have often assumed that an employee is happy and satisfied because he or she is successful. In this article,…

  6. Medical abortion. defining success and categorizing failures

    DEFF Research Database (Denmark)

    Rørbye, Christina; Nørgaard, Mogens; Vestermark, Vibeke

    2003-01-01

    . The difference in short- and long-term success rates increased with increasing gestational age. The majority of failures (76%) were diagnosed more than 2 weeks after initiation of the abortion. At a 2-week follow-up visit, the women who turned out to be failures had a larger endometrial width, higher beta-hCG values and smaller reductions of beta-hCG than those treated successfully. To optimize comparison of success rates after different medical abortion regimens, we suggest that the criteria for success are stated clearly, that the success rates are stratified according to gestational age...

  7. Key Success Factors in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Szymon Adamala

    2011-12-01

    Full Text Available Business Intelligence can bring critical capabilities to an organization, but the implementation of such capabilities is often plagued with problems. Why is it that certain projects fail, while others succeed? The aim of this article is to identify the factors that are present in successful Business Intelligence projects and to organize them into a framework of critical success factors. A survey was conducted during the spring of 2011 to collect primary data on Business Intelligence projects. Findings confirm that Business Intelligence projects are wrestling with both technological and non-technological problems, but the non-technological problems are found to be harder to solve as well as more time consuming than their counterparts. The study also shows that critical success factors for Business Intelligence projects are different from success factors for Information Systems projects in general. Business Intelligence projects have critical success factors that are unique to the subject matter. Major differences can be found primarily among non-technological factors, such as the presence of a specific business need and a clear vision to guide the project. Success depends on types of project funding, the business value provided by each iteration in the project and the alignment of the project to a strategic vision for Business Intelligence at large. Furthermore, the study provides a framework of critical success factors that explains sixty-one percent of the variability of project success. Areas which should be given special attention include making sure that the Business Intelligence solution is built with the end users in mind, that the Business Intelligence solution is closely tied to the company’s strategic vision and that the project is properly scoped and prioritized to concentrate on the best opportunities first.

  8. A model-Driven Approach to Customize the Vocabulary of Communication Boards: Towards More Humanization of Health Care.

    Science.gov (United States)

    Franco, Natália M; Medeiros, Gabriel F; Silva, Edson A; Murta, Angela S; Machado, Aydano P; Fidalgo, Robson N

    2015-01-01

    This work presents a Modeling Language and its technological infrastructure to customize the vocabulary of Communication Boards (CB), which are important tools to provide more humanization of health care. Using a technological infrastructure based on the Model-Driven Development (MDD) approach, our Modeling Language (ML) creates an abstraction layer between users (e.g., health professionals such as an audiologist or speech therapist) and application code. Moreover, the use of a metamodel enables a syntactic corrector that prevents the creation of wrong models. Our ML and metamodel enable more autonomy for health professionals in creating customized CB because they abstract complexities and permit them to deal only with the domain concepts (e.g., vocabulary and patient needs). Additionally, our infrastructure provides a configuration file that can be used to share and reuse models. This way, the vocabulary modelling effort will decrease over time as people share vocabulary models. Our study provides an infrastructure that aims to abstract the complexity of CB vocabulary customization, giving more autonomy to health professionals when they need to customize, share and reuse vocabularies for CB.

  9. Modelling SDL, Modelling Languages

    Directory of Open Access Journals (Sweden)

    Michael Piefel

    2007-02-01

    Full Text Available Today's software systems are too complex to model and implement using only one language. As a result, modern software engineering uses different languages for different levels of abstraction and different system aspects. Handling an increasing number of related or integrated languages has thus become the most challenging task in the development of tools. We use object-oriented metamodelling to describe languages. Object orientation allows us to derive abstract reusable concept definitions (concept classes) from existing languages. This language definition technique concentrates on semantic abstractions rather than syntactical peculiarities. We present a set of common concept classes that describe structure, behaviour, and data aspects of high-level modelling languages. Our models contain syntax modelling using the OMG MOF as well as static semantic constraints written in OMG OCL. We derive metamodels for subsets of SDL and UML from these common concepts, and we show for parts of these languages that they can be modelled and related to each other through the same abstract concepts.

  10. Designing the database for a reliability aware Model-Based System Engineering process

    International Nuclear Information System (INIS)

    Cressent, Robin; David, Pierre; Idasiak, Vincent; Kratz, Frederic

    2013-01-01

    This article outlines the need for a reliability database to implement model-based description of components' failure modes and dysfunctional behaviors. We detail the requirements such a database should honor and describe our own solution: the Dysfunctional Behavior Database (DBD). Through the description of its meta-model, the benefits of integrating the DBD in the system design process are highlighted. The main advantages depicted are the possibility to manage feedback knowledge at various granularity and semantic levels and to drastically ease the interactions between system engineering activities and reliability studies. The compliance of the DBD with other reliability databases such as FIDES is presented and illustrated. - Highlights: ► Model-Based System Engineering is more and more used in the industry. ► It results in a need for a reliability database able to deal with model-based description of dysfunctional behavior. ► The Dysfunctional Behavior Database aims to fulfill that need. ► It helps dealing with feedback management thanks to its structured meta-model. ► The DBD can profit from other reliability databases such as FIDES.

  11. A statistical analysis of individual success after successful completion of Defense Language Institute Foreign Language Center Training

    OpenAIRE

    Hinson, William B.

    2005-01-01

    "The Defense Language Institute Foreign Language Center (DLIFLC) trains students in various foreign languages and dialects for the Department of Defense (DOD). The majority of students are first-term enlistees in the basic program. This study uses classification trees and logistic regression to understand the military, academic and personal characteristics that influence first-term success after successfully completing DLIFLC training. Success was defined as completing a first-term enlistme...
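    The logistic-regression approach this study describes can be sketched with synthetic data; the features, coefficients, and outcome process below are hypothetical illustrations, not DLIFLC data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical student records: aptitude score, age at entry, prior service
# years; outcome = completed the first term (1) or not (0).
n = 200
X = np.column_stack([
    rng.normal(3.0, 0.5, n),              # aptitude / grade proxy
    rng.normal(21.0, 2.0, n),             # age at entry
    rng.integers(0, 5, n).astype(float),  # prior service years
])
true_logit = -8.0 + 2.5 * X[:, 0] + 0.05 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Standardize features so plain gradient descent converges quickly.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(n), Xs])

# Fit logistic regression by gradient descent on the mean log-loss.
w = np.zeros(Xb.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.5 * Xb.T @ (p - y) / n

acc = np.mean(((1.0 / (1.0 + np.exp(-Xb @ w))) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

    The fitted weights indicate which characteristics shift the odds of first-term success; a classification tree, the study's other method, would instead partition students by thresholds on the same features.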

  12. Collective frame of reference as a driving force in technology development processes : the essential tension between path dependent and path breaking technology developments

    NARCIS (Netherlands)

    Draijer, C.

    2010-01-01

    A. Introduction A wide body of literature describes the relationship between innovation and the success of the organization. The evidence that innovative organizations are more successful is overwhelming [Bain 1956, Porter 1983]. It is therefore not surprising that innovation is seen as one of the

  13. Styles of success

    DEFF Research Database (Denmark)

    Dahlgaard, Jens Jørn; Nørgaard, Anders; Jakobsen, Søren

    1997-01-01

    Corporate success stories tend to emphasize the "great men" theory of history. But now a European research project established the managerial attributes that can turn an ordinary leader into one ideal for the pursuit of business excellence. The emergence of five leadership styles as crucial drivers of business excellence points to a clear agenda for success. Setting clear strategic goals and the ability to take a long-term view of an organization's direction, combined with other leadership attributes such as creativity, teambuilding and learning, are principal keys to creating an excellent organization. Leaders seeking to achieve business excellence must view the high-level attainment of these sets of leadership competencies as their paramount objective. In striving for business excellence, European leaders may encounter resistance among their employees. Crucially, European employees place a markedly...

  14. A structural study of [CpM(CO)3H] (M = Cr, Mo and W) by single-crystal X-ray diffraction and DFT calculations: sterically crowded yet surprisingly flexible molecules.

    Science.gov (United States)

    Burchell, Richard P L; Sirsch, Peter; Decken, Andreas; McGrady, G Sean

    2009-08-14

    The single-crystal X-ray structures of the complexes [CpCr(CO)3H] 1, [CpMo(CO)3H] 2 and [CpW(CO)3H] 3 are reported. The results indicate that 1 adopts a structure close to a distorted three-legged piano stool geometry, whereas a conventional four-legged piano stool arrangement is observed for 2 and 3. Further insight into the equilibrium geometries and potential energy surfaces of all three complexes was obtained by DFT calculations. These show that in the gas phase complex 1 also prefers a geometry close to a four-legged piano stool, in line with its heavier congeners, implying strong packing forces at work for 1 in the solid state. Comparison with their isoelectronic group 7 tricarbonyl counterparts [CpM(CO)3] (M = Mn 4 and Re 5) illustrates that 1, 2 and 3 are sterically crowded complexes. However, a surprisingly soft bending potential is evident for the M-H moiety, whose order (1 approximately = 2 < 3) correlates with the M-H bond strength rather than with the degree of congestion at the metal centre, indicating electronic rather than steric control of the potential. The calculations also reveal cooperative motions of the hydride and carbonyl ligands in the M(CO)3H unit, which allow the M-H moiety to move freely, in spite of the closeness of the four basal ligands, helping to explain the surprising flexibility of the crowded coordination sphere observed for this family of high-CN complexes.

  15. Success of Chemotherapy in Soft Matter

    OpenAIRE

    Trifonova, I.; Kurteva, G.; Stefanov, S. Z.

    2014-01-01

    The success of chemotherapy in soft matter, measured as survival, is examined in this paper. An analogous tumor-stretching force in soft matter is identified; ultrasonography is performed for this tumor; restoration in soft matter with such a tumor is found; a Bayes estimate of the probability of chemotherapy success is derived from the transferred chemical energy and from soft-matter entropy; and the survival probability is compared with this probability of success.

  16. Behind the Exporters’ Success: Analysis of Successful Hungarian Exporter Companies From a Strategic Perspective

    OpenAIRE

    Annamaria Kazai Onodi; Krisztina Pecze

    2014-01-01

    The purpose of the study is to provide an overview of export success from a strategic management perspective. The paper empirically tested the relationships between the firm’s export performance, strategic thinking, adaptation to the changing environment and companies’ capabilities. The research is based on the Hungarian Competitiveness Research database of 2013 that consists of 300 firms. Cluster analysis differentiated successful export-oriented and stagnant companies. Both of them had high...

  17. Navy Contracting Analyzing Critical Success Factors and Perceived Impact on Success within an Organization

    OpenAIRE

    Hill, Josh R.; McGraw, Kevin L.

    2012-01-01

    MBA Professional Report Approved for public release, distribution unlimited Critical Success Factors (CSF) are essential ingredients within an organization that are necessary to meet critical mission objectives. Identifying those factors can be a vital asset and assist leadership in achieving successful outcomes in contract management. This report will focus on three major contracting commands within the United States Navy Naval Supply Systems Command, Global Logistics Support (NAVSUP-G...

  18. Surprisingly high specificity of the PPD skin test for M. tuberculosis infection from recent exposure in The Gambia.

    Science.gov (United States)

    Hill, Philip C; Brookes, Roger H; Fox, Annette; Jackson-Sillah, Dolly; Lugos, Moses D; Jeffries, David J; Donkor, Simon A; Adegbola, Richard A; McAdam, Keith P W J

    2006-12-20

    Options for intervention against Mycobacterium tuberculosis infection are limited by the diagnostic tools available. The Purified Protein Derivative (PPD) skin test is thought to be non-specific, especially in tropical settings. We compared the PPD skin test with an ELISPOT test in The Gambia. Household contacts over six months of age of sputum smear positive TB cases and community controls were recruited. They underwent a PPD skin test and an ELISPOT test for the T cell response to PPD and ESAT-6/CFP10 antigens. Responsiveness to M. tuberculosis exposure was analysed according to sleeping proximity to an index case using logistic regression. 615 household contacts and 105 community controls were recruited. All three tests increased significantly in positivity with increasing M. tuberculosis exposure, the PPD skin test most dramatically (OR 15.7; 95% CI 6.6-35.3). While the PPD skin test positivity continued to trend downwards in the community with increasing distance from a known case (61.9% to 14.3%), the PPD and ESAT-6/CFP-10 ELISPOT positivity did not. The PPD skin test was more in agreement with the ESAT-6/CFP-10 ELISPOT (75%, p = 0.01) than with the PPD ELISPOT (53%). With increasing exposure, the proportion of contacts who were PPD skin test positive increased and the proportion who were PPD skin test negative decreased. The PPD skin test has surprisingly high specificity for M. tuberculosis infection from recent exposure in The Gambia. In this setting, anti-tuberculous prophylaxis in PPD skin test positive individuals should be revisited.

  19. Project Success in IT Project Management

    OpenAIRE

    Siddiqui, Farhan Ahmed

    2010-01-01

    The rate of failed and challenged Information Technology (IT) projects is too high according to the CHAOS Studies by the Standish Group and the literature on project management (Standish Group, 2008). The CHAOS Studies define project success as meeting the triple constraints of scope, time, and cost. The criteria for project success need to be agreed by all parties before the start of the project and constantly reviewed as the project progresses. Assessing critical success factors is another ...

  20. Critical success factors for managing purchasing groups

    NARCIS (Netherlands)

    Schotanus, Fredo; Telgen, Jan; de Boer, L.

    2010-01-01

    In this article, we identify critical success factors for managing small and intensive purchasing groups by comparing successful and unsuccessful purchasing groups in a large-scale survey. The analysis of our data set suggests the following success factors: no enforced participation, sufficient

  1. Key performance indicators for successful simulation projects

    OpenAIRE

    Jahangirian, M; Taylor, SJE; Young, T; Robinson, S

    2016-01-01

    There are many factors that may contribute to the successful delivery of a simulation project. To provide a structured approach to assessing the impact various factors have on project success, we propose a top-down framework whereby 15 Key Performance Indicators (KPI) are developed that represent the level of successfulness of simulation projects from various perspectives. They are linked to a set of Critical Success Factors (CSF) as reported in the simulation literature. A single measure cal...

  2. Crystal structure of di-μ-chlorido-bis[dichloridobis(methanol-κO)iridium(III)] dihydrate: a surprisingly simple chloridoiridium(III) dinuclear complex with methanol ligands

    Directory of Open Access Journals (Sweden)

    Joseph S. Merola

    2015-05-01

    Full Text Available The reaction of IrCl3·xH2O in methanol led to the formation of small amounts of the title compound, [Ir2Cl6(CH3OH)4]·2H2O, which consists of two IrCl4O2 octahedra sharing an edge via chloride bridges. The molecule lies across an inversion center. Each octahedron can be envisioned as being comprised of four chloride ligands in the equatorial plane with methanol ligands in the axial positions. A lattice water molecule is strongly hydrogen-bonded to the coordinating methanol ligands, and weak interactions with coordinating chloride ligands lead to the formation of a three-dimensional network. This is a surprising structure given that, while many reactions of iridium chloride hydrate are carried out in alcoholic solvents, especially methanol and ethanol, this is the first structure of a chloridoiridium compound with only methanol ligands.

  3. Mathematics a minimal introduction

    CERN Document Server

    Buium, Alexandru

    2013-01-01

    Pre-Mathematical Logic Languages Metalanguage Syntax Semantics Tautologies Witnesses Theories Proofs Argot Strategies Examples Mathematics ZFC Sets Maps Relations Operations Integers Induction Rationals Combinatorics Sequences Reals Topology Imaginaries Residues p-adics Groups Orders Vectors Matrices Determinants Polynomials Congruences Lines Conics Cubics Limits Series Trigonometry Integrality Reciprocity Calculus Metamodels Categories Functors Objectives Mathematical Logic Models Incompleteness Bibliography Index

  4. Do high-growth entrepreneurial firms have a specific system of governance?

    OpenAIRE

    Peter Wirtz

    2009-01-01

    From a meta-model linking a firm’s corporate governance system to managerial discretion (Charreaux, 2008), this article presents a specific corporate governance model for the high growth entrepreneurial firm. A survey of the empirical literature on the governance of entrepreneurial firms confirms the plausibility of this theoretical framework, especially with respect to the cognitive dimension of corporate governance.

  5. Using NLP meta, Milton, metaphor models, for improving the activity of the organization

    Directory of Open Access Journals (Sweden)

    Cornel Marian IOSIF

    2010-12-01

    Full Text Available The objective of this paper is to improve three methods from neuro-linguistic programming – the metaphor, the Milton model and the meta-model – so that, by using them in its daily activities, an organization can improve the activities it performs and allocate its available resources more efficiently.

  6. Defining successful aging: a tangible or elusive concept?

    Science.gov (United States)

    Martin, Peter; Kelly, Norene; Kahana, Boaz; Kahana, Eva; Willcox, Bradley J; Willcox, D Craig; Poon, Leonard W

    2015-02-01

    Everyone wants to age successfully; however, the definition and criteria of successful aging remain vague for laypersons, researchers, and policymakers in spite of decades of research on the topic. This paper highlights work of scholars who made significant theoretical contributions to the topic. A thorough review and evaluation of the literature on successful aging was undertaken. Our review includes early gerontological definitions of successful aging and related concepts. Historical perspectives reach back to philosophical and religious texts, and more recent approaches have focused on both process- and outcome-oriented models of successful aging. We elaborate on Baltes and Baltes' theory of selective optimization with compensation [Baltes, P. B., & Baltes, M. M. (1990a). Psychological perspectives on successful aging: The model of selective optimization with compensation. In P. B. Baltes & M. M. Baltes (Eds.), Successful aging: Perspectives from the behavioral sciences (pp. 1-34). United Kingdom: Cambridge University Press], Kahana and Kahana's preventive and corrective proactivity model [Kahana, E., & Kahana, B. (1996). Conceptual and empirical advances in understanding aging well through proactive adaptation. In V. Bengtson (Ed.), Adulthood and aging: Research on continuities and discontinuities (pp. 18-40). New York: Springer], and Rowe and Kahn's model of successful aging [Rowe, J. W., & Kahn, R. L. (1998). Successful aging. New York: Pantheon Books], outlining their commonalities and differences. Additional views on successful aging emphasize subjective versus objective perceptions of successful aging and relate successful aging to studies on healthy and exceptional longevity. Additional theoretical work is needed to better understand successful aging, including the way it can encompass disability and death and dying. The extent of rapid social and technological change influencing views on successful aging also deserves more consideration.

  7. Surprising finding on colonoscopy.

    Science.gov (United States)

    Griglione, Nicole; Naik, Jahnavi; Christie, Jennifer

    2010-02-01

    A 48-year-old man went to his primary care physician for his annual physical. He told his physician that for the past few years, he had intermittent, painless rectal bleeding consisting of small amounts of blood on the toilet paper after defecation. He also mentioned that he often spontaneously awoke, very early in the morning. His past medical history was unremarkable. The patient was born in Cuba but had lived in the United States for more than 30 years. He was divorced, lived alone, and had no children. He had traveled to Latin America-including Mexico, Brazil, and Cuba-off and on over the past 10 years. His last trip was approximately 2 years ago. His physical exam was unremarkable. Rectal examination revealed no masses or external hemorrhoids; stool was brown and Hemoccult negative. Labs were remarkable for eosinophilia ranging from 10% to 24% over the past several years (the white blood cell count ranged from 5200 to 5900/mcL). A subsequent colonoscopy revealed many white, thin, motile organisms dispersed throughout the colon. The organisms were most densely populated in the cecum. Of note, the patient also had nonbleeding internal hemorrhoids. An aspiration of the organisms was obtained and sent to the microbiology lab for further evaluation. What is your diagnosis? How would you manage this condition?

  8. More statistics, less surprise

    CERN Multimedia

    Antonella Del Rosso & the LHCb collaboration

    2013-01-01

    The LHCb collaboration has recently announced new results for a parameter that measures the CP violation effect in particles containing charm quarks. The new values, obtained with a larger data set and with a new independent method, show that the effect is smaller than previous measurements had suggested. The parameter is back in the Standard Model picture.   CP violation in particles containing charm quarks, such as the D0 particle, is a powerful probe of new physics. Indeed, such effects could result in unexpected values of parameters whose expectation values in the Standard Model are known. Although less precise than similar approaches used in particles made of b quarks, the investigation of the charm system has proven to be intriguing. The LHCb collaboration has reported new measurements of ΔACP, the difference in CP violation between the D0→K+K– and D0→π+π– decays. The results are ob...

  9. Crop succession requirements in agricultural production planning

    NARCIS (Netherlands)

    Klein Haneveld, W.K.; Stegeman, A.

    2005-01-01

    A method is proposed to write crop succession requirements as linear constraints in an LP-based model for agricultural production planning. Crop succession information is given in the form of a set of inadmissible successions of crops. The decision variables represent the areas where a certain
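    The formulation this record describes, succession rules expressed as linear constraints in an LP, can be sketched with `scipy.optimize.linprog`. The crops, areas, gross margins, and inadmissible pairs below are hypothetical, chosen only to illustrate the constraint structure:

```python
import numpy as np
from scipy.optimize import linprog

crops = ["wheat", "potato", "beet"]
area_year1 = np.array([60.0, 25.0, 15.0])  # ha under each crop now (hypothetical)
margin = np.array([500.0, 1200.0, 900.0])  # hypothetical gross margin per ha next year

# Inadmissible successions (previous crop -> next crop), e.g. for disease control.
inadmissible = {("potato", "potato"), ("beet", "beet"), ("beet", "potato")}

n = len(crops)
# Variable x[i*n + j] = area moved from crop i this year to crop j next year.
c = np.tile(-margin, n)  # linprog minimizes, so negate the margins

# Each current crop's area must be fully reassigned to some next crop.
A_eq = np.zeros((n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0
b_eq = area_year1

# Succession rules become simple bounds: inadmissible pairs are fixed at 0.
bounds = [(0.0, 0.0) if (crops[i], crops[j]) in inadmissible else (0.0, None)
          for i in range(n) for j in range(n)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
plan = res.x.reshape(n, n)
print("optimal rotation (ha):")
print(plan)
```

    Encoding inadmissible successions as zero-valued bounds (or, equivalently, as linear constraints forcing those variables to zero) keeps the model an ordinary LP, which is the point of the proposed method.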

  10. [Effects of climate change on forest succession].

    Science.gov (United States)

    Wang, Jijun; Pei, Tiefan

    2004-10-01

    Forest regeneration is an important process in forest ecosystem dynamics. Forest succession has received increasing attention since the development of succession theory in the early twentieth century, and scientific management of forest ecosystems requires knowledge of the regularities and research models of forest succession. This is of great practical and theoretical significance for restoring and reconstructing forest vegetation and protecting natural forests. Disturbances are important factors affecting regeneration structure and ecological processes: they produce temporal and spatial variation in forest ecosystems and change the efficiency of resource use. In this paper, some concepts of forest succession and disturbance were introduced, and the difficulties of studying forest succession were outlined. Four classes of models were reviewed: Markov models, GAP models, process-based equilibrium terrestrial biosphere models (the BIOME series), and non-linear models. Subsequently, the effects of human-induced climate change on forest succession were discussed. Finally, existing problems and future research directions were proposed.
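    Of the model classes reviewed, the Markov model is the simplest to sketch: a transition matrix over vegetation states whose stationary distribution gives the long-run landscape composition. The states and transition probabilities below are hypothetical:

```python
import numpy as np

# Hypothetical Markov succession model with three states:
# grass -> shrub -> forest, with rare disturbance resetting forest to grass.
states = ["grass", "shrub", "forest"]
P = np.array([
    [0.70, 0.30, 0.00],   # grass: mostly persists, some shrub invasion
    [0.05, 0.75, 0.20],   # shrub: slow transition to forest
    [0.02, 0.00, 0.98],   # forest: occasional disturbance back to grass
])

# Long-run (stationary) composition: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
for s, frac in zip(states, pi):
    print(f"{s}: {frac:.3f}")

# Transient dynamics from an all-grass landscape after 50 time steps.
x = np.array([1.0, 0.0, 0.0])
print("after 50 steps:", x @ np.linalg.matrix_power(P, 50))
```

    Climate-change effects can then be explored by perturbing the transition probabilities and recomputing the stationary composition, which is roughly how Markov succession models are used in the literature this record reviews.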

  11. Bol d'Or success for all-women crew from CERN

    CERN Multimedia

    2001-01-01

    The boat 'Mic Mac' and its all-women crew from CERN (left to right), Christine Theurillat, Ursula Haenger, Paola Catapano, Petra Riedler, and skipper Cristina Morone. Spectacular highlight of the Lake Leman sailing calendar is the annual Bol d'Or race. Held this year on 16 and 17 June, the event attracted nearly 500 teams who competed under extreme weather conditions for the honours. Among the competitors was an all-women crew from the CERN Yachting Club, sailing their Surprise boat, Mic Mac. The team was not only among the 397 boats to finish, but also the first all-women-crewed single-hull boat to cross the line.

  12. Successful international negotiations

    International Nuclear Information System (INIS)

    Gerry, G.

    1997-01-01

    These remarks on successful international trade negotiations deal with the following topics: culture and differences in psychology; building friendly relationships and letting both sides appear to win; well-written proposals; security of negotiating information; the complexity and length of nuclear negotiations

  13. Measuring strategic success.

    Science.gov (United States)

    Gish, Ryan

    2002-08-01

    Strategic triggers and metrics help healthcare providers achieve financial success. Metrics help assess progress toward long-term goals. Triggers signal market changes requiring a change in strategy. All metrics may not move in concert. Organizations need to identify indicators, monitor performance.

  14. Incumbent Decisions about Succession Transitions in Family Firms

    DEFF Research Database (Denmark)

    Boyd, Britta; Botero, Isabel C.; Fediuk, Tomasz A.

    2014-01-01

    in (i.e., intra-family succession, out-of-family succession, or no succession). Building on the theory of planned behavior and the socioemotional wealth framework (SEW), this manuscript presents a conceptual framework to understand the factors that influence succession transitions and the role that contextual factors can play in this decision-making process. We present theory-driven propositions and discuss the implications for understanding and evaluating the succession process...

  15. Fear of success among business students.

    Science.gov (United States)

    Rothman, M

    1996-06-01

    The concept of "Fear of Success" was measured with 352 male and female business students using the prompt, "After first term finals, Ann (John) finds her(him)self at the top of her(his) Medical/Nursing school class." Analysis indicated a greater frequency of fear-of-success imagery among men than women, and in particular in response to the John-in-Medical-school and Ann-in-Nursing-school cues. In addition, the Ann cue and the Medical school cue generated more fear-of-success responses among men than women.

  16. Successfully combating prejudice

    Indian Academy of Sciences (India)

    Lawrence

    Sir Jagdish Chandra Bose, and fascinated by his work that showed that plants were ... U.S., in 1972, I was invited to take up a faculty position at the newly established ... success because of their different social commitments. Today when I look ...

  17. Project Success in Agile Development Software Projects

    Science.gov (United States)

    Farlik, John T.

    2016-01-01

    Project success has multiple definitions in the scholarly literature. Research has shown that some scholars and practitioners define project success as the completion of a project within schedule and within budget. Others consider a successful project as one in which the customer is satisfied with the product. This quantitative study was conducted…

  18. Reframing Success and Failure of Information Systems

    DEFF Research Database (Denmark)

    Cecez-Kecmanovic, Dubravka; Kautz, Karlheinz; Abrahall, Rebecca

    2014-01-01

    The paper questions common assumptions in the dominant representational framings of information systems success and failure and proposes a performative perspective that conceives IS success and failure as relational effects performed by sociomaterial practices of IS project actor-networks of developers, managers, technologies, project documents, methodologies, and other actors. Drawing from a controversial case of a highly innovative information system in an insurance company, considered a success and failure at the same time, the paper reveals the inherent indeterminacy of IS success and failure and the assessment practices that performed both different IS realities and competing IS assessments. The analysis shows that the IS project and the implemented system as objects of assessment are not given and fixed, but are performed by the agencies of assessment together with the assessment outcomes of success and failure. The paper...

  19. The characteristics of successful entrepreneurs

    OpenAIRE

    Pokrajčić Dragana M.

    2004-01-01

    This paper examines the economic, psychological and social-behavioral theories of the entrepreneur in order to determine the characteristics of a successful entrepreneur. The major contribution of economic theories of the entrepreneur is better understanding of the entrepreneur and his/her role in economic development. The psychological characteristic theory of entrepreneur argues that successful entrepreneurs possess certain personality traits that mark them out as special, and tries to dete...

  20. Fast Success and Slow Failure

    DEFF Research Database (Denmark)

    Mors, Louise; Waguespack, David

    Full Abstract: Do the benefits of cross-boundary collaborations outweigh the costs? We seek to answer this question by examining 5079 collaborations in the Internet Engineering Task Force (IETF). Our findings suggest that crossing formal boundaries is positively related to success and efficiency ... of success, suggesting that firms are better off investing in nondiverse projects. This finding has important implications for how we think about the benefits of seeking novelty...